Quote:
Originally Posted by Barry-xlovecam
Could you create a database driven website that could handle 250,000 visitors /day, that is over 10,000 visitors accessing your SQL database with numerous reads and writes hourly?
That said, most of you Internet critics would not be up to the task.
A more realistic critique would be of the healthcare.gov team's flawed concept.
You don't cut the yellow tape and allow your website to be overrun and flooded with demand it cannot handle.
They could have divided the website into regional sub-domains, better distributing the load over master/slave SQL servers. Look at the major search engines: somehow they handle these massive volumes, albeit with few write SQL transactions -- and writes, as you should know, are considerably slower than reads.
They could have created a queue of A-Z by state or region. They could have just accepted registrations then emailed the registrants with an appointment date range or a number -- get in line ...
They should have gotten help from IBM, Oracle, maybe even from the development teams at Amazon, Google, eBay and the like ... I hope the government owes the consultancy company they hired lots of money -- Guess what? You ain't getting paid the balance 'till yo fix yo shit!
This is a mismanagement of access demand; with more controlled access, the back-end faults could have been fixed in a reasonable time and in an orderly fashion.
A big part of the problem is the "I want it now!!!" mentality rampant today.
Some heads should roll over this whole debacle.
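The "accept registrations, then email an appointment range" idea from the quoted post can be sketched in a few lines. This is just an illustration of the throttling concept -- the class, the capacity number, and all names are made up, not anything healthcare.gov actually ran:

```python
# Hypothetical sketch of "get in line": accept the registration immediately
# (one cheap write), then assign an appointment window instead of letting
# everyone hammer the backend at once. All names/numbers are invented.
from collections import deque
from datetime import date, timedelta

SLOTS_PER_DAY = 5000  # assumed daily capacity the backend can actually serve

class RegistrationQueue:
    def __init__(self, start: date):
        self.queue = deque()
        self.start = start

    def register(self, email: str, state: str) -> int:
        """Accept the registration and return this person's place in line."""
        self.queue.append((email, state))
        return len(self.queue)

    def appointment_window(self, position: int) -> tuple:
        """Map a queue position to a one-day window, throttled by capacity."""
        day = self.start + timedelta(days=(position - 1) // SLOTS_PER_DAY)
        return (day, day + timedelta(days=1))

q = RegistrationQueue(start=date(2013, 11, 1))
pos = q.register("jane@example.com", "OH")
window = q.appointment_window(pos)  # first 5000 registrants get day one
```

The point is that the queue absorbs the demand spike; the expensive back-end work happens later, at a rate the servers can sustain.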
I agree that it is not as easy as it looks - I think they should have put together a team of engineers from Google, Facebook etc. - they could have done this much more efficiently, I am sure...
As far as search engines not having to write data - where do you think that data comes from?
I would go as far as to say there are more writes than reads - even if you ignore the bots, Google's own analytics alone generates massive amounts of write traffic...
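Either way, the master/slave layout both posts mention comes down to routing: writes must go through the master, while reads can fan out over replicas. A minimal sketch, with invented connection names standing in for real database handles:

```python
# Toy read/write splitter: writes hit the master, reads round-robin over
# replicas. "master-db" / "replica-*" are placeholder strings, not real
# connections; a real setup would sit on MySQL/Postgres replication.
import itertools

class RoutingPool:
    def __init__(self, master, replicas):
        self.master = master
        self.replicas = itertools.cycle(replicas)  # round-robin the reads

    def connection_for(self, sql: str):
        verb = sql.lstrip().split(None, 1)[0].upper()
        # Writes are slower and must be serialized through the master.
        if verb in ("INSERT", "UPDATE", "DELETE"):
            return self.master
        return next(self.replicas)

pool = RoutingPool("master-db", ["replica-1", "replica-2"])
pool.connection_for("SELECT * FROM plans")        # -> replica-1
pool.connection_for("INSERT INTO signups VALUES (1)")  # -> master-db
```

This is also why a write-heavy workload is the hard case: adding replicas scales the reads, but every write still funnels through one box (or needs sharding, e.g. the by-state split suggested above).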