
Table of contents: SETI@Home hitparades, week 47
- DPC SETI@Home hitparade of 21 November 2010 *
- DPC SETI@Home hitparade of 22 November 2010
- DPC SETI@Home hitparade of 23 November 2010 *
- DPC SETI@Home hitparade of 24 November 2010
- DPC SETI@Home hitparade of 25 November 2010
- DPC SETI@Home hitparade of 26 November 2010
- DPC SETI@Home hitparade of 27 November 2010
What is SETI@home?
SETI@home is a scientific experiment that uses Internet-connected computers in the search for extraterrestrial intelligence. You can participate by running a free program that downloads and analyses radio telescope data.
NEWS UPDATE
24 Nov 2010 22:49:14 UTC
Informix is running on oscar and is now initializing all of its dbspaces. We hope to start moving the science data over in the first part of next week.
23 Nov 2010 20:59:01 UTC
Okay then - after some extreme DBA this morning carolyn is now the master mysql database server and jocelyn is the replica. So that project is officially DONE! Actually, there's a lot of low-priority cleanup to deal with, but all the main plumbing is working and the projects are back up such as they are.
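For readers curious what "carolyn is now the master and jocelyn is the replica" involves in practice, a minimal sketch of the replica-side setup for a MySQL 5.x pair might look like this. The hostnames carolyn and jocelyn are from the post; the replication user, password, and binlog coordinates are hypothetical placeholders, not SETI@home's actual configuration.

```shell
# Point the replica (jocelyn) at the new master (carolyn) and start it.
# Credentials and binlog coordinates below are placeholders.
mysql -h jocelyn -u root -p <<'SQL'
STOP SLAVE;
CHANGE MASTER TO
    MASTER_HOST = 'carolyn',
    MASTER_USER = 'repl',
    MASTER_PASSWORD = 'secret',
    MASTER_LOG_FILE = 'mysql-bin.000001',
    MASTER_LOG_POS = 4;
START SLAVE;
SQL

# Verify that both replication threads are running:
mysql -h jocelyn -u root -p -e 'SHOW SLAVE STATUS\G'
```

In `SHOW SLAVE STATUS`, both `Slave_IO_Running` and `Slave_SQL_Running` should read `Yes` once replication is healthy.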
Now all server side focus is on oscar. By far the most important thing to fix during this major long outage was our science database - getting a new mysql database rolling was just icing on the cake. But I guess we still need to finish making the cake. Most of our i/o bottlenecks over the past few years have been somehow linked to thumper (both as a database and file server) so getting this done is essential before we get back on line.
Jeff found a comprehensive list of missing spikes (which I mentioned yesterday) and will begin inserting those. We'll then eat some turkey, then have an all-hands-on-deck week next week to get oscar going. We simply cannot get back on line before then, and so we're still looking at new workunits being generated a couple weeks from today at the earliest. I guess if we're really lucky it'll be sooner, but highly doubtful. I know we're anxious to get rolling again, but remember that when you're dealing with billions of rows of data (in the form of a terabyte of raw files), each step takes many hours no matter how clever you are or how fast you type. It's also easy to get lost in theoretical maximum speeds, which never take into account (a) the dizzying array of initial preparations before even starting, (b) actual speeds, (c) the many extra steps necessary when being careful (like backing up a database one last time before dropping a table containing a billion rows), and (d) unpredictable software/hardware behavior requiring us to go back N steps in the cookbook and try again.
- Matt
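The "backing up one last time before dropping a table" step Matt mentions can be sketched roughly as below. The database and table names are hypothetical placeholders for illustration, not SETI@home's actual schema.

```shell
# Take a consistent dump of the table before destroying it.
# 'seti_db' and 'spike_table' are made-up names for this sketch.
mysqldump --single-transaction seti_db spike_table | gzip > spike_table_final.sql.gz

# Only after the dump completes successfully, drop the table:
mysql seti_db -e 'DROP TABLE spike_table;'
```

With a billion rows, both the dump and the drop can take hours, which is exactly the kind of per-step cost the update describes.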
[ Edited 94% by APClll on 28-11-2010 14:34 ]