wiki:CloudySummitImperial

Notes on the 2008 meeting at Imperial College London


Conclusions


web hosting

We want to move to a commercial web hosting service. The project can pay $1000/yr or perhaps more.

The provider cannot bill through PayPal, since our procurement card does not allow it.

The trac site lists some providers here

This list of questions can become an email that we would send to the sales departments of the vendors to find answers. It should be made as complete as possible.

trac

What version is in use now, and how soon after a new version is released is it updated?

What add-ins are supported, and can we add our own?

Who controls user rights, and how?

svn

Is there ViewVC access to the store?

Is more than one store allowed?

Who controls user rights and how?

Can we get write access to the repository through Apache?

What bandwidth will we get?

miscellaneous

Are RSS feeds available?

How do we make local backups of our store? We would want to launch a script in Lexington to pull a backup copy of all content to our site.

How is the web site hosted? Apache?

Is there WebDAV access?

Can we use ssh to log in and work directly on the site?

Total disk space should be at least 1 GB; more is better, since WebDAV would allow intermediate storage of files.

Do we get access to svn/apache logs?

Is https supported?

Possible vendors

webfaction

From Will - http://www.webfaction.com/services/hosting - I seem to remember them being the frontrunner last time we looked into this. They allow ssh access and let users install their own software.

I couldn't see immediately which version of SVN they are running by default. However, this thread from half a year ago has general instructions for upgrading to a recent svn version, and this seems to be encouraged:

http://forum.webfaction.com/viewtopic.php?id=858

There also seem to be users successfully running Trac 0.11.

Webfaction seems significantly cheaper than CVSdude. They accept Visa, MasterCard, Discover, or PayPal.

cvsdude

http://cvsdude.com/

Ryan had mentioned cvsdude - it looks like they have what we want in their team level at http://cvsdude.com/product.pl.

svnrepository

Jeremy found this service, which offers svn, trac, and bugzilla pretty cheaply.

http://svnrepository.com/


Parallel Cloudy

The code uses global variables. This is a heritage of the 1970s, when global variables were considered faster and simpler than passing variables through an interface.

Routines should have global quantities passed to them through their interfaces so that they work on local copies. Routines should not rely on static variables. They should be fully reentrant, so that a routine can be called simultaneously by many instances.
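A minimal sketch of the difference, using hypothetical routine and variable names (nothing below corresponds to actual Cloudy routines, and the numbers are placeholders):

{{{
#include <cstdio>

/* Old, non-reentrant style: a global plus a static local mean that two
 * simultaneous callers would overwrite each other's state. */
double g_temperature = 1.0e4;          /* global shared by every caller */

double cooling_old()
{
    static double last;                /* static keeps state between calls */
    last = 2.0e-27 * g_temperature;
    return last;
}

/* Reentrant style: everything the routine needs comes in through its
 * interface and all working storage is local to the call. */
double cooling_new(double temperature)
{
    return 2.0e-27 * temperature;      /* purely local, safe to call in parallel */
}

int main()
{
    printf("old: %.3e\n", cooling_old());
    printf("new: %.3e\n", cooling_new(1.0e4));
    return 0;
}
}}}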


Hazy

Maintaining Hazy 1, the documentation of the code's commands, is very labor intensive for one person. We have to move the document over to a format that all developers feel comfortable editing. This will allow the whole group to make changes in the parser or command interface.

LaTeX seems to be the only viable option. It has the advantage of being a pure text format, so that deltas can be committed to the repository. (The Word file is binary, so deltas are not saved.)

We could hire a team of undergraduates to do the Word to LaTeX conversion.


How to uniquely identify lines

Two solutions were discussed.

The code knows some state information for the upper and lower levels of the transition and could generate a nearly unique string giving this information. That string could be used to locate a particular line. Molecules become tricky. Some high-Z elements are not in LS coupling and cannot be described that way. States can have mixed parentage.

To identify a given state, a hierarchy of specifications is required, each with its own naming conventions (e.g. chemical element; rotational, vibrational, and electronic states; fine structure; gas phase vs. adsorbed; ...), not all of which will be useful in general. This suggests that each level in the hierarchy will need to be able to interpret its own name format, with some kind of global coordination, which in turn suggests a URL-like naming scheme such as "species/level/state".
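A minimal sketch of such a hierarchical label, assuming hypothetical field and function names (none of these correspond to existing Cloudy structures):

{{{
#include <cstdio>
#include <string>

/* Hypothetical hierarchical line identifier.  Each field uses its own
 * naming convention, and the full label is the slash-separated
 * concatenation, in the spirit of "species/level/state". */
struct LineID
{
    std::string species;      /* e.g. "H 1", "Fe 2", "CO" */
    std::string upperLevel;   /* upper level, in whatever convention fits */
    std::string lowerLevel;   /* lower level */
};

std::string makeLabel(const LineID& id)
{
    return id.species + "/" + id.upperLevel + "-" + id.lowerLevel;
}

int main()
{
    LineID lya = { "H 1", "2p", "1s" };
    printf("%s\n", makeLabel(lya).c_str());   /* prints "H 1/2p-1s" */
    return 0;
}
}}}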

We could define a set of "nicknames" for lines. A nickname would give a line wavelength, and the code would report the intensities of all lines within +/- delta of that wavelength. This is what a telescope does in practice.
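A sketch of how that wavelength-window lookup might work, again with hypothetical names and illustrative intensities:

{{{
#include <cstdio>
#include <cmath>
#include <vector>

struct LineEntry
{
    const char* label;
    double wavelength;   /* Angstroms */
    double intensity;    /* relative intensity, placeholder values below */
};

/* Report every line whose wavelength lies within +/- delta of the
 * wavelength attached to the nickname. */
void reportNickname(const std::vector<LineEntry>& lines, double wavelength, double delta)
{
    for (size_t i = 0; i < lines.size(); ++i)
    {
        if (std::fabs(lines[i].wavelength - wavelength) <= delta)
            printf("%-10s %10.2f A  %.3e\n",
                   lines[i].label, lines[i].wavelength, lines[i].intensity);
    }
}

int main()
{
    std::vector<LineEntry> lines;
    LineEntry lya = { "H 1 Lya", 1215.67, 1.0 };
    LineEntry c4a = { "C 4",     1548.19, 0.2 };
    LineEntry c4b = { "C 4",     1550.77, 0.1 };
    lines.push_back(lya);
    lines.push_back(c4a);
    lines.push_back(c4b);

    /* nickname "C4 1549": report all lines within 5 A of 1549 A,
     * which picks up both members of the C 4 doublet */
    reportNickname(lines, 1549.0, 5.0);
    return 0;
}
}}}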


Logistics

Aim to start at 9:30 on Thursday 2008 Feb 7. Where do we meet at 9:30? At room 741? Will we be able to get into the building without passes?

Sarah Dodman has booked Blackett lab seminar room 741 for the morning (until 12 noon) and seminar room 1004 (10th floor) from 2pm for the afternoon.

The How to get there page at Imperial and maps

Gary, Peter, and Ryan will spend Thursday night in London. Robin can stay for dinner. When does Roderick leave? Let's plan on dinner that night.


Agenda

The C08 release

how to compile the code on all platforms

getting the current branch fully stable and qualified on all platforms

Bringing newmole onto the trunk

io options

Have option to specify an initial pressure

The command parser and its priority with respect to other issues

have the line output include state information to help identify a transition

atomic data

The Nemala atomic/molecular structure options

MHD code API, command parser

Lab astrophysics applications: discuss with Steve/Jerry?

MHD code's needs, discussions with Will Henney and Tom Theuns, need API to obtain derived results of microphysics

papers & projects

The status of the Cloudy review paper

The x-ray grain code - is this to be published?

advection, cosmological time-dependent recombination

first star formation, cooling by H2

etc

Parallel Cloudy - do it by zone after first iteration?

The current status of the initialization routines

Perl's copyright is held by Larry Wall and others. There is an authors file that lists everyone who has ever contributed to Perl. Should we do the same?

Keeping Cloudy platform neutral

Please use the subject line on emails - this helps in archiving

Bringing new students into the project, the lesson of 3/4

Cloudy's face to the world

Hazy - how to maintain it. Word is now XML and standardized. Could this be used by all developers? We must have a method of keeping Hazy parallel with the code. The code does not exist until it is documented. (And none of that is doing anybody any good until it is published.)

Hazy - how to designate trial or new parts of the code. Currently they are documented with a gray background.

What do we need to do with the trac site before we can point nublado.org to it? Navigation is a kludge. Should we move it to a better host?


Return to main wiki page

Return to nublado.org

Last modified on 2008-03-05T15:40:20Z