<!-- keep this as a security measure:
   * Set ALLOWTOPICCHANGE = Main.TWikiAdminGroup,Main.LCGAdminGroup
   * Set ALLOWTOPICRENAME = Main.TWikiAdminGroup,Main.LCGAdminGroup
# uncomment this if you want the page to be viewable only by internal people
#   * Set ALLOWTOPICVIEW = Main.TWikiAdminGroup,Main.LCGAdminGroup,Main.CMSAdminGroup
-->
---+ !!Setup of software in the CMS experimental software area

The installations are launched centrally as grid jobs by the CMS administrators.

%TOC%

---++ External Information

   * [[https://twiki.cern.ch/twiki/bin/view/CMS/OfflineSWDistributionPolicy][twiki page on the distribution policy]]
   * [[https://twiki.cern.ch/twiki/bin/view/CMS/CMSSWdeployment][central page about deployed CMSSW versions]], with information on deployment, links, etc.
      * also features links to lists of current and deprecated releases

CMS requests that any special setup a user needs to perform, and which departs from standard CMS policy, be recorded on [[https://twiki.cern.ch/twiki/bin/view/CMS/WorkBookRemoteSiteSpecifics][this page of the CERN wiki]].

---++ Local Site Configuration Files that need to be set up

Information on the configuration files (TFC and !JobConfig) which need to be located in standard places can be found [[https://twiki.cern.ch/twiki/bin/view/CMS/SWIntTrivial][here]]. These files must be set up first in any new SW area: every test job needs them, and SAM tests may check for them.
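As a rough orientation only (the directory names and the example site label below are assumptions, and the linked !SWIntTrivial page remains the authoritative reference), the site-local configuration usually ends up in a =SITECONF= tree below the software area, along these lines:
<verbatim>
$> # hypothetical layout sketch -- verify against the SWIntTrivial twiki before relying on it
$> ls $VO_CMS_SW_DIR/SITECONF
local   T2_XX_Example          # "local" is typically a symlink to the site's own directory
$> ls $VO_CMS_SW_DIR/SITECONF/local/JobConfig
site-local-config.xml          # job configuration (data catalog location, Frontier/squid settings)
$> ls $VO_CMS_SW_DIR/SITECONF/local/PhEDEx
storage.xml                    # the trivial file catalog (TFC) with the LFN-to-PFN mapping rules
</verbatim>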
---++ Installations are usually done by a central CMS operations team

Installations of current releases and deletion of deprecated ones will be done automatically on every CE that hosts CMS queues. Special requests or problems with installations can be directed to the [[https://savannah.cern.ch/support/?func=additem&group=cmscompinfrasup][Savannah cmscompinfrasup tracker]] (you must assign the request to the *cmscompinfrasup-cmsswdeploy* operations team).

The CMS operations team will send an installation job via the grid with the VOMS role *cms/lcgadmin*, which must get mapped to the *cmssgm* user on the site. The installation is carried out through a private apt/rpm installation in the local software area. Some of the installations are huge, and RPM may experience problems if not run on a 64-bit machine; the CE is therefore configured so that only certain nodes have write access to the software area. After a successful installation the job also makes the necessary adjustments so that information about the installed software is published in the grid information system.

---++ Manual installations

Refer to the [[https://twiki.cern.ch/twiki/bin/view/CMS/CMSSW_aptinstaller][documentation on the CERN twiki]]. *Take care that the installation is done on the identical OS version as that of the WNs and the UI.* Some post-install scripts can otherwise get confused; this notably happened when a direct installation on the NFS server (running CentOS) was tried.

---++ Testing the software setup

---+++ Test for correct setting of environment (may be a bit out of date)

Ideally you run the following test on a worker node to be sure that you have the correct environment, but it must also work on the UI (the =VO_CMS_SW_DIR= environment variable must be set correctly).

%ICON{warning}% *Some of the older CMSSW releases are only available in SLC3 compatibility mode* on our SLC4 machines. This requires an additional setting before doing any of the steps below:
<verbatim>
$> # only do this for getting to the older CMSSW versions
$> export SCRAM_ARCH=slc3_ia32_gcc323
</verbatim>

A grid job will source the local CMS environment as a first step:
<verbatim>
$> . $VO_CMS_SW_DIR/cmsset_default.sh
Loading site local CMS environment
</verbatim>

After that, the =scramv1= command should be available. SCRAM does the software building and the configuration/environment management for CMS. You should be able to list the available CMS software packages with
<verbatim>
$> scramv1 list
</verbatim>

The following sets up a local software development area and runs a sample job for one of the CMSSW versions. Pick one of the versions from the list produced by the =scramv1 list= command above.
<verbatim>
CMSSW_VERSION=CMSSW_1_2_3
</verbatim>

<verbatim>
scramv1 project CMSSW $CMSSW_VERSION   # creates a local project work area
cd $CMSSW_VERSION/src
export CVSROOT=:pserver:anonymous@cmscvs.cern.ch:/cvs_server/repositories/CMSSW
cvs login                              # anonymous access to CMS CVS requires password "98passwd"
cvs co -r $CMSSW_VERSION SimG4Core/Application/test
cd SimG4Core/Application/test
eval `scramv1 runtime -sh`             # this sets up the local environment needed by the app in that directory
cmsRun -p runP.cfg                     # runs a standard test
</verbatim>

---+++ Verifying the SW installation by RPM tools

One can check the packages in the SW area by using an =rpm -V= command in the local environment. Example:

Go to a node with access to the SW area and become the install user =cmssgm=. Then set up the install environment:
<pre>
export VO_CMS_SW_DIR=/experiment-software/cms
export APT_VERSION=0.5.15lorg3.2-cms2
export SCRAM_ARCH=slc5_ia32_gcc434
source $VO_CMS_SW_DIR/$SCRAM_ARCH/external/apt/$APT_VERSION/etc/profile.d/init.sh
</pre>

Now you can use the normal RPM commands:
<pre>
rpm -qa
...
cms+cmssw+CMSSW_3_8_1-1-1
...

rpm -V --nouser --nogroup --nolinkto --nomtime cms+cmssw+CMSSW_3_8_1-1-1
</pre>

*NOTE:* The packages are all relocatable, and the post-install scripts usually do a big regexp substitution on all kinds of configuration files. It is therefore normal to see a number of files with checksum failures. Also, the user and group will differ from site to site.

-- Main.DerekFeichtinger - 29 Jan 2008
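To give an idea of how the verification output from the =rpm -V= check above reads, here is a purely illustrative sketch: the file paths and flag combinations below are made up, and the exact set of reported files will differ per release and per site.
<pre>
# hypothetical rpm -V output; the flag columns carry the information:
#   S = file size differs, 5 = MD5 checksum differs, c = configuration file
S.5.....  c /experiment-software/cms/slc5_ia32_gcc434/cms/cmssw/CMSSW_3_8_1/config/some-config-file
..5.....    /experiment-software/cms/slc5_ia32_gcc434/cms/cmssw/CMSSW_3_8_1/etc/an-example-file.cfg
missing     /experiment-software/cms/slc5_ia32_gcc434/cms/cmssw/CMSSW_3_8_1/lib/a-lost-file.so
</pre>
Size and checksum differences on configuration-like files are the expected consequence of the post-install relocation described in the NOTE above; =missing= entries or altered binaries, on the other hand, would point to a genuinely broken installation.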