Re: [netcdfgroup] netcdf4 python

Chris,

On Fri, May 27, 2016 at 1:02 PM, Chris Barker <chris.barker@xxxxxxxx> wrote:

>
>
> On Fri, May 27, 2016 at 9:44 AM, Elizabeth A. Fischer <
> elizabeth.fischer@xxxxxxxxxxxx> wrote:
>
>> I install everything, including my Python stack, through Spack.
>>
>
> Interesting -- I'd never seen it before. Looks an awful lot like conda, at
> least in its goals, from a quick read of the README.
>
> Does it deal with the ugly matrix of numpy and python versioning? That's
> one of the killer features of conda.
>

Yes.  Every installed Spack package is uniquely versioned by its own
version plus the versions of all its dependencies.  So if you have GCC
and Clang, and Python 2.7 and Python 3.5, you can install four builds of
Numpy (one per compiler/Python combination), all simultaneously.  If you
then add a new version of GCC, you can install two more.  One Spack
installation uses this feature to provide 24 builds of a particular
package to the system's users, based on different choices of compiler,
MPI, etc.


> NOTE: while conda was born of the Python community, and mostly used there,
> it is very much designed to handle all sorts of packages that have nothing
> to do with python.
>

As far as I can tell, a single Conda recipe builds a single version of a
package.  A repo of Conda recipes will build a single version of your
software stack, analogous to the single set of packages you get with your
Linux distro.
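
(Concretely: a Conda recipe is a directory containing a meta.yaml with a
single pinned version, and one build command produces one package from
it.  The recipe path below is made up for illustration.)

    # One recipe directory -> one built package:
    conda build my-netcdf4-recipe/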

A single Spack recipe can build many versions of a package.  And if the
same nominal version of a package is built against different
dependencies, each such build is treated as a distinct installation as
well (see the sketch below).

This ability to handle combinatorial complexity is Spack's killer feature.
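
For example (hypothetical specs, not from my installation): two builds
of the same nominal Numpy version that differ only in which zlib appears
in their dependency graph are recorded as two distinct installations:

    # Same Numpy version, different zlib in the stack -> two installs:
    spack install py-numpy@1.11.0 ^zlib@1.2.7
    spack install py-numpy@1.11.0 ^zlib@1.2.8

    # 'spack spec' prints the fully concretized dependency graph:
    spack spec py-numpy@1.11.0 ^zlib@1.2.7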


>
> Does it provide a repository of pre-built packages? That's another of the
> killer features of conda.
>

No.  The space of toolchains, package versions, build options, etc. that
one MIGHT want is combinatorially large, so no single repository could
pre-build it all.  For a particular application, though, a Spack
installation can provide a set of pre-built packages that will work on
other similarly configured systems.

Pre-built binaries aren't very useful for my purposes...

I have some C/Fortran libraries I've built, and use largely in a C/Fortran
context.  They rely on a software stack of about 50 dependencies.  I've
also built Python extensions (using Cython) that allow these libraries to
be scripted from within Python.  In order for my Python extensions to work,
everything must be built with the same software stack.  Not just the same
compiler, but also the same zlib, NetCDF, etc.
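
As a rough sketch of that workflow (the package names and the
module-based 'spack load' shell function are assumptions about a typical
setup, not a quote from my recipes):

    # Put the Spack-built Python, Numpy, and Cython on the path:
    spack load python
    spack load py-numpy
    spack load py-cython

    # Build my Cython extension against exactly that stack:
    python setup.py build_ext --inplace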

A binary distro is fine as long as you don't need to build any
additional binaries, or if you can replicate the software stack used to
build it and are happy with that stack.  But if you need to build your
own binaries, and you need control over your software stack, then binary
distros aren't really appropriate.

I tell people: if you just want a Python that works, install Anaconda.
But if you need to build Python extensions, use Spack.
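
For completeness, the conda side of that advice looks roughly like this
(the environment name is made up; 'source activate' is the syntax conda
used at the time):

    # A working Python with netcdf4, no compiling required:
    conda create -n work python=2.7 numpy netcdf4
    source activate work

    # Confirm which versions the environment actually resolved:
    python -c "import numpy, netCDF4; print(numpy.__version__); print(netCDF4.__version__)"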

-- Elizabeth




>
> -CHB
>
>>
>>
>> On Fri, May 27, 2016 at 11:18 AM, Chris Barker - NOAA Federal <
>> chris.barker@xxxxxxxx> wrote:
>>
>>>
>>> Through Yast in openSUSE.
>>>
>>>
>>> A Python extension that uses Numpy is compiled specifically for both
>>> the Python version AND the Numpy version. I kind of doubt Yast supports
>>> all the combinations... it may well support one combination that works,
>>> but it tends to be an out-of-date one.
>>>
>>> Today I was able to install netcdf4 for Python 2 using conda, and so far
>>> it is working well. My problem is temporarily solved, but I am curious to
>>> know whether I am using two numpy and netCDF4 versions, for Python 2 and
>>> 3 respectively.
>>>
>>>
>>> The underlying C lib can be the same, but the Python extension is
>>> specific to both Python and Numpy versions. Isn't it nice that conda
>>> handles this for you?
>>>
>>> -CHB
>>>
>>>
>>> Thanks
>>> Nuncio
>>>
>>> On Friday, May 27, 2016, Elizabeth A. Fischer <
>>> elizabeth.fischer@xxxxxxxxxxxx> wrote:
>>>
>>>> How did you install these?
>>>> On May 27, 2016 2:55 AM, "nuncio m" <nuncio.m@xxxxxxxxx> wrote:
>>>>
>>>>> Python 3 numpy is 1.9 and python3-netcdf4 is 1.2.2.  I think it's a bit
>>>>> complicated here because I have both Python 2 and Python 3 on my system.
>>>>> nuncio
>>>>>
>>>>> On Mon, May 23, 2016 at 9:40 PM, Chris Barker <chris.barker@xxxxxxxx>
>>>>> wrote:
>>>>>
>>>>>> On Mon, May 23, 2016 at 4:19 AM, Sudheer Joseph <sjo.india@xxxxxxxxx>
>>>>>> wrote:
>>>>>>
>>>>>>>  You have to install the correct version of numpy, i.e. the one used
>>>>>>> by the NetCDF4 library,
>>>>>>>
>>>>>>
>>>>>> or a correct build of NetCDF4 that matches your numpy version :-)
>>>>>>
>>>>>> I HIGHLY recommend using Anaconda / conda to manage your python
>>>>>> packages -- it specifically supports keeping numpy versions in sync.
>>>>>>
>>>>>> -CHB
>>>>>>
>>>>>>
>>>>>> --
>>>>>>
>>>>>> Christopher Barker, Ph.D.
>>>>>> Oceanographer
>>>>>>
>>>>>> Emergency Response Division
>>>>>> NOAA/NOS/OR&R            (206) 526-6959   voice
>>>>>> 7600 Sand Point Way NE   (206) 526-6329   fax
>>>>>> Seattle, WA  98115       (206) 526-6317   main reception
>>>>>>
>>>>>> Chris.Barker@xxxxxxxx
>>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> Nuncio.M
>>>>> Scientist
>>>>> National Center for Antarctic and Ocean research
>>>>> Head land Sada
>>>>> Vasco da Gamma
>>>>> Goa-403804
>>>>> ph off 91 832 6551117
>>>>> ph: cell 91 9890357423
>>>>>
>>>>>
>>>>
>>>
>>> --
>>> Nuncio.M
>>> Scientist
>>> National Center for Antarctic and Ocean research
>>> Head land Sada
>>> Vasco da Gamma
>>> Goa-403804
>>> ph off 91 832 6551117
>>> ph: cell 91 9890357423
>>>
>>>
>>>
>>>
>>
>
>
> --
>
> Christopher Barker, Ph.D.
> Oceanographer
>
> Emergency Response Division
> NOAA/NOS/OR&R            (206) 526-6959   voice
> 7600 Sand Point Way NE   (206) 526-6329   fax
> Seattle, WA  98115       (206) 526-6317   main reception
>
> Chris.Barker@xxxxxxxx
>