
Utility package for submitting data to the 4DN Data Portal


Submit 4DN - Data Submitter Tools


The Submit4DN package is written by the 4DN Data Coordination and Integration Center for data submitters from the 4DN Network. Please contact us to get access to the system, or if you have any questions or suggestions. Detailed documentation on data submission can be found in the 4DN submission documentation.

Installing the package

pip install submit4dn

To upgrade to the latest version

pip install submit4dn --upgrade

Troubleshooting

This package is not supported on older Python versions; it is supported and tested on Python 3.9 - 3.12. It may work with other Python versions, but your mileage may vary.

It is recommended to install this package in a virtual environment to avoid dependency clashes.

Problems have been reported on recent macOS and Windows versions having to do with the inability to find libmagic, a C library for checking file types that is used by the python-magic library, e.g.:

ImportError: failed to find libmagic. Check your installation

The first thing to try is:

pip uninstall python-magic
pip install python-magic

If that doesn't work, one solution that has worked for some users is:

pip uninstall python-magic
pip install python-magic-bin==0.4.14

Others have had success using homebrew to install libmagic:

brew install libmagic
brew link libmagic  # if the link already exists this command will fail; don't worry about that
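After any of the fixes above, you can check whether the libmagic C library is now discoverable with a quick stdlib snippet (this check is an illustration, not part of Submit4DN):

```python
# Check whether the libmagic C library is discoverable on this system;
# python-magic needs it at import time.
from ctypes.util import find_library

location = find_library("magic")
if location is None:
    print("libmagic not found -- python-magic will fail to import")
else:
    print(f"libmagic found: {location}")
```

`find_library` returns the name or path of the shared library if the loader can locate it, or `None` otherwise.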

Additionally, problems have been reported on Windows when installing Submit4DN inside a virtual environment, due to awscli trying to use the global Python instead of the Python inside the virtual environment.

Because it is actually fine for awscli to use the Python outside the virtual environment, the workaround is to install awscli in the global environment before entering the virtual environment. If you discover the problem after you are already inside, deactivate the environment, install awscli, and re-enter:

deactivate
pip install awscli
VENV\scripts\activate  # replace VENV with your virtual environment name
aws --version  # this is to test that awscli is now installed correctly

Connecting to the Data Portal

To be able to use the provided tools, you need to generate an AccessKey on the data portal. If you do not yet have access, please contact 4DN Data Wranglers to get an account and learn how to generate and save a key.
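The key file is a small JSON document keyed by the key-name you later pass with --key. A minimal sketch of writing and reading such a file follows; the "default"/key/secret/server layout matches the keypairs.json convention used by 4DN tools, and the credential values here are placeholders, so verify the exact format against the portal documentation:

```python
import json
import os
import tempfile

# Hypothetical credentials; a real key and secret come from the portal's
# Access Key page. The layout below is the keypairs.json convention.
keypairs = {
    "default": {
        "key": "ABCDEFGH",
        "secret": "not-a-real-secret",
        "server": "https://data.4dnucleome.org",
    }
}

# Written to a temp dir here; the tools look in ~/keypairs.json by default
path = os.path.join(tempfile.mkdtemp(), "keypairs.json")
with open(path, "w") as fh:
    json.dump(keypairs, fh, indent=4)

# Reading it back and selecting the "default" entry, as a tool would
with open(path) as fh:
    creds = json.load(fh)["default"]
print(creds["server"])  # https://data.4dnucleome.org
```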

Generating data submission forms

To create the data submission excel workbook, you can use get_field_info.

It will accept the following parameters:

    --keyfile        the path to the file where you have stored your access key info (default ~/keypairs.json)
    --key            the name of the key identifier for the access key and secret in your keys file (default=default)
    --type           the name of an item type to add as a sheet to the workbook (repeat for each sheet you want)
    --nodesc         do not add the descriptions in the second line (by default they are added)
    --noenums        do not add the list of options for a field if they are specified (by default they are added)
    --comments       adds any (usually internal) comments together with enums (by default False)
    --outfile        change the default file name "fields.xlsx" to a specified one
    --debug          to add more debugging output
    --noadmin        if you have admin access to 4DN this option lets you generate the sheet as a non-admin user

Examples generating a single sheet:

get_field_info --type Biosample
get_field_info --type Biosample --comments
get_field_info --type Biosample --comments --outfile biosample.xlsx

Example Workbook with all sheets:

get_field_info --outfile MetadataSheets.xlsx

Examples for Workbooks using a preset option:

get_field_info --type HiC --comments --outfile exp_hic_generic.xlsx
get_field_info --type ChIP-seq --comments --outfile exp_chipseq_generic.xlsx
get_field_info --type FISH --comments --outfile exp_fish_generic.xlsx

Current presets include: Hi-C, ChIP-seq, Repli-seq, ATAC-seq, DamID, ChIA-PET, Capture-C, FISH, SPT

Data submission

Please refer to the submission guidelines and become familiar with the metadata structure prior to submission.

After you fill out the data submission forms, you can use import_data to submit the metadata. The method can be used both to create new metadata items and to patch fields of existing items.

	import_data filename.xlsx

Uploading vs Patching

Running import_data without one of the flags described below will perform a dry run that includes several validation checks. It is strongly recommended to do a dry run prior to actual submission and, if necessary, work with a Data Wrangler to correct any errors.

If there are uuid, alias, @id, or accession fields in the excel form that match existing entries in the database, you will be asked whether you want to PATCH each object. Use the --patchall flag if you want to patch ALL objects in your document and skip that prompt.

If no object identifiers are found in the document, you need to use --update for POSTing to occur.
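The decision logic above can be sketched as follows. This is a simplified illustration, not Submit4DN's actual implementation; `action_for_row` and `exists_in_db` are hypothetical names:

```python
# Fields whose values can identify an existing item on the portal
IDENTIFYING_FIELDS = ("uuid", "alias", "@id", "accession")

def action_for_row(row, exists_in_db, patchall=False, update=False):
    """Sketch of what import_data decides for one workbook row."""
    ids = {f: row[f] for f in IDENTIFYING_FIELDS if row.get(f)}
    if ids and exists_in_db(ids):
        # Existing item: prompted per object unless --patchall was given
        return "PATCH" if patchall else "ask, then PATCH"
    # New item: a POST only happens when --update was given
    return "POST" if update else "skip (dry run)"

# A row whose alias matches an existing item is patched without prompting
# when --patchall is in effect
print(action_for_row({"alias": "some-lab:sample1"}, lambda ids: True, patchall=True))   # PATCH
# A brand-new row is only POSTed when --update is supplied
print(action_for_row({"alias": "some-lab:sample2"}, lambda ids: False, update=True))    # POST
```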

Other Helpful Advanced parameters

Normally you are asked to verify the Lab and Award that you are submitting for. In some cases it may be desirable to skip this prompt so a submission can be run by a scheduler or in the background:

--remote is an option that will skip any prompt before submission

However if you submit for more than one Lab or there is more than one Award associated with your lab you will need to specify these values as parameters using --lab and/or --award followed by the uuids for the appropriate items.

Development

Note: if you are attempting to run the scripts in the wranglertools directory without installing the package, then in order to get the correct sys.path you need to run the scripts from the parent directory using the following command format:

python -m wranglertools.get_field_info --type Biosource
python -m wranglertools.import_data filename.xlsx

The PyPI page is https://pypi.python.org/pypi/Submit4DN

Submit4DN is packaged with poetry. New versions can be released and published to PyPI using poetry publish.

Pytest

Every function is covered by pytest tests. They can be run in a terminal from the submit4dn folder with:

py.test

Some tests need internet access and are labeled with the "webtest" mark.

Some tests perform file operations and are labeled with the "file_operation" mark.

To run only the marked tests, or to exclude them, you can use the following commands:

# Run all tests
py.test

# Run only webtest
py.test -m webtest

# Run only tests with file_operation
py.test -m file_operation

# skip tests that use ftp (do this when testing locally)
py.test -m "not ftp"
