
Combine PineAPPL grids and EKOs into FK tables


pineko = PineAPPL + eko

pineko converts

  • interpolation grids for theory predictions ('grids' for short) in the form of PineAPPL grids, together with
  • Evolution Kernel Operators (EKO) generated by eko

into fast-kernel (FK) tables. The collection of all FK tables constitutes the theory predictions for a PDF fit and is therefore often simply called a 'theory'.

pineko replaces APFELcomb, which was used up to NNPDF4.0.

Prerequisites

Generating a 'theory', as defined above, requires several files which are described next.

pineko.toml

You need to provide a pineko.toml that sets all necessary paths to the input and output folders; look at the debug example in this repository [1].
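As a minimal sketch, a pineko.toml might look as follows. Only the `grids` and `theory_cards` path keys are referenced in this description; the remaining key names and all values are assumptions for illustration.

```toml
# Hypothetical sketch of a pineko.toml.
# Only [paths].grids and [paths].theory_cards appear in this document;
# the other key names and all values are illustrative assumptions.
[paths]
ymldb = "data/ymldb"                # dataset -> FK table mapping (assumed key)
grids = "data/grids"                # PineAPPL grids, one subfolder per theory ID
theory_cards = "data/theory_cards"  # theory runcards named by theory ID
```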

ymldb

You need all files of the ymldb [2]; look at the respective load.sh script to load them from dom. This defines the mapping from datasets to FK tables.

Theory Runcards

You need to provide the necessary theory runcards named with their respective theory ID inside the <paths.theory_cards> folder [3].

Default Operator Card

You need to provide a default operator card for eko [4]; look at the respective load.sh script to load it from dom.

Grids

pineko does NOT compute grids; they are instead an expected input. There are typically two ways to obtain grids: computing them from scratch with runcards, or reusing existing ones.

Generate new Grids with rr

You need to run rr with a given theory runcard and put the generated grid file with the same name inside the <paths.grids>/<theory_id> folder. The name has to match the ymldb, which is the case by default.
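For illustration, assuming a hypothetical theory ID 208 and a dataset called DATASET1 (and the usual `.pineappl.lz4` grid extension, assumed here), the expected layout would be:

```
<paths.grids>/
└── 208/
    └── DATASET1.pineappl.lz4   # same name as in the ymldb
```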

Inherit Grids from Existing Theory

You can reuse the grids from a different theory by running pineko theory inherit-grids SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 .... The relation between the source theory and the target theory is non-trivial [5].

Running pineko

Running pineko consists of two steps, each of them potentially computationally expensive: computing the EKO and convoluting the EKO with the grid.

Computing the EKO

Generating new EKOs

This is a two-step process:

  1. Generate the necessary operator cards with pineko theory opcards THEORY_ID DATASET1 DATASET2 ...
  2. Generate the actual EKOs with pineko theory ekos THEORY_ID DATASET1 DATASET2 ...

Inherit EKOs from Existing Theory

You can reuse the EKOs from a different theory by running pineko theory inherit-ekos SOURCE_THEORY_ID TARGET_THEORY_ID DATASET1 DATASET2 .... The relation between the source theory and the target theory is non-trivial [6].

Generating the FK Table

You need the EKOs computed in the previous step. Then you can convolute them with the grids by running pineko theory fks THEORY_ID DATASET1 DATASET2 ...
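Putting the commands above together, a full run for one theory can be sketched as follows; the theory ID 208 and the dataset names are placeholders to be substituted with your own.

```sh
# Placeholder theory ID and dataset names; substitute your own.
pineko theory opcards 208 DATASET1 DATASET2   # 1. generate the operator cards
pineko theory ekos 208 DATASET1 DATASET2      # 2. compute the EKOs
pineko theory fks 208 DATASET1 DATASET2       # 3. convolute EKOs with grids into FK tables
```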


[1] Eventually we should provide a concise description here instead - but let's wait for things to be stable first.

[2] This is to be replaced by the new CommonData format.

[3] This is to be replaced by a binding to the true theory DB.

[4] We are considering how to improve this, e.g. by providing a study of the interpolation accuracy; at the moment they are all just equal.

[5] Examples being SV (scale variations), different evolution settings, etc.

[6] Examples being SV (scale variations), different DIS settings, etc.

