rithwiks committed on
Commit aaf6321
2 Parent(s): 691e120 da1c960

Merge branch 'main' of https://huggingface.co/datasets/AstroCompress/GBI-16-4D into main

Files changed (2)
  1. GBI-16-4D.py +1 -1
  2. README.md +28 -23
GBI-16-4D.py CHANGED
@@ -47,7 +47,7 @@ _REPO_ID = "AstroCompress/GBI-16-4D"
 class GBI_16_4D(datasets.GeneratorBasedBuilder):
     """GBI-16-4D Dataset"""
 
-    VERSION = datasets.Version("1.0.2")
+    VERSION = datasets.Version("1.0.3")
 
     BUILDER_CONFIGS = [
         datasets.BuilderConfig(
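The only functional change in the builder script is the version bump from `1.0.2` to `1.0.3`. As a rough illustration of why a builder carries a version string at all (a sketch using plain tuples, not the internals of `datasets.Version` — the library handles cache resolution itself), an ordered version gives downstream caches a key for detecting stale, previously generated data:

```python
# Sketch: compare semantic version strings as integer tuples.
# This mimics (but is not) how datasets.Version ordering behaves;
# the point is that "1.0.3" > "1.0.2" lets a cache detect staleness.
def parse_version(v: str) -> tuple:
    return tuple(int(part) for part in v.split("."))

old, new = parse_version("1.0.2"), parse_version("1.0.3")
assert new > old  # a bumped VERSION signals regenerated dataset files
```

Tuple comparison is element-wise, so this also orders `1.0.10` after `1.0.9`, which naive string comparison would not.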
README.md CHANGED
@@ -56,35 +56,16 @@ pip install datasets astropy
 
 There are two datasets: `tiny` and `full`, each with `train` and `test` splits. The `tiny` dataset has 2 4D images in the `train` and 1 in the `test`. The `full` dataset contains all the images in the `data/` directory.
 
-## Use from Huggingface Directly
+## Local Use (RECOMMENDED)
 
-To directly use from this data from Huggingface, you'll want to log in on the command line before starting python:
+Alternatively, you can clone this repo and use directly without connecting to hf:
 
 ```bash
-huggingface-cli login
-```
-
-or
-
-```
-import huggingface_hub
-huggingface_hub.login(token=token)
-```
-
-Then in your python script:
-
-```python
-from datasets import load_dataset
-dataset = load_dataset("AstroCompress/GBI-16-4D", "tiny")
-ds = dataset.with_format("np")
+git clone https://huggingface.co/datasets/AstroCompress/GBI-16-4D
 ```
 
-## Local Use
-
-Alternatively, you can clone this repo and use directly without connecting to hf:
-
 ```bash
-git clone https://huggingface.co/datasets/AstroCompress/GBI-16-4D
+git lfs pull
 ```
 
 Then `cd GBI-16-4D` and start python like:
@@ -102,3 +83,27 @@ ds["test"][0]["image"].shape # -> (55, 5, 800, 800)
 ```
 
 Note of course that it will take a long time to download and convert the images in the local cache for the `full` dataset. Afterward, the usage should be quick as the files are memory-mapped from disk.
+
+
+## Use from Huggingface Directly
+
+To directly use from this data from Huggingface, you'll want to log in on the command line before starting python:
+
+```bash
+huggingface-cli login
+```
+
+or
+
+```
+import huggingface_hub
+huggingface_hub.login(token=token)
+```
+
+Then in your python script:
+
+```python
+from datasets import load_dataset
+dataset = load_dataset("AstroCompress/GBI-16-4D", "tiny")
+ds = dataset.with_format("np")
+```
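The README note being moved above says later access is quick because the converted files are memory-mapped from disk. As a standard-library-only sketch of what memory mapping buys you (the file name and contents here are made up for the demo; `datasets` does this via Arrow, not `mmap` directly), random access touches only the pages you read rather than loading the whole file:

```python
# Sketch of why memory-mapped reads are cheap after the first pass:
# the OS pages data in on demand instead of copying the whole file.
import mmap
import os
import tempfile

# Stand-in file for the converted image data (hypothetical name).
path = os.path.join(tempfile.mkdtemp(), "demo.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)

with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        # Random access into the middle of the file, no full read.
        chunk = mm[100:108]
        assert len(chunk) == 8
```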