Commit 29aada2

updated submodule 00 and 01 in AWS
1 parent ebe56e9 commit 29aada2

2 files changed

Lines changed: 39 additions & 56 deletions


AWS/Submodule_00_background.ipynb

Lines changed: 8 additions & 4 deletions
@@ -109,7 +109,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"display_quiz(\"../quiz-material/00-pc1.json\")"
+"display_quiz(\"Transcriptome-Assembly-Refinement-and-Applications/quiz-material/00-pc1.json\")"
 ]
 },
 {
@@ -281,7 +281,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"display_quiz(\"../quiz-material/00-cp1.json\", shuffle_questions = True)"
+"display_quiz(\"Transcriptome-Assembly-Refinement-and-Applications/quiz-material/00-cp1.json\", shuffle_questions = True)"
 ]
 },
 {
@@ -407,7 +407,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"display_quiz(\"../quiz-material/00-cp2.json\", shuffle_questions = True)"
+"display_quiz(\"Transcriptome-Assembly-Refinement-and-Applications/quiz-material/00-cp2.json\", shuffle_questions = True)"
 ]
 },
 {
@@ -444,7 +444,11 @@
 ]
 }
 ],
-"metadata": {},
+"metadata": {
+"language_info": {
+"name": "python"
+}
+},
 "nbformat": 4,
 "nbformat_minor": 5
 }
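A quick way to see why this commit rewrites the quiz paths: a relative path like `../quiz-material/00-pc1.json` is resolved against the notebook's current working directory, which on AWS SageMaker is `/home/ec2-user/SageMaker` (per the Step 1 change in Submodule_01 below), so the old `../` form escapes the working tree entirely. A minimal sketch, assuming that working directory (the exact notebook location is hypothetical):

```python
import posixpath

# CWD taken from the Submodule_01 diff; the old path climbs out of it.
cwd = "/home/ec2-user/SageMaker"

old = posixpath.normpath(posixpath.join(cwd, "../quiz-material/00-pc1.json"))
new = posixpath.normpath(posixpath.join(
    cwd,
    "Transcriptome-Assembly-Refinement-and-Applications/quiz-material/00-pc1.json"))

print(old)  # /home/ec2-user/quiz-material/00-pc1.json -- outside SageMaker/
print(new)  # stays inside the cloned repository directory
```

The repo-rooted path only works when the notebook is launched from the directory containing the clone, which is what the `%cd /home/ec2-user/SageMaker` step below enforces.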

AWS/Submodule_01_prog_setup.ipynb

Lines changed: 31 additions & 52 deletions
@@ -33,9 +33,9 @@
 " * Mambaforge (a package manager for bioinformatics tools).\n",
 " * `sra-tools`, `perl-dbd-sqlite`, and `perl-dbi` (specific bioinformatics packages).\n",
 " * Nextflow (a workflow management system).\n",
-" * `gsutil` (for interacting with Google Cloud Storage).\n",
+" * `aws s3` (for interacting with AWS S3 Storage).\n",
 "\n",
-"3. **Download and organize necessary data:** Students will download the TransPi transcriptome assembly software and its associated resources (databases, scripts, configuration files) from a Google Cloud Storage bucket. This includes understanding the directory structure and file organization.\n",
+"3. **Download and organize necessary data:** Students will download the TransPi transcriptome assembly software and its associated resources (databases, scripts, configuration files) from an S3 bucket. This includes understanding the directory structure and file organization.\n",
 "\n",
 "4. **Manage file permissions:** Students will use the `chmod` command to set executable permissions for the necessary files and directories within the TransPi software.\n",
 "\n",
@@ -53,7 +53,7 @@
 "* **Shell Access:** The ability to execute shell commands from within the Jupyter Notebook environment (using `!` and `%`).\n",
 "* **Java Development Kit (JDK):** Required for Nextflow.\n",
 "* **Miniforge** A package manager for installing bioinformatics tools.\n",
-"* **`gsutil`:** The Google Cloud Storage command-line tool. This is crucial for downloading data from Google Cloud Storage."
+"* **`aws s3`:** The AWS command-line tool. This is crucial for downloading data from an S3 storage bucket."
 ]
 },
 {
@@ -104,7 +104,7 @@
 "## Time to begin!\n",
 "\n",
 "**Step 1:** To start, make sure that you are in the right starting place with a `cd`.\n",
-"> `pwd` prints our current local working directory. Make sure the output from the command is: `/home/jupyter`"
+"> `pwd` prints our current local working directory. Make sure the output from the command is: `/home/ec2-user/SageMaker`"
 ]
 },
 {
@@ -114,7 +114,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"%cd /home/jupyter"
+"%cd /home/ec2-user/SageMaker"
 ]
 },
 {
@@ -147,50 +147,12 @@
 "! java -version"
 ]
 },
-{
-"cell_type": "markdown",
-"id": "7b3ffb16-3395-4c01-9774-ee568e815490",
-"metadata": {},
-"source": [
-"**Step 3:** Install Miniforge (a package manager), which is needed to support the information held within the TransPi databases."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"id": "ac5b204a-f0db-4ceb-bf37-57eca6d77974",
-"metadata": {},
-"outputs": [],
-"source": [
-"! curl -L -O https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge3-$(uname)-$(uname -m).sh\n",
-"! bash Miniforge3-$(uname)-$(uname -m).sh -b -p $HOME/miniforge"
-]
-},
-{
-"cell_type": "markdown",
-"id": "c5584e2e",
-"metadata": {},
-"source": [
-"Next, add it to the path."
-]
-},
-{
-"cell_type": "code",
-"execution_count": null,
-"id": "ad030cd1",
-"metadata": {},
-"outputs": [],
-"source": [
-"import os\n",
-"os.environ[\"PATH\"] += os.pathsep + os.environ[\"HOME\"]+\"/miniforge/bin\""
-]
-},
 {
 "cell_type": "markdown",
 "id": "7b930ad7",
 "metadata": {},
 "source": [
-"Next, using Miniforge and bioconda, install the tools that will be used in this tutorial."
+"**Step 3:** Using Mamba and bioconda, install the tools that will be used in this tutorial."
 ]
 },
 {
@@ -239,7 +201,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"! gsutil -m cp -r gs://nigms-sandbox/nosi-inbremaine-storage/TransPi ./"
+"! aws s3 cp --recursive s3://nigms-sandbox/nosi-inbremaine-storage/TransPi ./TransPi"
 ]
 },
 {
@@ -249,10 +211,10 @@
 "source": [
 "<div class=\"alert alert-block alert-success\">\n",
 " <i class=\"fa fa-hand-paper-o\" aria-hidden=\"true\"></i>\n",
-" <b>Note: </b> gsutil\n",
+" <b>Note: </b> aws\n",
 "</div>\n",
 "\n",
-">`gsutil` is a tool allows you to interact with Google Cloud Storage through the command line."
+">`aws s3` is a tool allows you to interact with S3 Storage through the command line."
 ]
 },
 {
@@ -277,7 +239,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-"! gsutil -m cp -r gs://nigms-sandbox/nosi-inbremaine-storage/resources ./"
+"! aws s3 cp --recursive s3://nigms-sandbox/nosi-inbremaine-storage/resources ./resources"
 ]
 },
 {
@@ -302,8 +264,7 @@
 "> - They can also be stacked so `../../` will take you two layers up.\n",
 ">\n",
 ">- If you were to type `!ls ./nextWeek/` it would return the contents of the `nextWeek` directory which is one layer down from the current directory, so it would return `manyThings.txt`.\n",
-">\n",
-">**This means that in the second line of the code cell above, the file `TransPi.nf` will be copied from the Google Cloud Storage bucket to the current directory.**"
+">"
 ]
 },
 {
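The relative-path rules described in the hunk above (`..` for one layer up, `./nextWeek/` for one layer down) can be sketched with a throwaway directory tree; the `nextWeek/manyThings.txt` layout below is the hypothetical example from the notebook text, recreated locally:

```python
import os
import tempfile

# Build the example layout: <root>/nextWeek/manyThings.txt
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "nextWeek"))
open(os.path.join(root, "nextWeek", "manyThings.txt"), "w").close()

os.chdir(root)
print(os.listdir("./nextWeek/"))  # one layer down -> ['manyThings.txt']

os.chdir("./nextWeek/")
# ".." resolves one layer up, back to the directory we started in.
print(os.path.realpath("..") == os.path.realpath(root))  # True
```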
@@ -377,7 +338,7 @@
 "outputs": [],
 "source": [
 "from jupyterquiz import display_quiz\n",
-"display_quiz(\"../quiz-material/01-cp1.json\", shuffle_questions = True)"
+"display_quiz(\"Transcriptome-Assembly-Refinement-and-Applications/quiz-material/01-cp1.json\", shuffle_questions = True)"
 ]
 },
 {
@@ -401,7 +362,25 @@
 ]
 }
 ],
-"metadata": {},
+"metadata": {
+"kernelspec": {
+"display_name": "conda_python3",
+"language": "python",
+"name": "conda_python3"
+},
+"language_info": {
+"codemirror_mode": {
+"name": "ipython",
+"version": 3
+},
+"file_extension": ".py",
+"mimetype": "text/x-python",
+"name": "python",
+"nbconvert_exporter": "python",
+"pygments_lexer": "ipython3",
+"version": "3.10.16"
+}
+},
 "nbformat": 4,
 "nbformat_minor": 5
 }
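One subtle point in the `gsutil` → `aws s3` swap in this commit: `gsutil cp -r gs://…/TransPi ./` creates `./TransPi` itself, whereas `aws s3 cp --recursive s3://…/TransPi ./TransPi` copies the objects *under* the source prefix into whatever destination is given, which is why the diff names `./TransPi` and `./resources` explicitly. A minimal sketch of that key-to-path mapping, with no network access (the key name `TransPi.nf` is taken from the notebook text; the mapping function is an illustration, not the AWS CLI's code):

```python
import posixpath

def s3_cp_recursive_dest(prefix: str, key: str, dest: str) -> str:
    """Illustrative model of how `aws s3 cp --recursive` places files:
    the portion of the object key after the source prefix is joined
    onto the local destination path."""
    rel = key[len(prefix):].lstrip("/")
    return posixpath.join(dest, rel)

prefix = "nosi-inbremaine-storage/TransPi"
key = "nosi-inbremaine-storage/TransPi/TransPi.nf"

# Destination "./" would scatter files into the current directory:
print(s3_cp_recursive_dest(prefix, key, "./"))         # ./TransPi.nf
# Destination "./TransPi", as in the diff, keeps them contained:
print(s3_cp_recursive_dest(prefix, key, "./TransPi"))  # ./TransPi/TransPi.nf
```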
