Exercise set 2
Deadline: 11-3-2025 10:00
Note that these networks will run substantially faster on dedicated hardware, available via Google Colab (for which you can create a free account) or SURF (for which you should have received an account).

Exercise 1:

In the 1998 paper by LeCun et al., entitled Gradient-based learning applied to document recognition, the LeNet-5 convolutional neural network is introduced. This was one of the first trainable CNNs and a landmark paper in the world of deep learning (50k+ citations). From the figure above, and from the information provided in the paper, please explain the number of trainable parameters for each layer:
C1: 156
S2: 12
C3: 1,516 (check Table 1!)
S4: 32
C5: 48,120
F6: 10,164
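
As a worked check (all numbers follow from the layer definitions in the paper; the C3 breakdown uses the connection scheme of Table 1): each convolutional layer has k x k x C_in weights plus one bias per output map, and each subsampling layer has one trainable coefficient and one bias per map:
C1: 6 x (5x5x1 + 1) = 6 x 26 = 156
S2: 6 x (1 + 1) = 12
C3: 6 x (3x25 + 1) + 9 x (4x25 + 1) + 1 x (6x25 + 1) = 456 + 909 + 151 = 1,516
S4: 16 x (1 + 1) = 32
C5: 120 x (16x25 + 1) = 120 x 401 = 48,120
F6: 84 x (120 + 1) = 84 x 121 = 10,164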

Exercise 2: U-Net
In a U-Net, one has to make sure that the dimensions of the filters work out. In this exercise, you will calculate the dimensions of different layers within a U-Net. Below, we have drawn a toy U-Net.

A: Assuming the input image has spatial dimensions 69x69x34 (x1 feature channel), what will be the dimensions of A-C? Are the dimensions of D defined? Why (not)? Where padding is applied, assume a value of 1 in each direction.
Tip: if you do not know what stride and zero padding mean, please look them up; the sketch after part B also gives the standard output-size formula.
B: The U-Net shown above is fairly useless, especially as the input and output dimensions do not match. In practice, U-Nets are built more symmetrically: most modern U-Net architectures use zero padding at each step, and typically stride 1x1, so that the resolution of the output matches that of the input. Nevertheless, it is important to check the dimensions of your input image against the strided / non-zero-padded convolutions and pooling operations, to ensure the dimensions match. In the 2D U-Net architecture below (taken from the original U-Net paper, cited almost 60k times), what would be the minimum size of the input image? Assume 3x3 max pooling (stride 3x3), stride 1x1 for all conv operations, and assume zero padding is used.
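
Not part of the hand-in, but a minimal sketch you can use to check your answers to A and B. It implements the standard output-size formula out = floor((n + 2p - k) / s) + 1; the example layer parameters below (3x3 conv with stride 2 and padding 1, then 2x2 pooling) are illustrative assumptions, not the layers of the toy U-Net:

import math

def out_size(n, k, s=1, p=0):
    # spatial output size of a convolution or pooling layer along one axis:
    # floor((n + 2*p - k) / s) + 1
    return math.floor((n + 2 * p - k) / s) + 1

# example: one 69-pixel axis through a 3x3 conv (stride 2, padding 1),
# followed by 2x2 max pooling (stride 2)
n = out_size(69, k=3, s=2, p=1)   # -> 35
n = out_size(n, k=2, s=2)         # -> 17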

Exercise 3: CNN (week 4)
On Canvas (SURF: Deep Learning for Medical Image Analysis) you can find how to connect to Snellius (a SURF service).
Exercise 2 in the Git repository contains code (ISIC 2019) that loads the challenge data and trains a classification network. Study the code on your local machine (e.g. in Spyder or PyCharm). You can run a local copy to investigate the data, but your computer will probably not support training the network. Note that you need to download the data and point to the right folder on your computer in line 51 to run it locally:
#set data location on your local computer. Data can be downloaded from:
# https://surfdrive.surf.nl/files/index.php/s/epjCz4fip1pkWN7
# PW: deeplearningformedicalimaging
data_dir = r'C:\scratch\Surf\Documents\Onderwijs\DeepLearning_MedicalImaging\opgaven\opgave 2\AI-Course_StudentChallenge\data\classification'  # raw string, so the backslashes are not read as escape sequences

You can also choose to run a copy on Google Colab for debugging purposes, but we advise you to run most of the work on Snellius, which has dedicated hardware.
Note that you will need to log in to W&B first, using an interactive slurm session on Snellius:
srun --partition=rome --ntasks=1 --cpus-per-task=9 --time=00:10:00 --pty bash -i
module purge
module load 2023
module load PyTorch/2.1.2-foss-2023a-CUDA-12.1.1

source /gpfs/work5/0/prjs1312/venv/bin/activate
wandb login
then Ctrl+D (or exit) to leave the interactive session.

Copy your code to Snellius and run it (using slurm) to train the classification network (main_CNN.py); a minimal example job script is sketched below. Congratulations, you have trained your first network. But how well did it perform?
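
For reference, a minimal sketch of a slurm job script (the partition name, GPU request, and time limit are assumptions; check the Snellius documentation. The module and venv lines are the same as in the interactive session above):

#!/bin/bash
#SBATCH --partition=gpu            # assumed partition name; verify for Snellius
#SBATCH --gpus=1                   # one GPU for training
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=9
#SBATCH --time=04:00:00            # assumed time limit; adjust to your run

module purge
module load 2023
module load PyTorch/2.1.2-foss-2023a-CUDA-12.1.1
source /gpfs/work5/0/prjs1312/venv/bin/activate

python main_CNN.py

Submit it with sbatch and monitor it with squeue.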
First, you can try to check the training progress in W&B.

To assess the final performance, you will want to specify where the model is saved during runtime, using
python main_CNN.py --checkpoint_folder_save path/to/folder

and then run the same script, but pass the --checkpoint_folder_path input to the command:
python main_CNN.py --checkpoint_folder_path path/to/folder
You will try to improve the network's performance. Feel free to boast about your performance and compare it to your peers' in the Canvas discussion Boasting CNN and U-net.
This exercise (A-D) should be handed in as:
- A short (max 2 A4 pages of text) scientific report containing at least:
o A description of what you implemented and why
o The results (how did it change the network's performance?)
o An interpretation of the results: why do you expect certain approaches worked better than others?
- Some additional figures/tables are welcome, but only those that are relevant:
o Give tables/figures a caption
o Refer to each figure/table from the main text, so that we know what we are looking at, why it is relevant, and how to interpret the data
- The code you used (especially when implementing new features)
Note that you will probably not be able to do all suggestions under A-D.
A: Change the network structure in CNNs.py. For example, you can add layers, change convolutional filter sizes, change activation functions, or add skip layers (a sketch is given below). Try to develop a network that performs better than the provided one. Please consider which metrics are relevant, and to which data (train/val/test?) they apply, for the statements you are making. Present the data in an easy and understandable fashion; uploading tens of training curves for all metrics is generally not needed for your message.
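
For instance (a sketch; the block below is hypothetical and not taken from CNNs.py, and its channel count is up to you), an extra convolutional block with a skip connection could look like:

import torch.nn as nn

class ConvSkipBlock(nn.Module):
    # hypothetical extra block: 3x3 conv + batch norm + ReLU,
    # with a skip (residual) connection around it
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # padding=1 preserves the spatial size, so x can be added back in
        return self.act(x + self.bn(self.conv(x)))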
B: Tune the training parameters and hyperparameters (e.g. number of epochs, learning rate, loss function, optimizer, schedulers) to optimize the network. For most of these, this can be done by passing additional command-line arguments, e.g.
python main_CNN.py --optimizer_lr <lr> --batch_size <batch_size> --optimizer_name <optimizer> --max_epochs <epochs>
where <lr> is your desired learning rate, <batch_size> your desired batch size, etc.
You may also want to try different losses, which may require some programming. Currently, the loss is defined as:
loss = F.binary_cross_entropy_with_logits(y_hat, y.float())
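
For instance (a sketch; the weight value is a made-up assumption to illustrate countering class imbalance, and torch / F are assumed to be imported as in the provided code), you could weight the positive class within the same loss:

# weight positive samples more heavily, e.g. to counter class imbalance;
# the factor 3.0 is an illustrative assumption, tune it on your data
pos_weight = torch.full_like(y_hat, 3.0)
loss = F.binary_cross_entropy_with_logits(y_hat, y.float(), pos_weight=pos_weight)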

C: The current data augmentation consists of simple rotations. You can add additional data augmentation to improve the network's performance/generalizability. To do so, look at Data_loader.py from line 127 onwards; currently, this shows the rotation example. You can either adapt this to include additional augmentations, or write your own augmentation. Note that the augmentation is applied to the data at lines 21 and 49 (to the masks) of that same file:
if transform:
    self.train_transforms = transforms.Compose([Random_Rotate(0.1), transforms.ToTensor()])
else:
    self.train_transforms = transforms.Compose([transforms.ToTensor()])
self.val_transforms = transforms.Compose([transforms.ToTensor()])
and
if transform:
    self.train_transforms = transforms.Compose([Random_Rotate_Seg(0.1), ToTensor_Seg()])
else:
    self.train_transforms = transforms.Compose([ToTensor_Seg()])
self.val_transforms = transforms.Compose([ToTensor_Seg()])
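
As an illustration of writing your own augmentation (a sketch; it assumes the sample arrives as a numpy array before ToTensor, so mimic the interface of the provided Random_Rotate class, and remember that for segmentation the image and mask must be flipped together):

import random
import numpy as np

class Random_Flip(object):
    # hypothetical augmentation: horizontal flip with probability p
    def __init__(self, p=0.5):
        self.p = p

    def __call__(self, sample):
        # assumes sample is a numpy array of shape (H, W, C)
        if random.random() < self.p:
            sample = np.ascontiguousarray(np.fliplr(sample))
        return sample

It could then be inserted into the pipeline, e.g. transforms.Compose([Random_Rotate(0.1), Random_Flip(0.5), transforms.ToTensor()]).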

If you perform augmentation, please add examples of augmented images to your report and discuss them: what do we see, and why?
D: Transfer-learn from an existing CNN.
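
For example (a sketch, assuming torchvision is available; the number of classes is an assumption, set it to match the ISIC labels), you could start from an ImageNet-pretrained ResNet-18 and retrain only a new final layer:

import torch.nn as nn
from torchvision import models

# load an ImageNet-pretrained backbone
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# freeze the pretrained weights so only the new head is trained
for param in model.parameters():
    param.requires_grad = False

# replace the classifier head to match the classes of your task
num_classes = 8  # assumed; set to the number of classes in your data
model.fc = nn.Linear(model.fc.in_features, num_classes)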


Exercise 4: U-Net (week 5)
This exercise (A-D) should be handed in as a short report (max 2 A4 pages including figures) containing a description of what you implemented and the results (how did it change the network's performance), alongside the code you used (especially when implementing new features).
Note that you will probably not be able to do all suggestions under A-D.
In the same ISIC challenge, there are segmentation examples; a U-Net allows you to segment these.
A: In CNNs.py, complete the code for a U-Net (e.g. using the paper provided in the link, although contrary to the original paper, you may prefer to use padding instead; see the sketch below) and train the segmentation network by running main_Unet.py.
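
As a starting point (a sketch, not the provided code; it assumes padded 3x3 convolutions, so that the input and output resolution match), one level of the U-Net encoder could be built as:

import torch.nn as nn

def double_conv(in_ch, out_ch):
    # two padded 3x3 convolutions with ReLU, as used at each U-Net level
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

# encoder level: convolve, keep the result for the skip connection,
# then downsample with 2x2 max pooling before the next level:
# skip = double_conv(1, 64)(x); down = nn.MaxPool2d(2)(skip)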
Similar to exercise 3, optimize the network by changing:
B: The network structure. For example, you can add layers, change convolutional filter sizes, change activation functions, or add skip layers. Does this improve the performance?
C: The training parameters and hyperparameters (e.g. number of epochs, learning rate, loss function, optimizer, schedulers).
D: Adapt the data augmentation (you may use the code from the previous exercise).
