Commit 4791fc09 by Karsa Zoltán István

pytorch

parent f32ba53d
When used, it behaves along the lines of the traditional Python interpreter, i.e. …
```python
print('Hello Jupyter Notebook')
```
Hello Jupyter Notebook
```python
a = 10
print('a: ', a)
```
a: 10
## Magic
We can also use further built-in commands; I won't go into detail on them here, but I would like to show a few examples:
```python
lsmagic
```
Available line magics:
%alias %alias_magic %autoawait %autocall %automagic %autosave %bookmark %cat %cd %clear %colors %conda %config %connect_info %cp %debug %dhist %dirs %doctest_mode %ed %edit %env %gui %hist %history %killbgscripts %ldir %less %lf %lk %ll %load %load_ext %loadpy %logoff %logon %logstart %logstate %logstop %ls %lsmagic %lx %macro %magic %man %matplotlib %mkdir %more %mv %notebook %page %pastebin %pdb %pdef %pdoc %pfile %pinfo %pinfo2 %pip %popd %pprint %precision %prun %psearch %psource %pushd %pwd %pycat %pylab %qtconsole %quickref %recall %rehashx %reload_ext %rep %rerun %reset %reset_selective %rm %rmdir %run %save %sc %set_env %store %sx %system %tb %time %timeit %unalias %unload_ext %who %who_ls %whos %xdel %xmode
Available cell magics:
%%! %%HTML %%SVG %%bash %%capture %%debug %%file %%html %%javascript %%js %%latex %%markdown %%perl %%prun %%pypy %%python %%python2 %%python3 %%ruby %%script %%sh %%svg %%sx %%system %%time %%timeit %%writefile
Automagic is ON, % prefix IS NOT needed for line magics.
```python
%%HTML
<a href="https://www.vik.bme.hu/">BME VIK Honlap</a>
```
<a href="https://www.vik.bme.hu/">BME VIK Honlap</a>
```latex
%%latex
$\sum_{n=1}^{\infty} \frac{1}{2^{n}} = 1$
```
$\sum_{n=1}^{\infty} \frac{1}{2^{n}} = 1$
## Mandelbrot set
The set of complex numbers $c$ in the plane for which the iteration
$$x_1 = 0$$
$$x_{n+1} = x_n^2 + c$$
remains bounded.
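The membership test can be sketched as an escape-time iteration; a minimal version for a single point (the iteration cap of 100 and the divergence threshold $|x| > 2$ are standard choices, not taken from this notebook):

```python
def mandelbrot_iters(c, max_iter=100):
    """Number of iterations of x_{n+1} = x_n**2 + c before |x| exceeds 2."""
    x = 0j
    for n in range(max_iter):
        x = x * x + c
        if abs(x) > 2:        # once |x| > 2 the sequence is guaranteed to diverge
            return n
    return max_iter           # budget exhausted: c is treated as inside the set

print(mandelbrot_iters(0j))       # 0 is in the set, so the full budget is used
print(mandelbrot_iters(1 + 1j))   # escapes after very few iterations
```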
```python
plt.axis("off")
plt.show()
```
![png](output_11_0.png)
## GPU-accelerated Mandelbrot
Place the `@jit` decorator on the computation functions. Note that the decorator also replaces the Python code with native compiled code; it does not only use the GPU.
```python
plt.show()
```
![png](output_15_0.png)
```python
from numba import jit
import numpy as np
from timeit import default_timer as timer

if __name__=="__main__":
    ...
    print("with GPU:", timer()-start)
```
without GPU: 5.453964335843921
with GPU: 0.08208791585639119
## PyTorch
[link](https://pytorch.org/tutorials/beginner/basics/quickstart_tutorial.html)
```python
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets
from torchvision.transforms import ToTensor, Lambda, Compose
import matplotlib.pyplot as plt
```
```python
# Download training data from open datasets.
training_data = datasets.FashionMNIST(
    root="data",
    train=True,
    download=True,
    transform=ToTensor(),
)

# Download test data from open datasets.
test_data = datasets.FashionMNIST(
    root="data",
    train=False,
    download=True,
    transform=ToTensor(),
)
```
```python
batch_size = 64
# Create data loaders.
train_dataloader = DataLoader(training_data, batch_size=batch_size)
test_dataloader = DataLoader(test_data, batch_size=batch_size)
for X, y in test_dataloader:
    print("Shape of X [N, C, H, W]: ", X.shape)
    print("Shape of y: ", y.shape, y.dtype)
    break
```
Shape of X [N, C, H, W]: torch.Size([64, 1, 28, 28])
Shape of y: torch.Size([64]) torch.int64
_Homework: finish this._
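Where the notebook stops, the linked quickstart continues by defining the network; a sketch following that tutorial's architecture (the layer sizes are the tutorial's, not this notebook's):

```python
import torch
from torch import nn

class NeuralNetwork(nn.Module):
    def __init__(self):
        super().__init__()
        self.flatten = nn.Flatten()
        self.linear_relu_stack = nn.Sequential(
            nn.Linear(28 * 28, 512),
            nn.ReLU(),
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, 10),   # one logit per FashionMNIST class
        )

    def forward(self, x):
        x = self.flatten(x)       # [N, 1, 28, 28] -> [N, 784]
        return self.linear_relu_stack(x)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = NeuralNetwork().to(device)
print(model)
```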
## Classification with TensorFlow
Install TensorFlow; someday it may be provided centrally, so we won't have to download it ourselves:
```python
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential
```
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-1-e2c0ac327373> in <module>
4 import os
5 import PIL
----> 6 import tensorflow as tf
7
8 from tensorflow import keras
ModuleNotFoundError: No module named 'tensorflow'