
[id] cs-230-convolutional-neural-networks #155

Merged
merged 3 commits into shervinea:master on Oct 12, 2019

Conversation

gitarja
Contributor

@gitarja gitarja commented May 31, 2019

No description provided.

@shervinea
Owner

Thank you for your work @gitarja! Now, let's start the review process.

@shervinea shervinea added the reviewer wanted Looking for a reviewer label Jun 3, 2019
@shervinea
Owner

Hi @gitarja, would you know someone who could review your work?

@gitarja
Contributor Author

gitarja commented Sep 15, 2019

Sorry for the late reply. I have just found a reviewer; he will start reviewing my work next week.

Thank you

@shervinea
Owner

That sounds great, thanks for the feedback @gitarja!

<br>


**3. [Overview, Architecture structure]**


[Ikhtisar, Struktur arsitektur]

id/convolutional-neural-networks.md (outdated; resolved)

**4. [Types of layer, Convolution, Pooling, Fully connected]**

&#10230;[Jenis-jenis layer, Covolution, Pooling, Fully connected]


[Jenis-jenis layer, Konvolusi, Penyatuan, Sepenuhnya terhubung]

Contributor Author

This is also a concept name; only the "Types of layer" part needs to be changed.


Is "Fully connected" also a concept name?

Contributor Author

Yes, that one too.


**5. [Filter hyperparameters, Dimensions, Stride, Padding]**

&#10230;[Hyperparameters filter, Dimensi, Stride, Padding]


[Penyaring Hiperparameter, Dimensi, Langkah, Pengisi]


**9. [Face verification/recognition, One shot learning, Siamese network, Triplet loss]**

&#10230;[Verifikasi/rekognisi wajah, One shot learning, Siamese network, Loss triplet]


[Pengenal Wajah, One Shot Learning, Jaringan Siamese, Hilangnya Tiga Serangkai]

Contributor Author

Is it okay if "triplet loss" is left untranslated? It is also the name of a technique.


That's fine, but it needs to be italicized since it is a foreign term.

Contributor Author

Italics are fine.

<br>


**6. [Tuning hyperparameters, Parameter compatibility, Model complexity, Receptive field]**


[Penyetelan hiperparameter, Kesesuaian parameter, model kerumitan, bidang reseptif]

id/convolutional-neural-networks.md (outdated; resolved)
<br>


**8. [Object detection, Types of models, Detection, Intersection over Union, Non-max suppression, YOLO, R-CNN]**


[Deteksi objek, Tipe-tipe model, Deteksi, Persinggungan atas Penggabungan , Non-maks penekanan, YOLO, R-CNN]

Contributor Author

Can "Intersection" be translated as "interseksi"?


What is "interseksi"? Is it the same as "interaksi" (interaction)?

Contributor Author

It is like two regions meeting; the overlapping middle part is the "interseksi".

id/convolutional-neural-networks.md (outdated; resolved)

**11. [Computational trick architectures, Generative Adversarial Net, ResNet, Inception Network]**

&#10230;[Arkitektur trik komputasi, Generative Adversarial Net, ResNet, Inception Network]


[Arsitektur trik komputasional, Generative Adversarial Net, ResNet, Inception Network]


**12. Overview**

&#10230;Overview


Ringkasan


**23. Filter hyperparameters**

&#10230;Hyperparameters filter

@GunawanTri GunawanTri Sep 29, 2019


Filter hiperparameter


**24. The convolution layer contains filters for which it is important to know the meaning behind its hyperparameters.**

&#10230;Layer convolutional memuat filter yang mana adalah penting untuk mengerti tentang maksud dari hyperparameter filter tersebut.


Lapisan konvolusi mengandung penyaring yang penting untuk dimengerti tentang maksud dari penyaring hiperparameter tersebut.


**26. Filter**

&#10230;Filter


Penyaring

Contributor Author

This is also a technical term; let's just italicize it.


**27. Remark: the application of K filters of size F×F results in an output feature map of size O×O×K.**

&#10230;Perlu diperhatikan: aplikasi dari K filter dengan ukuran FxF menhasilkan sebuah keluaran feature map dengan ukuran O×O×K.


Catatan: pengaplikasian dari penyaring F dengan ukuran FxF menghasilkan sebuah keluaran fitur peta dengan ukuran O×O×K.


**34. [Input, Filter, Output]**

&#10230;[Masukan, Filter, Keluaran]


[Masukan, Penyaring, Keluaran]


**35. Remark: often times, Pstart=Pend≜P, in which case we can replace Pstart+Pend by 2P in the formula above.**

&#10230;Perlu diperhatikan: sering, Pstart=Pend≜P, yang mana pada kasus tersebut kita dapat mengganti Pstart+Pend dengan 2P pada formula diatas.


Catatan: kerap kali, Pstart=Pend≜P, pada kasus tersebut kita dapat mengganti Pstart+Pend dengan 2P pada formula di atas.


**38. [One bias parameter per filter, In most cases, S<F, A common choice for K is 2C]**

&#10230;[Satu parameter bias untuk setiap filter, Pada banyak kasus, S<F, Sebuah pilihan yang umum untuk K berinali 2C]

@GunawanTri GunawanTri Oct 4, 2019


[Satu parameter bias per filter, Pada banyak kasus, S<F, sebuah pilihan umum untuk K adalah 2C]


**41. Receptive field ― The receptive field at layer k is the area denoted Rk×Rk of the input that each pixel of the k-th activation map can 'see'. By calling Fj the filter size of layer j and Si the stride value of layer i and with the convention S0=1, the receptive field at layer k can be computed with the formula:**

&#10230;Receptive field - Receptive field pada layer k adalah area yang dinotasikan RkxRk dari input yang setiap pixel dari k-th activation map dapat 'melihat'. Dengan menulasikan Fj sebagai ukuran filter dari layer j dan Si sebagai nilai stride pada layer i dan dengan konvensi S0=1, receptive field pada layer K dapat dihitung dengan formula berikut:


Receptive field ― Receptive field pada layer k adalah area yang dinotasikan RkxRk dari masukan yang setiap pixel dari k-th activation map dapat "melihat". Dengan menyebut Fj (sebagai) ukuran penyaring dari lapisan j dan Si (sebagai) nilai stride dari lapisan i dan dengan konvensi S0=1, receptive field pada lapisan k dapat dihitung dengan formula:


**40. [Input is flattened, One bias parameter per neuron, The number of FC neurons is free of structural constraints]**

&#10230;[Input diratakan(menjadi 1D), Satu parameter bias untuk setiap neuron, Jumlah dari neuron FC adalah bebas dari batasan struktural.]


[Masukan diratakan, satu parameter bias untuk setiap neuron, Jumlah dari neuron FC adalah terbebas dari batasan struktural.]


**39. [Pooling operation done channel-wise, In most cases, S=F]**

&#10230;[Operasi pooling dan dilakukan channel-wise, Pada banyak kasus, S=F]


[Operasi Pooling (yang) dilakukan (dengan) channel-wise, Pada banyak kasus, S=F]


**42. In the example below, we have F1=F2=3 and S1=S2=1, which gives R2=1+2⋅1+2⋅1=5.**

&#10230;Pada contoh dibawah ini, kita memiliki F1=f2=3 dan S1=S2=1, yang menghasilkan R2=1+2⋅1+2⋅1=5.


Pada contoh di bawah ini, kita memiliki F1=F2=3 dan S1=S2=1, yang menghasilkan R2=1+2⋅1+2⋅1=5.
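
For readers checking the numbers in items 41-42, here is a minimal sketch (not part of this PR; the function name is illustrative) of the receptive-field formula Rk = 1 + Σj (Fj−1)·Π(i<j) Si, with S0=1:

```python
def receptive_field(filter_sizes, strides):
    """filter_sizes[j-1] = F_j and strides[j-1] = S_j for layers 1..k."""
    r = 1
    jump = 1  # running product of strides S_0 * ... * S_{j-1}, starting at S_0 = 1
    for f, s in zip(filter_sizes, strides):
        r += (f - 1) * jump
        jump *= s
    return r

# Item 42's example: F1=F2=3 and S1=S2=1 give R2 = 1 + 2*1 + 2*1 = 5.
print(receptive_field([3, 3], [1, 1]))  # → 5
```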


**43. Commonly used activation functions**

&#10230;Fungsi-fungsi aktifasi yang biasa dipakai


Fungsi-fungsi aktivasi yang biasa dipakai


**44. Rectified Linear Unit ― The rectified linear unit layer (ReLU) is an activation function g that is used on all elements of the volume. It aims at introducing non-linearities to the network. Its variants are summarized in the table below:**

&#10230;Rectified Linear Unit - Layer rectified linear unit (ReLU) adalah sebuah fungsi aktifasi g yang digunakan pada seluruh elemen. Penggunaan ReLU adalah untuk memasukan non-linearitas ke network. Variasi-variasi dari ReLU dirangkum pada tebel dibawah ini:


Rectified Linear Unit ― Lapisan Rectified Linear Unit (ReLU) adalah sebuah fungsi aktivasi g yang digunakan pada seluruh elemen volume. Unit ini bertujuan untuk menempatkan non-linearitas pada jaringan. Variasi-variasi ReLU ini dirangkum pada tabel di bawah ini:
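
As a quick illustration of the element-wise activation discussed in items 44-46 (a sketch, not from the PR; leaky ReLU is assumed here as one of the variants the table refers to):

```python
def relu(x):
    # Standard ReLU: zero for negative inputs, identity otherwise.
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Small slope alpha for negative inputs, addressing the dying-ReLU issue.
    return x if x > 0 else alpha * x

print(relu(-2.0), leaky_relu(-2.0))  # → 0.0 -0.02
```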


**48. where**

&#10230;Dimana


Di mana

@gitarja
Contributor Author

gitarja commented Oct 10, 2019

@shervinea
Hello, could you change the label? We have been reviewing this one.

Thank you

@shervinea shervinea removed the reviewer wanted Looking for a reviewer label Oct 11, 2019
@shervinea
Owner

Hi @gitarja, thanks for your message. I just removed the reviewer wanted label. Please let me know when you have pushed all the new commits related to the review and are ready to merge!


**26. Filter**

&#10230;Filter


Filter


**34. [Input, Filter, Output]**

&#10230;[Masukan, Filter, Keluaran]


[Masukan, Filter, Keluaran]


**46. [Non-linearity complexities biologically interpretable, Addresses dying ReLU issue for negative values, Differentiable everywhere]**

&#10230;[Kompleksitas non-linearitas yang dapat diinterpretasikan secara biologi, Menangani permasalahan dying ReLU yang terjadi untuk nilai negatif, Dapat diturunkan]


[Kompleksitas non-linearitas yang dapat ditafsirkan secara biologi, Menangani permasalahan dying ReLU yang bernilai negatif, Yang dapat dibedakan di mana pun]


**47. Softmax ― The softmax step can be seen as a generalized logistic function that takes as input a vector of scores x∈Rn and outputs a vector of output probability p∈Rn through a softmax function at the end of the architecture. It is defined as follows:**

&#10230;Softmax - Langkah softmax dapat dilihat sebagai fungsi logistik yang digeneralisasi yang mengambil masukan sebuah vektor x∈Rn dan mengeluarkan sebuah probabilitas vektor p∈Rn melalui sebuah fungsi softmax pada akhir arsitektur network. Softmax didefinisikan sebagai berikut:


Softmax ― Langkah softmax dapat dilihat sebagai sebuah fungsi logistik umum yang berperan sebagai masukan dari nilai skor vektor x∈Rn dan mengeluarkan probabilitas produk vektor p∈Rn melalui sebuah fungsi softmax pada akhir dari jaringan arsitektur. Softmax didefinisikan sebagai berikut:
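
The softmax definition referenced in item 47, p_i = exp(x_i) / Σj exp(x_j), can be sketched as follows (not part of the PR; the max-subtraction trick is an assumption for numerical stability, not in the cheatsheet text):

```python
import math

def softmax(scores):
    # Subtracting the max does not change the result but avoids overflow.
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]
    total = sum(exps)
    return [e / total for e in exps]

p = softmax([1.0, 2.0, 3.0])
print(sum(p))  # the outputs form a probability vector summing to 1
```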


**4. [Types of layer, Convolution, Pooling, Fully connected]**

&#10230;[Jenis-jenis layer, Covolution, Pooling, Fully connected]


[Jenis-jenis layer, Konvolusi, Pooling, Fully connected]


**5. [Filter hyperparameters, Dimensions, Stride, Padding]**

&#10230;[Hyperparameters filter, Dimensi, Stride, Padding]


[Hyperparameters filter, Dimensi, Stride, Padding]

<br>




**9. [Face verification/recognition, One shot learning, Siamese network, Triplet loss]**

&#10230;[Verifikasi/rekognisi wajah, One shot learning, Siamese network, Loss triplet]


[Pengenal Wajah, One Shot Learning, Jaringan Siamese, Triplet loss]

<br>


**20. [Max pooling, Average pooling, Each pooling operation selects the maximum value of the current view, Each pooling operation averages the values of the current view]**


[Max pooling, Average pooling, Setiap operasi pooling mewakili nilai maksimal dari tampilan terbaru, setiap operasi pooling meratakan nilai-nilai dari tampilan terbaru]
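
The two pooling operations described in item 20 (applied channel-wise per item 39) can be sketched over a single F×F view as follows (a sketch, not from the PR; the helper names are illustrative):

```python
def max_pool(view):
    # Select the maximum value of the current view.
    return max(max(row) for row in view)

def avg_pool(view):
    # Average the values of the current view.
    flat = [v for row in view for v in row]
    return sum(flat) / len(flat)

view = [[1, 3], [2, 4]]  # one 2x2 view of the current channel
print(max_pool(view), avg_pool(view))  # → 4 2.5
```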


**22. Fully Connected (FC) ― The fully connected layer (FC) operates on a flattened input where each input is connected to all neurons. If present, FC layers are usually found towards the end of CNN architectures and can be used to optimize objectives such as class scores.**

&#10230;Fully Connected (FC) - Fully connected layer (FC) menangani sebuah masukan dijadikan 1D dimana setiap elemen masukan terkoneksi keseluruh neuron. Layer FC biasanya ditemukan pada akhir dari arsitektur CNN dan dapat digunakan untuk mengoptimisasi objektif seperti skor kelas (pada kasus klasifikasi).


Fully Connected (FC) - Lapisan Fully connected mengoperasikan sebuah masukan di mana setiap masukan terhubung ke seluruh neuron. Bila ada, lapisan-lapisan FC biasanya ditemukan pada akhir arsitektur CNN dan dapat digunakan untuk mengoptimalkan hasil seperti skor-skor kelas.


**33. Parameter compatibility in convolution layer ― By noting I the length of the input volume size, F the length of the filter, P the amount of zero padding, S the stride, then the output size O of the feature map along that dimension is given by:**

&#10230;Kompabilitas hyperparameter pada layer konvolusion - Dengan menuliskan I sebagai panjang dari ukuran volume masukan, F sebagai panjang dari filter, P sebagai jumlah zero padding, S sebagai stride, maka ukuran keluaran O dari feature map pada dimensi tersebut dituliskan sebagai:


Kompabilitas parameter pada lapisan konvolusi - Dengan menuliskan I sebagai panjang dari ukuran volume masukan, F sebagai panjang dari filter, P sebagai jumlah dari zero padding, S sebagai stride, maka ukuran keluaran O dari feature map pada dimensi tersebut ditandai dengan:
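
For item 33's formula, O = (I − F + Pstart + Pend) / S + 1 along one spatial dimension, a minimal sketch (not part of this PR; the function name is illustrative):

```python
def conv_output_size(i, f, s, p_start, p_end):
    # I: input length, F: filter length, S: stride, P: zero padding per side.
    return (i - f + p_start + p_end) // s + 1

# E.g. a 32-wide input with a 5-wide filter, stride 1, and padding 2 on
# each side keeps its spatial size: (32 - 5 + 2 + 2) // 1 + 1 = 32.
print(conv_output_size(32, 5, 1, 2, 2))  # → 32
```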


**50. Types of models ― There are 3 main types of object recognition algorithms, for which the nature of what is predicted is different. They are described in the table below:**

&#10230;Tipe-tipe model - Ada tiga tipe utama dari algoritma rekognisi objek, yang mana berbeda pada hal yang diprediksi. Tipe-tipe tersebut dijelaskan pada tabel dibawah ini:

@GunawanTri GunawanTri Oct 11, 2019


Tipe-tipe model - Ada tiga tipe utama dari algoritma rekognisi objek, yang mana hakikat yang diprediksi tersebut berbeda. Tipe-tipe tersebut dijelaskan pada tabel di bawah ini:

Revise the translation according to the comments from the reviewer.
@gitarja
Contributor Author

gitarja commented Oct 12, 2019

Morning @shervinea
I have pushed a new commit that incorporates the reviewer's feedback. I think we can merge this one.

@gitarja
Contributor Author

gitarja commented Oct 12, 2019

@GunawanTri
This one can be merged now.
Let's move on to the Deep Learning and Linear Algebra ones.

@shervinea
Owner

Thank you @gitarja and @GunawanTri for all your work! It is very appreciated.

@shervinea shervinea merged commit d03d1c2 into shervinea:master Oct 12, 2019
@GunawanTri

@gitarja what does "merged" mean?

@gitarja
Contributor Author

gitarja commented Oct 14, 2019

@gitarja what does "merged" mean?

It means it was combined with the maintainer's work above. In short, this article is done.

@gitarja
Contributor Author

gitarja commented Nov 2, 2019

@shervinea Sorry, could you reopen this one?
I need to add some revisions.

@shervinea
Owner

Hi @gitarja, thanks for reaching out. Please feel free to open a new PR for any additional revision.

@shervinea shervinea changed the title [id] Convolutional Neural Nets [id] cs-230-convolutional-neural-networks Oct 6, 2020