Learner Reviews & Feedback for Natural Language Processing with Classification and Vector Spaces by DeepLearning.AI
Top reviews
HA
Aug 9, 2020
One of the best courses I have taken from DeepLearning.AI. The last week's assignment was so good to solve; it covered everything we studied in the entire course. Waiting eagerly for course 4 of the NLP specialization.
PP
Jan 10, 2024
Started off great, but I feel like the more advanced material could have been better explained. Regarding the exercises, I felt the labs often gave too much information, which made them all too easy.
601–625 of 912 Reviews for Natural Language Processing with Classification and Vector Spaces
By beomseok l
•Jan 8, 2024
Great!
By WLSC
•Mar 1, 2023
great!
By Thành H Đ T
•Oct 15, 2021
thanks
By Prateek S P
•Jan 17, 2021
thanks
By Jeff D
•Nov 8, 2020
Thanks
By Rafael C F d A
•Sep 29, 2020
Great!
By Kamlesh C
•Aug 30, 2020
Thanks
By Qamar A
•Aug 6, 2020
Cool!!
By ilham k
•Aug 16, 2023
Good
By Mahesh
•Apr 17, 2023
fghrt
By Hemchand C
•Mar 11, 2023
.....
By B21DCCN436 N Q H
•Feb 14, 2023
Great
By Prins K
•Jul 28, 2021
Great
By 克軒廖
•Feb 5, 2021
Nice!
By Kaustubh K
•Oct 16, 2025
GOOD
By Soham J
•Jun 21, 2025
good
By NamTNPSE173434
•Nov 19, 2024
nice
By Efstathios C
•Jul 16, 2024
Good
By 刘世壮
•Dec 4, 2021
good
By GANNA H
•Aug 5, 2021
good
By Khong D T
•Jan 14, 2025
5*
By Ranjeet K
•Mar 14, 2023
no
By Abhinav S
•May 2, 2022
bk
By Dave J
•Jan 2, 2021
Having previously completed the Deep Learning Specialization, I came to this course with the intention of completing the whole NLP specialization, rather than because I was especially interested in the content of this first course from that specialization.
The Deep Learning Specialization sets a high standard of teaching quality, and I have to say I found this course not quite up to the same standard. It's pretty good, just not as good. The instructors are very knowledgeable; they make the effort to explain each topic clearly, and they do a pretty good job of that.
What I felt could be improved is the context for where each topic fits into the broader picture of both the theory and the current practice of NLP. I was often left wondering: why are we spending time on this particular topic? Is this technique used in current practice, or is it just of didactic or historical interest? Great teachers always have the broader context in mind and make sure that students see how everything fits into the bigger picture and why it is worth studying.
Although techniques were clearly explained, I felt that the underlying concepts were sometimes less well explained. An example is vector representations of words: we were shown the use of vector arithmetic to find analogies, but with little explanation of why this is possible. To me, this was the wrong way around: it makes more sense to first build an understanding of the representations, then introduce the remarkable result that these representations allow finding analogies.
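The vector-arithmetic analogy trick the reviewer mentions can be sketched with toy embeddings. The vectors below are hand-made numbers chosen purely for illustration; real word vectors, like those in the course, are learned from large corpora:

```python
import numpy as np

# Toy 2-d "embeddings" -- invented values for illustration only.
vectors = {
    "man":   np.array([1.0, 0.0]),
    "woman": np.array([1.0, 1.0]),
    "king":  np.array([5.0, 0.0]),
    "queen": np.array([5.0, 1.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Analogy: king - man + woman ≈ ?
target = vectors["king"] - vectors["man"] + vectors["woman"]

# Pick the closest remaining word by cosine similarity.
best = max(
    (w for w in vectors if w not in {"king", "man", "woman"}),
    key=lambda w: cosine(target, vectors[w]),
)
print(best)  # queen
```

The point the review makes still stands: the code shows *that* the arithmetic works on these vectors, not *why* learned embeddings end up with this geometric structure.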
In this course, sentences are represented as a "bag of words". This is processing natural language the way a food processor processes food: chopping it up into a word soup. Since one of the most fundamental aspects of language is its structure, this might seem a hopeless approach. However, it gives surprisingly good results for some simple tasks, such as classifying tweets as having positive or negative sentiment. If you've done course 5 of the Deep Learning Specialization (Sequence Models), this will feel like a step backwards. There's no deep learning in this course. But I signed up for the course knowing that, so I can't criticise it on that basis. I'm taking the view that this course lays the foundations for more advanced and current topics in the subsequent courses in the specialization, and I look forward to getting onto those.
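A minimal sketch of the bag-of-words idea the reviewer describes: word order is discarded, and a tweet is scored purely by how often its words appeared in each class. The tiny corpus and scoring rule here are invented for illustration; the course itself builds fuller logistic-regression and naive Bayes classifiers on the same kind of per-class word counts:

```python
from collections import Counter

# Tiny labeled "corpus" -- made-up examples, just to show the idea.
positive = ["great course loved it", "really great labs"]
negative = ["boring course hated it", "really boring lectures"]

# Bag of words: per-class word frequencies, order thrown away.
pos_counts = Counter(w for s in positive for w in s.split())
neg_counts = Counter(w for s in negative for w in s.split())

def score(tweet):
    """Label a tweet by summing per-class counts of its words."""
    words = tweet.split()
    pos = sum(pos_counts[w] for w in words)
    neg = sum(neg_counts[w] for w in words)
    return "positive" if pos > neg else "negative"

print(score("great lectures"))  # positive
print(score("boring labs"))     # negative
```

Despite ignoring structure entirely, this kind of count-based scoring is often enough for coarse sentiment labels, which is the "surprisingly good results" the review refers to.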
The labs and assignments generally work smoothly. There are a few inconsistencies, and a couple of the hints were a bit misleading, but they are generally OK. It's a bit paint-by-numbers, though: you fill in bits of code within functions rather than working out for yourself how to structure the code.