A comprehensive NLP series: From counting words to human-level text understanding
  • Online Event, Beirut
  • Saturday 9 May 2020, 03:00 pm
One of the most essential forms of human intelligence is the ability to communicate and comprehend, and language understanding is a fundamental component of that ability. It is often said that AI will be solved once machines reach a human-level understanding of natural language. Are you interested in seeing how far we’ve come in tackling Natural Language Processing (NLP) with Machine Learning?

Zaka is introducing a comprehensive NLP guide composed of a series of 4 ONLINE hands-on workshops where you can understand: (1) the basics of traditional NLP, (2) how text is represented by machines, (3) advanced neural network architectures that evolved the field of NLP, and (4) the novel state-of-the-art Transformer models that are bringing machines closer than ever to human-level understanding.

Register for the full workshop series for a complete walkthrough of NLP's history and the latest approaches to this challenging AI field, or participate in a specific session of your preference! 

Dates: Saturdays, May 9, 16, 23, and 30 
Time: 15h00-18h00 (GMT+3)
Location: ONLINE (via Zoom)

Check out the schedule below for more details on the dates and what will be covered in each session.

P.S. Only a basic programming background is required! These are hands-on coding workshops, so you will need to be on your laptops! 
Attend the 4 workshops and earn a certificate of completion!

Event Outline/Schedule

Intro to NLP: A Machine Learning approach (Saturday, May 9)
- The NLP Roadmap (overview and history)
- Text Preprocessing: Cleaning, Stemming, Lemmatization
- Common NLP tasks
- From words to vectors: Bag-of-Words
- Hands-on coding project
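To give a flavor of what the first session covers, here is a minimal bag-of-words vectorizer in plain Python. The tiny corpus and whitespace tokenization are illustrative only; the session's own project may rely on a library and proper preprocessing.

```python
from collections import Counter

def build_vocab(corpus):
    """Map each unique token to a fixed column index."""
    vocab = {}
    for doc in corpus:
        for token in doc.lower().split():
            vocab.setdefault(token, len(vocab))
    return vocab

def bag_of_words(doc, vocab):
    """Count how often each vocabulary word appears in the document."""
    counts = Counter(doc.lower().split())
    return [counts.get(token, 0) for token in vocab]

corpus = ["the cat sat", "the dog sat on the mat"]
vocab = build_vocab(corpus)
vectors = [bag_of_words(doc, vocab) for doc in corpus]
```

Each document becomes a fixed-length count vector over the shared vocabulary, which is exactly the representation that classic ML classifiers consume.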

Intro to Word Embeddings (Saturday, May 16)
- Word2Vec
- GloVe
- Visualizing the vector space
- The Embedding Layer
- RNNs for Text Classification
- Hands-on coding project
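As a preview of the embeddings session: once words are vectors, "similar meaning" becomes "nearby in vector space", usually measured with cosine similarity. The 3-d vectors below are invented for illustration; real Word2Vec or GloVe embeddings are learned from large corpora and have hundreds of dimensions.

```python
import math

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1 = same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Toy hand-made embeddings for illustration only.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
```

Here "king" lands much closer to "queen" than to "apple", which is the geometric intuition behind the vector-space visualizations covered in the session.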

Beyond Word Embeddings: Advanced architectures for NLP (Saturday, May 23)
- Bidirectional Layers
- Sequence-to-Sequence
- Attention
- Intro to Contextualised Word Representation
- Intro to Universal Language Models
- Transfer Learning for NLP
- Hands-on coding throughout the session
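The attention mechanism covered in this session boils down to a small computation: score each encoder state against a decoder query, turn the scores into weights with a softmax, and take a weighted sum of the states. A minimal sketch with toy 2-d states (real models use learned, high-dimensional representations):

```python
import math

def softmax(scores):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(query, keys, values):
    """Dot-product attention: weight each value by how well
    its key matches the query, then average."""
    scores = [sum(q * k for q, k in zip(query, key)) for key in keys]
    weights = softmax(scores)
    context = [
        sum(w * v[i] for w, v in zip(weights, values))
        for i in range(len(values[0]))
    ]
    return weights, context

# Toy encoder states, used as both keys and values.
states = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.0]
weights, context = attend(query, states, states)
```

States that align with the query receive more weight, so the context vector is dominated by the most relevant parts of the input sequence.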

NLP in the age of Transformers and Sesame Street (Saturday, May 30)
- Self-Attention
- The Transformer network
- Deep dive into BERT
- Hands-on coding: fine-tuning BERT for text classification
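At the heart of the Transformer is self-attention: every token attends to every other token in the same sequence. The sketch below shows the scaled dot-product computation in plain Python, with the simplifying assumption that queries, keys, and values are the input vectors themselves; a real Transformer adds learned projections, multiple heads, and positional encodings, and the BERT fine-tuning project itself would use a deep learning library.

```python
import math

def softmax(scores):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(x):
    """Scaled dot-product self-attention over a sequence x of
    d-dimensional token vectors (no learned projections)."""
    d = len(x[0])
    out = []
    for query in x:
        scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
                  for key in x]
        weights = softmax(scores)
        out.append([sum(w * v[i] for w, v in zip(weights, x))
                    for i in range(d)])
    return out

tokens = [[1.0, 0.0], [0.0, 1.0]]
mixed = self_attention(tokens)
```

Each output vector is a mixture of the whole sequence, weighted toward the tokens most similar to it, which is what lets Transformers build contextual representations in parallel rather than step by step like an RNN.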


Organized by Zaka