Word2Vec finds abstract representations for every word, so-called word embeddings.

Keywords: Machine Learning, NLP, Word Embeddings, AI, Text Mining

These are low-dimensional vectors (think of a list of 200 or 300 numbers). Once you have those word vectors, you can do nearly magical math with words! If you take the vectors for King, Man, and Woman, you can calculate King − Man + Woman and you'll get the vector for: Queen!

https://blog.esciencecenter.nl/king-man-woman-king-9a7fd2935a85
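The King − Man + Woman arithmetic can be sketched with plain NumPy. The tiny 4-dimensional vectors below are hypothetical, chosen only to make the mechanics visible (real embeddings have 200 or 300 dimensions learned from text); the idea is to add and subtract vectors, then find the nearest remaining word by cosine similarity:

```python
import numpy as np

# Hypothetical toy embeddings, for illustration only --
# real Word2Vec vectors are learned, not hand-picked.
embeddings = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "man":   np.array([0.1, 0.8, 0.1, 0.1]),
    "woman": np.array([0.1, 0.1, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "apple": np.array([0.0, 0.1, 0.0, 0.9]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, 0.0 means unrelated."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# King - Man + Woman ...
target = embeddings["king"] - embeddings["man"] + embeddings["woman"]

# ... then pick the closest word, excluding the query words themselves.
candidates = {w: v for w, v in embeddings.items()
              if w not in ("king", "man", "woman")}
best = max(candidates, key=lambda w: cosine(target, candidates[w]))
print(best)  # → queen
```

With a real pretrained model the same query is usually run through a library such as gensim, which does exactly this nearest-neighbor search over the full vocabulary.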

How To Think In A Different Language

Advice, Language learning, Target Language, Thinking, Thinking in a different language

Yuhakko 语학子

One of the most noticeable aspects of getting better at a language is starting to notice your thought-process switching to your target language.

This evolution doesn’t come all of a sudden but rather slowly after months of study in said language.

Yet, some people cannot seem to think in their target language despite learning it for years. Why is that?

I believe there are a few reasons behind this, but luckily they can be fixed with just a bit of work on yourself. I recommend starting with the three actions below:

1. Listen to daily topics

Your life is full of daily actions, repeated over and over: from getting your coffee in the morning, to taking the train, to being annoyed at a certain email you get, to being happy about a small action, all the way to getting ready for bed at night.

Most…
