Network Embedding

AN Vect-LLM

  • Large Language Model (LLM) embeddings represent words and sentences as vectors (points in a high-dimensional space).

  • These embeddings enable a computer to "understand" natural language and to notice relationships and similarities between words (see the sketch below).
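
As an illustration of the idea, the short sketch below embeds a few sentences and compares them with cosine similarity. It uses a generic open-source sentence-embedding model as a stand-in; the actual model behind AN Vect-LLM is not specified here.

```python
# Minimal sketch: turning sentences into embedding vectors and comparing them.
# The model name below is a common open-source stand-in, not the AN Vect-LLM model.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # illustrative model choice

sentences = [
    "The weather is nice today.",
    "It is a lovely sunny day.",
    "Stock prices fell sharply this morning.",
]
vectors = model.encode(sentences)  # one vector (point in the embedding space) per sentence

def cosine_similarity(a, b):
    """Similarity of two embedding vectors: close to 1.0 means very similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically similar sentences end up close together in the embedding space.
print(cosine_similarity(vectors[0], vectors[1]))  # high: both describe nice weather
print(cosine_similarity(vectors[0], vectors[2]))  # low: unrelated topics
```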

AN Ranking

  • Embeddings can also be generated for other types of data, not only natural language.

  • AN DeepNet produces a comparable set of vectors related to trust, based on factors such as statement accuracy, socio-economic similarity, and social media interactions (such as emoji usage or tone of speech); see the sketch below.
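
To make the idea concrete, the sketch below packs a handful of trust-related factors into a simple vector and compares two users by distance. The factor names and scaling are illustrative assumptions; AN DeepNet itself is a learned model whose inputs and architecture are not shown here.

```python
# Minimal sketch of a trust-related embedding, using hand-picked illustrative
# factors. The real AN DeepNet learns these representations; it is not a
# hand-built vector like this one.
import numpy as np

def trust_embedding(statement_accuracy, socio_economic_similarity,
                    emoji_positivity, tone_score):
    """Pack trust-related factors (each scaled to 0..1) into a single vector."""
    return np.array([statement_accuracy, socio_economic_similarity,
                     emoji_positivity, tone_score])

alice = trust_embedding(0.92, 0.70, 0.85, 0.80)
bob = trust_embedding(0.35, 0.65, 0.40, 0.55)

# Users whose trust vectors are close behave similarly on these factors.
distance = np.linalg.norm(alice - bob)
print(f"trust-space distance between alice and bob: {distance:.3f}")
```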

AN XX-Like

  • Each machine learning pipeline in the overall analysis uses its own set of embeddings, making it easier to spot trends and patterns.

  • These embeddings specifically target the quick-response ‘like’ and ‘emoji’ interactions from the AN #SocAuth mobile app (see the sketch below).
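
The sketch below shows one simple way such an embedding could be formed: a normalized count vector of a user's quick reactions. The emoji vocabulary and helper names are hypothetical and do not reflect the actual AN #SocAuth schema.

```python
# Minimal sketch: embedding quick-response interactions (likes / emojis) as a
# normalized count vector per user. The reaction vocabulary is an assumption,
# not the actual AN #SocAuth reaction set.
import numpy as np

EMOJI_VOCAB = ["like", "heart", "laugh", "angry", "sad"]  # assumed reaction set

def interaction_embedding(reactions):
    """reactions: list of reaction names a user sent, e.g. ["like", "heart"]."""
    counts = np.array([reactions.count(e) for e in EMOJI_VOCAB], dtype=float)
    total = counts.sum()
    # Return the relative reaction profile so heavy and light users are comparable.
    return counts / total if total > 0 else counts

user_a = interaction_embedding(["like", "like", "heart", "laugh"])
user_b = interaction_embedding(["angry", "sad", "like"])
print(user_a)
print(user_b)
```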

AN Socio-Network

  • A socio-network refers to a person’s social network, focusing on that individual and their immediate connections. Imagine yourself at the center, with the people you interact with directly forming a surrounding circle; these direct connections are your ego network. It is a way to study the relationships that directly involve you (your closest friends, family, and acquaintances) in a smaller, more personal scope, without considering the larger picture of their own connections.

  • Homophily is a fancy term that means "birds of a feather flock together." In simpler words, it describes the tendency of people to be friends or associate with others who share similarities. This similarity could be in terms of interests, hobbies, background, beliefs, or other characteristics. So, when you see people hanging out with others who share similar likes or traits, that's an example of homophily in action.

  • The AN socio-network of trust rankings, built from groups identified using the homophily principle, creates another embedding space that we call Blau Spaces (described in more detail later in this document); see the sketch below.
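
The sketch below illustrates the ideas above on a toy graph: extracting an ego network with networkx and computing a simple shared-attribute homophily score. The node attributes and the homophily measure are illustrative only; the construction of Blau Spaces is covered later in this document.

```python
# Minimal sketch: ego network extraction and a simple homophily measure.
# Graph, attributes, and the "shared interest" score are toy examples, not the
# AN socio-network pipeline.
import networkx as nx

G = nx.Graph()
G.add_nodes_from([
    ("you",   {"interest": "music"}),
    ("amy",   {"interest": "music"}),
    ("ben",   {"interest": "sports"}),
    ("carla", {"interest": "music"}),
    ("dave",  {"interest": "sports"}),  # not directly connected to "you"
])
G.add_edges_from([("you", "amy"), ("you", "ben"), ("you", "carla"), ("ben", "dave")])

# Ego network: you, your direct connections, and the edges among them.
ego = nx.ego_graph(G, "you")
print("ego network nodes:", sorted(ego.nodes()))

# Simple homophily measure: fraction of direct connections sharing your interest.
neighbors = list(G.neighbors("you"))
same = sum(G.nodes[n]["interest"] == G.nodes["you"]["interest"] for n in neighbors)
print(f"homophily (shared interest): {same / len(neighbors):.2f}")
```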
