Data Science
BCI, short for Brain-Computer Interface, is a technology that enables direct communication between the brain and external devices, such as computers or prosthetic limbs. It allows individuals to control and interact with devices using their brain signals, making it valuable for various applications, including medical and assistive technologies. |
Big Data: A term that describes the large volume of data, both structured and unstructured, that inundates a business on a day-to-day basis. Big data can be analyzed for insights that lead to better decisions and strategic business moves.
Common big data technologies include Hadoop, Hive, Pig, Apache HBase, Cassandra, MapReduce (a programming model rather than a platform), and Spark.
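MapReduce, the one method in the list above, splits a computation into a map phase, a shuffle phase, and a reduce phase. A minimal single-machine sketch of the classic word-count example (the phase functions are illustrative, not any framework's API):

```python
from collections import defaultdict
from functools import reduce

def map_phase(document):
    # Map: emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in document.split()]

def shuffle_phase(pairs):
    # Shuffle: group all emitted counts by key (the word).
    grouped = defaultdict(list)
    for word, count in pairs:
        grouped[word].append(count)
    return grouped

def reduce_phase(grouped):
    # Reduce: sum the counts collected for each word.
    return {word: reduce(lambda a, b: a + b, counts)
            for word, counts in grouped.items()}

documents = ["big data tools", "big data platforms"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle_phase(pairs))
print(counts["big"])  # 2
```

In a real cluster, the map and reduce phases run in parallel across many machines and the shuffle moves data between them; the logic per key is the same.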
BigML is a machine learning platform and service that allows users to create, evaluate, and deploy machine learning models for a wide range of applications. It provides tools and resources for data preprocessing, model building, and predictive analytics, making it easier for organizations to leverage machine learning in their processes and decision-making. |
Caffe is a deep learning framework best suited to image classification and segmentation tasks where speed, modularity, and expressiveness are important. Caffe scales from academic projects to industrial deployments, as it can process more than 60 million images per day.
ChatGPT is a language model developed by OpenAI, based on the GPT-3.5 architecture. It is designed for natural language understanding and generation, making it capable of engaging in human-like text-based conversations and providing responses in a wide range of contexts. |
Classification: The process of predicting the class of given data points. Classes are sometimes called targets, labels, or categories. Classification predictive modeling is the task of approximating a mapping function (f) from input variables (X) to discrete output variables (y).
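A minimal sketch of such a mapping is a one-nearest-neighbor classifier: the predicted label for a point is the label of its closest training example (the toy data below is invented for illustration):

```python
def nearest_neighbor_classify(point, training_data):
    # Predict the label of `point` as the label of the closest
    # training example, by squared Euclidean distance.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(training_data, key=lambda pair: dist(pair[0], point))
    return label

# Toy training set: (input variables X, discrete output variable y).
training = [((1.0, 1.0), "small"), ((1.2, 0.8), "small"),
            ((8.0, 9.0), "large"), ((9.0, 8.5), "large")]
print(nearest_neighbor_classify((1.1, 0.9), training))  # small
print(nearest_neighbor_classify((8.5, 9.0), training))  # large
```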
Clustering: A machine learning technique that involves grouping data points. Given a set of data points, a clustering algorithm can be used to assign each data point to a specific group.
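One classic clustering algorithm, k-means (Lloyd's algorithm), can be sketched in plain Python: assign each point to its nearest centroid, move each centroid to the mean of its assigned points, and repeat (the sample points are invented):

```python
import random

def kmeans(points, k, iterations=20, seed=0):
    # Lloyd's algorithm: alternate between assigning points to the
    # nearest centroid and moving each centroid to its cluster's mean.
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        for j, cluster in enumerate(clusters):
            if cluster:  # keep the old centroid if a cluster is empty
                centroids[j] = tuple(sum(c) / len(cluster)
                                     for c in zip(*cluster))
    return centroids, clusters

# Two obvious blobs: one near the origin, one near (5, 5).
points = [(0.0, 0.1), (0.2, 0.0), (0.1, 0.2),
          (5.0, 5.1), (5.2, 5.0), (5.1, 5.2)]
centroids, clusters = kmeans(points, k=2)
print(sorted(len(c) for c in clusters))  # [3, 3]
```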
The Microsoft Cognitive Toolkit (formerly CNTK) is a back-end framework used for deep learning. |
Composite AI refers to the practice of combining multiple artificial intelligence technologies, such as machine learning, natural language processing, computer vision, and others, to create more comprehensive and capable AI systems that can perform a wide range of tasks and make more sophisticated decisions. It's about leveraging the strengths of different AI components to build more intelligent and versatile applications. |
Computer vision is a field of computer science that works on enabling computers to see, identify, and process images in the same way that human vision does, and then provide appropriate output.
CoreNLP: A Java (or at least JVM-based) annotation pipeline framework that provides most of the common core natural language processing steps, from tokenization through to coreference resolution.
Convolutional neural network (CNN, or ConvNet) is a class of deep neural networks, most commonly applied to analyzing visual imagery. They have applications in image and video recognition, recommender systems, image classification, medical image analysis, and natural language processing. |
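The operation that gives CNNs their name can be sketched in pure Python: slide a small kernel over the image and take the sum of elementwise products at every position ("valid" mode, stride 1, as a minimal illustration; the example image and kernel are invented):

```python
def conv2d(image, kernel):
    # Slide the kernel over the image and compute the sum of
    # elementwise products at each position (valid mode, stride 1).
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [[sum(image[i + di][j + dj] * kernel[di][dj]
                 for di in range(kh) for dj in range(kw))
             for j in range(out_w)]
            for i in range(out_h)]

# A vertical-edge detector applied to an image with a vertical edge.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
print(conv2d(image, kernel))  # [[0, 2, 0], [0, 2, 0]]
```

The output responds strongly (value 2) exactly where the edge lies; a CNN learns many such kernels instead of hand-crafting them.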
Data analysis: A process of analyzing data that uses analytical and statistical tools to examine each component of the data provided and discover useful information.
Data extraction: The process of retrieving data from data sources for further data processing or storage.
Data-Driven Customer Experience is an approach that uses data and insights to personalize and enhance the interactions between businesses and their customers. It involves collecting and analyzing data about consumer behavior and preferences to tailor products, services, and marketing efforts to meet individual needs and deliver a more satisfying and relevant customer experience. |
Data mining: The computational process of discovering patterns in large data sets in order to extract information and transform it into an understandable structure for further use. Data mining is used for developing and improving marketing strategies, increasing sales, and decreasing costs.
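A minimal sketch of one classic data-mining task is frequent-pair mining over purchase records: count how often each pair of items is bought together and keep the pairs above a support threshold (the transactions below are invented):

```python
from collections import Counter
from itertools import combinations

def frequent_pairs(transactions, min_support):
    # Count every unordered pair of items bought together, then keep
    # the pairs that appear at least `min_support` times.
    counts = Counter()
    for items in transactions:
        for pair in combinations(sorted(set(items)), 2):
            counts[pair] += 1
    return {pair: n for pair, n in counts.items() if n >= min_support}

transactions = [["bread", "milk"], ["bread", "milk", "eggs"],
                ["milk", "eggs"], ["bread", "milk"]]
print(frequent_pairs(transactions, min_support=3))
# {('bread', 'milk'): 3}
```

Production systems use smarter algorithms (e.g. Apriori or FP-growth) to avoid counting every pair, but the underlying idea is the same.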
A data model is an abstract model that organizes elements of data and standardizes how they relate to one another and to the properties of real-world entities. |
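As an illustration of organizing data elements and standardizing how they relate to one another, a toy customer/order model sketched with Python dataclasses (the entity and field names are invented for this example):

```python
from dataclasses import dataclass, field

@dataclass
class Customer:
    # An entity modeling the properties of a real-world customer.
    customer_id: int
    name: str

@dataclass
class Order:
    # Each order relates to exactly one customer via customer_id.
    order_id: int
    customer_id: int
    items: list = field(default_factory=list)

alice = Customer(customer_id=1, name="Alice")
order = Order(order_id=100, customer_id=alice.customer_id,
              items=["laptop"])
print(order.customer_id == alice.customer_id)  # True
```

The same structure (entities, attributes, and a foreign-key-style relationship) is what a relational schema or an ER diagram would capture.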
Data modeling is the process of documenting a complex software system design as an easily understood diagram, using text and symbols to represent the way data needs to flow. |
DataRobot is an artificial intelligence and machine learning platform that helps organizations build, deploy, and manage predictive models. It automates various steps of the data science workflow, making it easier for users to develop machine learning models without extensive programming knowledge. DataRobot offers a range of tools for data preprocessing, feature engineering, model selection, and deployment. |
Data science: An interdisciplinary study of information sources, what the information represents, and ways of turning it into a valuable resource when creating business and IT strategies. It uses the methods and techniques of statistics and data analysis to analyze and understand phenomena.
Dataset: A collection of data that is organized into some type of data structure.
Data structure: A specialized format for organizing and storing data that serves as the basis for abstract data types. General data structure types include the array, the file, the record, the table, the tree, and so on.
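One of the structures named above, the tree, sketched in plain Python as a binary search tree (a minimal illustration, not a production implementation):

```python
class TreeNode:
    # A binary search tree node: values smaller than `value` go to
    # the left subtree, larger or equal values go to the right.
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

    def insert(self, value):
        side = "left" if value < self.value else "right"
        child = getattr(self, side)
        if child is None:
            setattr(self, side, TreeNode(value))
        else:
            child.insert(value)

    def in_order(self):
        # In-order traversal yields the stored values in sorted order.
        if self.left:
            yield from self.left.in_order()
        yield self.value
        if self.right:
            yield from self.right.in_order()

root = TreeNode(5)
for v in (3, 8, 1):
    root.insert(v)
print(list(root.in_order()))  # [1, 3, 5, 8]
```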
Data visualization is the graphical representation of information and data. By using visual elements like charts, graphs, and maps, data visualization tools provide an accessible way to see and understand trends, outliers, and patterns in data. |
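The idea of visually encoding values can be sketched even without a plotting library, as a text bar chart; in practice one would reach for a library such as matplotlib (the sales figures here are invented):

```python
def text_bar_chart(data, width=20):
    # Scale each value to a bar of '#' characters, relative to the
    # largest value, and print one labeled bar per entry.
    peak = max(data.values())
    lines = []
    for label, value in data.items():
        bar = "#" * round(width * value / peak)
        lines.append(f"{label:<8}{bar} {value}")
    return "\n".join(lines)

sales = {"Q1": 120, "Q2": 180, "Q3": 90, "Q4": 200}
print(text_bar_chart(sales))
```

Even this crude chart makes the Q4 peak and Q3 dip visible at a glance, which is the point of visualization.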
Data warehouse: A system that pulls together data from many different sources within an organization for reporting and analysis. Data warehouses are used for online analytical processing (OLAP), which uses complex queries to analyze data rather than process transactions.