Einstein is the AI platform created by Salesforce and launched in fall 2016. It has grown in sophistication since then, using customer data to automatically generate models. The platform continually improves these models by analyzing historical data and identifying which factors best predict the behavior of individual customers. As it receives more data, it learns which of its models need adjusting, without any intervention by developers.
The experience is designed to make analysis and predictions easy to interpret in the context of a business question. If a model fits the analysis and business context, users can deploy it to generate predictions on new data. As part of this deployment, an analytics extension calculation is generated automatically and can be copied into Tableau.
Nabar: That involves working with engineers and data scientists on the product roadmap, planning what we want to build in the next release and helping design the technical architecture for how to develop these platform capabilities. In particular, I'm very close to the automated Einstein machine learning side of the stack, where we build libraries that automatically create the models for your data and use case. I also talk to customers -- these could be internal or external developers -- to get a better understanding of their needs.
Nabar: Two things. One is doing some of this data cleansing in an automated way. For instance, when we see that a particular feature is highly correlated with another, we can automatically detect and remove it, if necessary. Or, if a particular column doesn't have much variance, we can throw it out. Second, we can surface insights about the models we build, a capability called explainability. This could inform the user of things such as, 'These were the most interesting factors in your models.' That can give the end user surprising results that prompt them to dig deeper into their data.
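The two cleansing rules Nabar describes can be sketched in a few lines. This is an illustrative example only (the thresholds, function name, and pandas-based approach are assumptions, not Einstein's internal implementation):

```python
import pandas as pd

def auto_cleanse(df, corr_threshold=0.95, min_variance=1e-8):
    """Illustrative automated cleansing: drop near-constant columns,
    then drop one column from each highly correlated pair."""
    numeric = df.select_dtypes(include="number")
    # Rule 1: a column with almost no variance carries no signal.
    low_var = [c for c in numeric.columns if numeric[c].var() < min_variance]
    df = df.drop(columns=low_var)
    numeric = df.select_dtypes(include="number")
    # Rule 2: of any pair with |correlation| above the threshold,
    # keep the first column and drop the second (redundant) one.
    corr = numeric.corr().abs()
    cols, dropped = list(corr.columns), set()
    for i, a in enumerate(cols):
        for b in cols[i + 1:]:
            if a not in dropped and b not in dropped and corr.loc[a, b] > corr_threshold:
                dropped.add(b)
    return df.drop(columns=list(dropped))
```

For example, a DataFrame with a constant column and a column that is an exact multiple of another would come back with both of those removed, leaving only the informative features.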
While many ML solutions deal with processing a handful of very large datasets (a big data problem), our main use case is a small data problem: processing many thousands of small- to medium-sized datasets and generating a model for each one.
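The many-small-models workload can be sketched as follows. The function names and the use of a simple least-squares fit are hypothetical stand-ins; the point is one independent model per dataset rather than one model over a single huge dataset:

```python
import numpy as np

def fit_small_model(X, y):
    """Fit a simple least-squares model to one small dataset."""
    # Append an intercept column and solve via lstsq.
    Xb = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return coef  # [intercept, weight_1, weight_2, ...]

def fit_per_dataset(datasets):
    """Train one independent model per dataset -- thousands of
    small fits instead of a single big-data fit."""
    return {name: fit_small_model(X, y) for name, (X, y) in datasets.items()}
```

Because each fit is independent, the loop parallelizes trivially across datasets, which suits this workload better than distributed training of one giant model.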
Einstein Discovery conducts statistical checks to confirm that its models are valid, and generates answers, explanations, and recommendations in a form that business users can easily understand, without needing a data scientist on staff. Watch a demo of how Einstein Discovery can be used to identify trends and improve outcomes for drug rehabilitation programs.
With Genie, Einstein AI and Flow automation services can harness the power of hyperscale real-time data to enable more dynamic and responsive actions and engagement. Einstein, which generates over 175 billion predictions every day, can now deliver personalization and predictions based on real-time data. Flow automation, which saves customers over 100 billion hours every month, can now use real-time data to trigger actions automatically.
2. Add a BigQuery sink to your pipeline in order to receive the streaming events. Notice that the BigQuery table is created automatically once the pipeline executes and the first change record is generated.