The Business And Technology Network

Helping Business Interpret And Use Technology

Published in B&T Latest News 28 March, 2024 by The bizandtech.net Newswire Staff

AutoBNN: Probabilistic time series forecasting with compositional Bayesian neural networks

Posted by Urs Köster, Software Engineer, Google Research

Time series problems are ubiquitous, from forecasting weather and traffic patterns to understanding economic trends. Bayesian approaches start with an assumption about the data’s patterns (a prior probability), collect evidence (e.g., new time series data), and continuously update that assumption to form a posterior probability distribution. Traditional Bayesian approaches like Gaussian processes (GPs) and Structural Time Series are extensively used for modeling time series data, e.g., the commonly used Mauna Loa CO2 dataset. However, they often rely on domain experts to painstakingly select appropriate model components and may be computationally expensive. Alternatives such as neural networks lack interpretability, making it difficult to understand how they generate forecasts, and don’t produce reliable confidence intervals.

To that end, we introduce AutoBNN, a new open-source package written in JAX. AutoBNN automates the discovery of interpretable time series forecasting models, provides high-quality uncertainty estimates, and scales effectively for use on large datasets. We describe how AutoBNN combines the interpretability of traditional probabilistic approaches with the scalability and flexibility of neural networks.

AutoBNN

AutoBNN is based on a line of research that over the past decade has yielded improved predictive accuracy by modeling time series using GPs with learned kernel structures. The kernel function of a GP encodes assumptions about the function being modeled, such as the presence of trends, periodicity or noise. With learned GP kernels, the kernel function is defined compositionally: it is either a base kernel (such as Linear, Quadratic, Periodic, Matérn or ExponentiatedQuadratic) or a composite that combines two or more kernel functions using operators such as Addition, Multiplication, or ChangePoint. This compositional kernel structure serves two related purposes. First, it is simple enough that a user who is an expert about their data, but not necessarily about GPs, can construct a reasonable prior for their time series. Second, techniques like Sequential Monte Carlo can be used for discrete searches over small structures and can output interpretable results.
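
To make the compositional structure concrete, here is a minimal, illustrative sketch of how base kernels and operators could be expressed in Python. It is not AutoBNN or GP-library code; the function names and hyperparameters are placeholders chosen for this example.

import numpy as np

# Base kernels: each maps a pair of inputs to a covariance value.
def linear(x1, x2, bias=1.0):
    return bias + x1 * x2

def periodic(x1, x2, period=1.0, lengthscale=1.0):
    return np.exp(-2.0 * np.sin(np.pi * np.abs(x1 - x2) / period) ** 2 / lengthscale ** 2)

def exponentiated_quadratic(x1, x2, lengthscale=1.0):
    return np.exp(-0.5 * (x1 - x2) ** 2 / lengthscale ** 2)

# Operators: a composite kernel is itself a kernel, so composites can be nested.
def add(k1, k2):
    return lambda x1, x2: k1(x1, x2) + k2(x1, x2)

def multiply(k1, k2):
    return lambda x1, x2: k1(x1, x2) * k2(x1, x2)

# e.g., a linear trend plus a locally periodic component:
kernel = add(linear, multiply(periodic, exponentiated_quadratic))

A user who knows their series has, say, a trend and a yearly cycle can write down such a structure directly, without reasoning about GP internals.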

AutoBNN improves upon these ideas, replacing the GP with Bayesian neural networks (BNNs) while retaining the compositional kernel structure. A BNN is a neural network with a probability distribution over weights rather than a fixed set of weights. This induces a distribution over outputs, capturing uncertainty in the predictions. BNNs bring the following advantages over GPs: First, training large GPs is computationally expensive, and traditional training algorithms scale as the cube of the number of data points in the time series. In contrast, for a fixed width, training a BNN will often be approximately linear in the number of data points. Second, BNNs lend themselves better to GPU and TPU hardware acceleration than GP training operations. Third, compositional BNNs can be easily combined with traditional deep BNNs, which have the ability to do feature discovery. One could imagine “hybrid” architectures, in which users specify a top-level structure of Add(Linear, Periodic, Deep), and the deep BNN is left to learn the contributions from potentially high-dimensional covariate information.
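
Using the estimator interface shown at the end of this post, such a hybrid might be sketched roughly as follows. The Add operator and the Linear and Periodic kernel classes appear in the example later in this post; the class name for the deep component (OneLayerBNN) is an assumption here, not something this post confirms.

import autobnn as ab

# Hypothetical hybrid: Add(Linear, Periodic, Deep). The deep component would
# be left to learn contributions from covariate information.
hybrid = ab.operators.Add(
    bnns=(ab.kernels.LinearBNN(width=50),
          ab.kernels.PeriodicBNN(width=50),
          ab.kernels.OneLayerBNN(width=50)))  # assumed class name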

How, then, might one translate a GP with a compositional kernel into a BNN? A single-layer neural network will typically converge to a GP as the number of neurons (or “width”) goes to infinity. More recently, researchers have discovered a correspondence in the other direction: many popular GP kernels (such as Matérn, ExponentiatedQuadratic, Polynomial or Periodic) can be obtained as infinite-width BNNs with appropriately chosen activation functions and weight distributions. Furthermore, these BNNs remain close to the corresponding GP even when the width is far from infinite. For example, the figures below show the difference in the covariance between pairs of observations, and regression results of the true GPs and their corresponding width-10 neural network versions.

Comparison of Gram matrices between true GP kernels (top row) and their width 10 neural network approximations (bottom row).

Comparison of regression results between true GP kernels (top row) and their width 10 neural network approximations (bottom row).
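
A toy version of this comparison can be reproduced with random Fourier features, which give a finite-width, single-layer approximation to the ExponentiatedQuadratic (RBF) kernel. The sketch below is illustrative JAX code, not part of AutoBNN:

import jax
import jax.numpy as jnp

def rbf_gram(x, lengthscale=1.0):
    # Exact ExponentiatedQuadratic (RBF) Gram matrix for 1-D inputs.
    d = x[:, None] - x[None, :]
    return jnp.exp(-0.5 * d ** 2 / lengthscale ** 2)

def random_feature_gram(x, width=10, lengthscale=1.0, seed=0):
    # Finite-width approximation: one layer of cosine units with Gaussian
    # input weights and uniform phases (random Fourier features).
    kw, kb = jax.random.split(jax.random.PRNGKey(seed))
    w = jax.random.normal(kw, (width,)) / lengthscale
    b = jax.random.uniform(kb, (width,), minval=0.0, maxval=2.0 * jnp.pi)
    phi = jnp.sqrt(2.0 / width) * jnp.cos(x[:, None] * w[None, :] + b[None, :])
    return phi @ phi.T

x = jnp.linspace(-3.0, 3.0, 50)
# The gap between the two Gram matrices shrinks as the width grows.
print(jnp.max(jnp.abs(rbf_gram(x) - random_feature_gram(x, width=10))))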

Finally, the translation is completed with BNN analogues of the Addition and Multiplication operators over GPs, and input warping to produce periodic kernels. BNN addition is straightforwardly given by adding the outputs of the component BNNs. BNN multiplication is achieved by multiplying the activations of the hidden layers of the BNNs and then applying a shared dense layer. We are therefore limited to only multiplying BNNs with the same hidden width.
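
In pure JAX, the two operators reduce to roughly the following sketch (illustrative only; it is not the library’s implementation, and the parameter handling is simplified):

import jax
import jax.numpy as jnp

def hidden(params, x):
    # Hidden layer of a component BNN; params['w'] and params['b'] are
    # weights sampled from their prior (or posterior) distributions.
    return jax.nn.relu(x @ params['w'] + params['b'])

def add_bnns(p1, r1, p2, r2, x):
    # Addition: sum the outputs of the two component BNNs.
    return hidden(p1, x) @ r1 + hidden(p2, x) @ r2

def multiply_bnns(p1, p2, shared_readout, x):
    # Multiplication: multiply hidden activations elementwise (both components
    # must therefore share the same hidden width), then apply one shared
    # dense readout layer.
    return (hidden(p1, x) * hidden(p2, x)) @ shared_readout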

Using AutoBNN

The AutoBNN package is available within Tensorflow Probability. It is implemented in JAX and uses the flax.linen neural network library. It implements all of the base kernels and operators discussed so far (Linear, Quadratic, Matern, ExponentiatedQuadratic, Periodic, Addition, Multiplication) plus one new kernel and three new operators:

  • a OneLayer kernel, a single hidden layer ReLU BNN,
  • a ChangePoint operator that allows smoothly switching between two kernels,
  • a LearnableChangePoint operator which is the same as ChangePoint except position and slope are given prior distributions and can be learnt from the data, and
  • a WeightedSum operator.

WeightedSum combines two or more BNNs using learnable mixing weights that follow a Dirichlet prior. By default, a flat Dirichlet distribution with concentration 1.0 is used.
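
Conceptually, a WeightedSum behaves like the sketch below; in AutoBNN the mixing weights are inferred along with the other parameters rather than drawn once, so this is only an illustration of the prior.

import jax
import jax.numpy as jnp

def weighted_sum(key, bnn_outputs, concentration=1.0):
    # bnn_outputs: list of arrays, one per component BNN, all the same shape.
    # Mixing weights follow a flat Dirichlet prior (concentration 1.0).
    alpha = jnp.full((len(bnn_outputs),), concentration)
    weights = jax.random.dirichlet(key, alpha)
    return jnp.tensordot(weights, jnp.stack(bnn_outputs), axes=1)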

WeightedSums allow a “soft” version of structure discovery, i.e., training a linear combination of many possible models at once. In contrast to structure discovery with discrete structures, such as in AutoGP, this allows us to use standard gradient methods to learn structures, rather than using expensive discrete optimization. Instead of evaluating potential combinatorial structures in series, WeightedSum allows us to evaluate them in parallel.

To easily enable exploration, AutoBNN defines a number of model structures that contain either top-level or internal WeightedSums. The names of these models can be used as the first parameter in any of the estimator constructors, and include things like sum_of_stumps (the WeightedSum over all the base kernels) and sum_of_shallow (which adds all possible combinations of base kernels with all operators).
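
For example, a named structure can be passed where the final code example below passes an explicit model; the likelihood name and other arguments here simply mirror that example:

import jax
import autobnn as ab

# 'sum_of_stumps' is the WeightedSum over all base kernels described above.
estimator = ab.estimators.AutoBnnMapEstimator(
    'sum_of_stumps', 'normal_likelihood_logistic_noise',
    jax.random.PRNGKey(0), periods=[12])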

Illustration of the sum_of_stumps model. The bars in the top row show the amount by which each base kernel contributes, and the bottom row shows the function represented by the base kernel. The resulting weighted sum is shown on the right.

The figure below demonstrates the technique of structure discovery on the N374 (a time series of yearly financial data starting from 1949) from the M3 dataset. The six base structures were ExponentiatedQuadratic (which is the same as the Radial Basis Function kernel, or RBF for short), Matern, Linear, Quadratic, OneLayer and Periodic kernels. The figure shows the MAP estimates of their weights over an ensemble of 32 particles. All of the high likelihood particles gave a large weight to the Periodic component, low weights to Linear, Quadratic and OneLayer, and a large weight to either RBF or Matern.

Parallel coordinates plot of the MAP estimates of the base kernel weights over 32 particles. The sum_of_stumps model was trained on the N374 series from the M3 dataset (inset in blue). Darker lines correspond to particles with higher likelihoods.

By using WeightedSums as the inputs to other operators, it is possible to express rich combinatorial structures, while keeping models compact and the number of learnable weights small. As an example, we include the sum_of_products model (illustrated in the figure below) which first creates a pairwise product of two WeightedSums, and then a sum of the two products. By setting some of the weights to zero, we can create many different discrete structures. The total number of possible structures in this model is 2^16, since there are 16 base kernels that can be turned on or off. All these structures are explored implicitly by training just this one model.

Illustration of the “sum_of_products” model. Each of the four WeightedSums have the same structure as the “sum_of_stumps” model.

We have found, however, that certain combinations of kernels (e.g., the product of Periodic and either the Matern or ExponentiatedQuadratic) lead to overfitting on many datasets. To prevent this, we have defined model classes like sum_of_safe_shallow that exclude such products when performing structure discovery with WeightedSums.

For training, AutoBNN provides AutoBnnMapEstimator and AutoBnnMCMCEstimator to perform MAP and MCMC inference, respectively. Either estimator can be combined with any of the six likelihood functions, including four based on normal distributions with different noise characteristics for continuous data and two based on the negative binomial distribution for count data.

Result from running AutoBNN on the Mauna Loa CO2 dataset in our example colab. The model captures the trend and seasonal component in the data. Extrapolating into the future, the mean prediction slightly underestimates the actual trend, while the 95% confidence interval gradually increases.

To fit a model like the one in the figure above, all it takes is the following few lines of code, using the scikit-learn–inspired estimator interface:

import jax
import autobnn as ab

model = ab.operators.Add(
    bnns=(ab.kernels.PeriodicBNN(width=50),
          ab.kernels.LinearBNN(width=50),
          ab.kernels.MaternBNN(width=50)))

estimator = ab.estimators.AutoBnnMapEstimator(
    model, 'normal_likelihood_logistic_noise', jax.random.PRNGKey(42),
    periods=[12])

estimator.fit(my_training_data_xs, my_training_data_ys)
low, mid, high = estimator.predict_quantiles(my_training_data_xs)

Conclusion

AutoBNN provides a powerful and flexible framework for building sophisticated time series prediction models. By combining the strengths of BNNs and GPs with compositional kernels, AutoBNN opens a world of possibilities for understanding and forecasting complex data. We invite the community to try the colab, and leverage this library to innovate and solve real-world challenges.

Acknowledgements

AutoBNN was written by Colin Carroll, Thomas Colthurst, Urs Köster and Srinivas Vasudevan. We would like to thank Kevin Murphy, Brian Patton and Feras Saad for their advice and feedback.

How Embedded Payments Help Businesses Own Key ‘Micro Moments’

In life, and in commerce, it is often the little things that matter most. And as digitization changes the world, those little things are getting both smaller and more important — particularly when it comes to payments.

This, as leveraging embedded finance is emerging as a key pillar of payments innovation strategies for merchants looking to turn the benefits of a differentiated, end-to-end customer experience into a revenue driver.

By embedding financial services directly into their offerings, businesses and merchants can seamlessly integrate payment capabilities, lending solutions and other financial products into key touchpoints from first contact to final purchase.

The benefit comes from the fact that the merchant journey is full of micro moments that influence and sway the ultimate purchase decision of their customers but are inherently unique to each individual transaction pathway. Embedded finance solutions are the furthest thing from one size fits all, and that is what enables them to be so powerful and effective.

By adapting offerings to end-user payment preferences, including personalizations around payment currency, payment method and other factors, merchants can turn payments — which may have previously been viewed as a cost center, or something to be managed — into an opportunity to generate more revenue and establish repeat business.

After all, convenience and personalization have been instrumental in driving payments innovation, and their impact when driving sustainable business results can’t be overstated.

Read more: Retailers’ Embedded Finance Bets Pay Off

Owning the Payment Process Can Be a Differentiator for Merchants  

Providing a diverse range of payment methods via embedded solutions can help businesses stand out in an increasingly crowded and competitive landscape.

As Tom Randklev, global head of product at CellPoint Digital, told PYMNTS, “There is a lot of success in the SMB market using [embedded finance] to satisfy their customers and present this as an innovation … the whole ecosystem is inherently managed, which creates efficiencies and a delightful customer experience.”

Capabilities like digital wallets, account-to-account (A2A) bank payments and pre- and post-purchase financing solutions like buy now, pay later (BNPL) are all ways to provide future-fit benefits that drive conversion and inspire repeat purchases.

“Integrated payments have helped increase operational efficiencies, have reduced costs — and have changed the narrative for merchants,” Gigi Beyene, senior vice president of integrated payments at Nuvei, told PYMNTS.

Additionally, by eliminating the need for traditional intermediaries, embedded finance solutions enable merchants and businesses to empower their customers to make purchases without leaving a website, mobile app or other digital channel.

This attractively positions the merchant relative to peer businesses whose payment solutions may entail a fork in the road, where the end user needs to leave the commerce experience to perform a separate task, such as entering their payment details. Against the backdrop of contemporary behavioral expectations, these little frictions can add up to lost sales.

“Any financial product or service that has traditionally involved a departure from the user experience, or an additional set of activities, is there for the taking,” Luke Latham, general manager of Australia and New Zealand at Airwallex, told PYMNTS.

See also: How PayFacs Use Payments to Reshape the Digital Economy

The Business Impact of Payments Innovation

Every business runs on payments, and embedded finance empowers merchants to optimize their operations and streamline financial processes.

Research in “The Embedded Finance Ecosystem: Logistics and Wholesale Trade Edition,” a PYMNTS Intelligence and Carat from Fiserv collaboration, finds that a majority of marketplaces (57%) are “highly interested” in further innovating their existing digital wallet offerings.

And by consolidating payment processing, reconciliation, and reporting within a single platform, merchants can simplify back-office operations and reduce administrative overhead. This not only improves efficiency but also enhances visibility into financial performance, enabling merchants to make data-driven decisions that drive business growth.

Whether it’s enabling one-click checkout options, offering flexible financing solutions, or providing personalized recommendations based on transaction history, embedded finance allows merchants to deliver a tailored experience that resonates with their target audience.

“We are seeing an erosion of physical accounts into virtual ones that not only reduce the total cost of ownership … but also allow the unleashing of data and analytics that help personalize pricing, credit decision and marketing offers, and all kinds of recommendations,” Michael Haney, head of product strategy at Galileo Financial Technologies, told PYMNTS.

Visa and Alaan Partner on Expense Management in Middle East

Visa and Alaan have partnered to provide digital expense management solutions to businesses in the United Arab Emirates (UAE) and Saudi Arabia.

The companies’ five-year strategic alliance brings together the VisaNet global payment processing network and Alaan’s artificial intelligence (AI) spend management system, the companies said in a Wednesday (March 27) press release.

“Together, we are not just offering a spend management solution; we are transforming how businesses in the Middle East manage their finances,” Parthi Duraisamy, CEO at Alaan, said in the release.

Alaan’s platform is used by more than 500 mid-market and enterprise organizations to control their business spend, according to the release. Launched in 2022 and based in the Dubai International Financial Center (DIFC), Alaan plans to expand operations in other countries in the Middle East and North Africa (MENA) region in 2024.

The new partnership with Visa will enable Alaan to enhance the platform’s capabilities with real-time tracking, automated reconciliations and streamlined expense management, the release said.

Alaan will also leverage Visa’s global network to provide seamless international payment capabilities to the businesses using its platform, per the release.

The two companies will also cooperate on joint marketing initiatives to accelerate the adoption of spend management solutions in the Middle East, as well as support the economic growth and digital transformation of the region, according to the release.

“At Visa, we are committed to fostering strategic partnerships that enhance and secure the region’s payments ecosystem and deliver innovative corporate solutions that support businesses in their digital transformation,” Saeeda Jaffar, senior vice president and group country manager for GCC (Gulf Cooperation Council) at Visa, said in the release.

Alaan announced its launch of a business cash-back card in August 2022, saying it was the first such card to be launched in the UAE. The company also touted its ability to help businesses do away with expense reports, manual bookkeeping tasks and a need for petty cash.

“Consumers have long had access to such cards in UAE, but that has not been the case for SMEs [small and medium-sized enterprises] and corporates,” Duraisamy said at the time.

Microsoft’s new safety system can catch hallucinations in its customers’ AI apps

Sarah Bird, Microsoft’s chief product officer of responsible AI, tells The Verge in an interview that her team has designed several new safety features that will be easy to use for Azure customers who aren’t hiring groups of red teamers to test the AI services they built. Microsoft says these LLM-powered tools can detect potential vulnerabilities, monitor for hallucinations “that are plausible yet unsupported,” and block malicious prompts in real time for Azure AI customers working with any model hosted on the platform.

“We know that customers don’t all have deep expertise in prompt injection attacks or hateful content, so the evaluation system generates the prompts needed to simulate these types of attacks. Customers can then get a…

Rooster Teeth to launch final Red vs Blue season as a movie

Rooster Teeth’s Red vs. Blue will return with one final season as a full-length movie, called Restoration, which launches on May 7.

Cathie Wood: Sales of Coinbase shares not ‘dumping’ but active portfolio management

Ark Invest had offloaded 74,291 Coinbase shares worth $20.8 million two days before a Thursday Q&A session.

MineOS aims to illuminate the AI ‘black box’ for enterprise

MineOS enters the high-stakes race for enterprise AI governance with a new module designed to peer inside the black box of AI systems and provide the visibility needed for oversight and control.

Apple’s OLED iPad Pro will reportedly arrive in May

A 12.9-inch iPad Pro (2022) mounted on an Apple smart keyboard, sitting on a desk, with a person tapping on its screen with an Apple Pencil.
Photo by Dan Seifert / The Verge

Apple plans to release a new lineup of iPad Pros with OLED displays in early May, according to a report from Bloomberg’s Mark Gurman. The company is also reportedly planning to launch an iPad Air with a larger 12.9-inch display for the first time.

According to Gurman’s sources, the new iPad Pro models will feature Apple’s in-house M3 chip, along with a revamped Magic Keyboard with a bigger trackpad. The iPad Air, on the other hand, is rumored to come with the last-generation M2 chip and two display sizes: the standard 10.9-inch option and a larger 12.9-inch one. The current iPad Pro models use M2 chips, while the Air has an M1.

It’s been nearly two years since Apple released an updated iPad. Bloomberg says that, though the company…
