Apple now offers an application programming interface for neural networks

Apple, which has traditionally kept its artificial intelligence research under wraps, is now allowing developers to build neural networks by calling on the company's simple API [application programming interface]. Such networks, which attempt to imitate the way a human brain works, are particularly effective at predicting events when they have a large database of prior examples to draw on.

In information technology, a neural network is defined as a system of programs and data structures that approximates the operation of the human brain. A neural network usually involves a large number of processors operating in parallel, each with its own small sphere of knowledge and access to data in its local memory.

Typically, a neural network is initially "trained" or fed large amounts of data and rules about data relationships (for example, "A grandfather is older than a person's father"). A program can then tell the network how to behave in response to an external stimulus (for example, to input from a computer user who is interacting with the network) or can initiate activity on its own (within the limits of its access to the external world).
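The training process described above can be sketched with the smallest possible network: a single artificial neuron (a perceptron) that learns a rule from labelled examples. This is an illustrative toy, not Apple's API; the example data, learning rate, and update rule are all assumptions made for the sketch.

```python
# A minimal sketch of "training" a neural network: one artificial
# neuron (a perceptron) learns the logical AND function from examples.
# All names and parameters here are illustrative, not Apple's API.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is positive."""
    return 1 if x > 0 else 0

def predict(weights, bias, inputs):
    """Forward pass: weighted sum of the inputs plus a bias, then activation."""
    total = sum(w * i for w, i in zip(weights, inputs))
    return step(total + bias)

def train(examples, epochs=20, lr=0.1):
    """Perceptron rule: nudge the weights toward each labelled example."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            error = target - predict(weights, bias, inputs)
            weights = [w + lr * error * i for w, i in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# The "large amounts of data and rules about data relationships":
# every labelled example of logical AND.
examples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(examples)
print([predict(weights, bias, x) for x, _ in examples])  # → [0, 0, 0, 1]
```

After training, the network responds correctly to each input it was fed, which is the behaviour the "external stimulus" step above relies on.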

Developers working with Apple's neural networks, called Basic Neural Network Subroutines, won't be able to train them on their own data. Instead, Apple has "pre-trained" them for certain tasks, and judging from the documentation the API seems focused on image recognition, according to Popular Science. The API will run on macOS, iOS, tvOS, and watchOS, and is optimized for each device's central processing unit.
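Because the networks ship pre-trained, the developer's code only runs the forward (inference) pass over weights that Apple supplies. The sketch below illustrates that division of labour in plain Python; the weights, labels, and function names are invented for the example and are not the actual BNNS calls (which are C functions in Apple's Accelerate framework).

```python
# Illustrative only: inference with weights that are fixed ("pre-trained")
# rather than learned by the caller. The labels and numbers are invented;
# they stand in for parameters a vendor would ship with the API.

# Pre-trained parameters: one (weights, bias) pair per class.
PRETRAINED = {
    "cat": ([0.9, 0.1, -0.4], 0.0),
    "dog": ([0.2, 0.8, -0.1], 0.1),
}

def score(weights, bias, features):
    """Forward pass: the only computation the caller performs."""
    return sum(w * f for w, f in zip(weights, features)) + bias

def classify(features):
    """Pick the label whose pre-trained weights score the input highest."""
    return max(PRETRAINED, key=lambda label: score(*PRETRAINED[label], features))

features = [1.0, 0.2, 0.3]  # stand-in for features extracted from an image
print(classify(features))   # → cat
```

The caller never touches the training step: it supplies an input and reads back a prediction, which matches the image-recognition focus the documentation suggests.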