Complex algorithms may require mathematical, scientific, or other specialized expertise; yet many classes of computation are simple patterns of business-process logic. These patterns have to be discovered and prioritized during domain analysis, but most engineers can become comfortable with a complex business domain during development. Much of the challenge in these computations lies in selecting frameworks to support standard patterns of event processing and microservice choreography and orchestration, and in packaging computations to support DevOps goals such as scalability, security, fault tolerance, and geographic distribution, with cost/benefit tradeoffs among many options.
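To make the choreography pattern concrete, here is a minimal sketch in Python: services react to published events rather than being sequenced by a central orchestrator. The in-memory event bus, topic names, and handlers are all hypothetical stand-ins for a real broker such as Kafka or SNS.

```python
from collections import defaultdict

class EventBus:
    """In-memory stand-in for a message broker (e.g. Kafka, SNS)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
audit_log = []

# Choreography: each service owns its own reaction to an event;
# no central controller dictates the sequence.
bus.subscribe("order.placed", lambda o: bus.publish("payment.requested", o))
bus.subscribe("payment.requested", lambda o: audit_log.append(("charged", o["id"])))

bus.publish("order.placed", {"id": 42, "total": 19.99})
```

An orchestrated version of the same flow would instead put the `order.placed` → `payment.requested` sequence inside one controlling workflow, which is the model the orchestration frameworks discussed below provide.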
We have experience with many of the newest serverless technologies, including functions, managed containers, Kubernetes, and orchestration platforms such as AWS Step Functions, Netflix Conductor, and Azure App Service. Many of the newest and most challenging computations are now available as serverless offerings. For example, pre-fabricated machine intelligence in the domains of speech, vision, and natural language can be embedded into a custom business process with no investment in infrastructure or deep software development; Azure Cognitive Services provides such capabilities, as do comparable services on AWS and Google Cloud Platform. Machine-learning pipelines can be constructed from high-level services such as AWS SageMaker, which provides a serverless environment where completely custom ML workflows can be deployed using tools like TensorFlow and Keras.
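As a sketch of what an orchestrated workflow looks like in practice, the fragment below builds a minimal Amazon States Language definition of the kind AWS Step Functions executes. The workflow name, Lambda ARNs, and account ID are placeholders; a real deployment would reference actual function ARNs and attach an IAM role.

```python
import json

# Hypothetical two-step order-processing workflow in Amazon States Language.
state_machine = {
    "Comment": "Hypothetical order-processing workflow",
    "StartAt": "ValidateOrder",
    "States": {
        "ValidateOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:ValidateOrder",
            # Declarative fault tolerance: the orchestrator retries for us.
            "Retry": [{"ErrorEquals": ["States.TaskFailed"], "MaxAttempts": 2}],
            "Next": "ChargePayment",
        },
        "ChargePayment": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:ChargePayment",
            "End": True,
        },
    },
}

# Step Functions accepts the definition as a JSON document.
definition_json = json.dumps(state_machine)
```

Note that concerns such as retries, sequencing, and error handling live in the declarative definition rather than in application code, which is much of the appeal of these orchestrators.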
An increasingly important architectural option is to move computations to an "edge" device or service. Running computations on the client device rather than at the "core" has for years been an important option and scalability tool. However, cloud vendors now offer options that allow computations (and data) to be located at the cloud edge, close to the client, in an environment more powerful and more secure than the client device itself, and more economical than a (more distant) core.
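A short sketch of such an edge computation, in the style of a Lambda@Edge viewer-request handler: the function runs at the CDN edge and can rewrite a request before it ever reaches the origin. The event follows CloudFront's documented record shape; the routing logic itself is purely illustrative.

```python
def handler(event, context):
    """Viewer-request handler running at the edge, close to the client."""
    request = event["Records"][0]["cf"]["request"]
    headers = request.get("headers", {})
    user_agent = headers.get("user-agent", [{"value": ""}])[0]["value"]
    # Illustrative edge decision: route mobile clients to a lighter
    # origin path without consuming core (origin) capacity.
    if "Mobile" in user_agent:
        request["uri"] = "/mobile" + request["uri"]
    return request

# Local invocation with a CloudFront-shaped test event.
event = {
    "Records": [{
        "cf": {
            "request": {
                "uri": "/index.html",
                "headers": {"user-agent": [{"value": "Mozilla/5.0 (iPhone) Mobile"}]},
            }
        }
    }]
}
result = handler(event, None)
```

Because the handler is a plain function over a request object, it can be unit-tested locally exactly as shown, even though it is deployed to run at edge locations.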
These options must be weighed carefully in the design stage of an application, as they deeply impact the frameworks, programming languages, and DevOps tooling needed to manage development and deployment.