Oracle Cloud Infrastructure Data Flow

Oracle Cloud Infrastructure Data Flow is a fully managed big data service that lets you run Apache Spark applications with no infrastructure to deploy or manage. It lets you deliver big data and AI applications faster because you can focus on your applications without getting distracted by operations.
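As an illustration, a run of an existing Data Flow application can be submitted through the service API as well as the console. Below is a minimal sketch assuming the OCI Python SDK's oci.data_flow.DataFlowClient; the OCIDs and display name are placeholders.

import oci

# Load the default profile from ~/.oci/config and build a Data Flow client.
config = oci.config.from_file()
client = oci.data_flow.DataFlowClient(config)

# Submit a run of an existing Data Flow application (placeholder OCIDs).
run = client.create_run(
    oci.data_flow.models.CreateRunDetails(
        compartment_id="ocid1.compartment.oc1..example",
        application_id="ocid1.dataflowapplication.oc1..example",
        display_name="example-spark-run",
    )
).data
print(run.id, run.lifecycle_state)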

Product Features

No Infrastructure to Manage

Now you can focus on making a great Spark application. With Oracle Cloud Infrastructure Data Flow, there is never anything to install, patch, or upgrade. The service handles infrastructure provisioning, network setup, storage, and security, as well as the teardown when the Spark jobs are complete.

Out-of-the-Box Security

Leverage the security built into Oracle Cloud Infrastructure. Authentication, authorization, encryption, and isolation are handled by the service out of the box, so you can protect your business-critical workloads without extra configuration.

Consolidated Insight

Oracle Cloud Infrastructure Data Flow makes it easy to see what all your Spark users are doing by consolidating operational information into a single, searchable, sortable UI. Want to know which job from last week cost the most? It just takes a few clicks to see which job it was—and who ran it.
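For the same kind of operational view through the API, the sketch below lists recent runs in a compartment, again assuming the OCI Python SDK; the compartment OCID is a placeholder, and the sort parameters and summary fields shown reflect our reading of the SDK rather than a definitive reference.

import oci

client = oci.data_flow.DataFlowClient(oci.config.from_file())

# List recent runs in a compartment (placeholder OCID), newest first.
runs = client.list_runs(
    compartment_id="ocid1.compartment.oc1..example",
    sort_by="timeCreated",
    sort_order="DESC",
    limit=25,
).data

# Print when each run started, what it was, and who ran it.
for r in runs:
    print(r.time_created, r.display_name, r.owner_user_name, r.lifecycle_state)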

Managed Output

Oracle Cloud Infrastructure Data Flow enables you to bring analytics to the people who need them. The service automatically and securely captures and stores your job's output. Access the output through the UI or REST APIs. Need to know the output of an SQL query you ran last week? It's just one click or API call away.
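As a sketch of the API route, the snippet below lists the files the service captured for a past run and downloads one of them, assuming the OCI Python SDK's list_run_logs and get_run_log operations; the run OCID is a placeholder and the available file names depend on the run.

import oci

client = oci.data_flow.DataFlowClient(oci.config.from_file())
run_id = "ocid1.dataflowrun.oc1..example"  # placeholder run OCID

# Enumerate the output and log files the service stored for this run.
logs = client.list_run_logs(run_id).data
for summary in logs:
    print(summary.name)

# Download the first captured file; the body comes back as a binary stream.
if logs:
    content = client.get_run_log(run_id, logs[0].name).data.content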

Simple Debugging and Diagnostics

Tracking down the logs and tools you need to troubleshoot a Spark job can take hours. Oracle Cloud Infrastructure Data Flow consolidates everything into one place, so you can quickly find the information you need. From the Spark UI to the Spark history server and log output, everything is just one click away.