
Getting Started with Hosted Flink: A Complete Guide

Learn how to deploy and run Apache Flink jobs in minutes with hosted Flink. No infrastructure management, just stream processing.

StreamPark Team

Apache Flink has become the gold standard for real-time stream processing, powering applications at companies like Alibaba, Netflix, and Uber. But deploying and managing Flink infrastructure can be a significant undertaking. That’s where hosted Flink comes in.

Hosted Flink, also known as Flink as a Service, provides fully managed Apache Flink clusters without the operational overhead. Instead of provisioning servers, configuring networks, and managing upgrades, you focus on what matters: building streaming applications.

Key benefits of hosted Flink include:

  • Zero infrastructure management — No servers to provision or maintain
  • Automatic scaling — Resources adjust to your workload
  • Built-in monitoring — Metrics and logging out of the box
  • Quick deployment — Go from code to production in minutes

Getting Started with StreamPark Hosting

StreamPark Hosting makes deploying Flink applications straightforward. Here’s how to get started:

Step 1: Create Your Account

Head to portal.streampark.space and sign up. You’ll be provisioned with a StreamPark workspace automatically.

Step 2: Access Your Dashboard

Once registered, log in to app.streampark.space using the credentials you set up. Your StreamPark dashboard provides:

  • Application Management — Deploy and monitor Flink jobs
  • Cluster Overview — View resource utilization
  • Job History — Track past executions and performance

Step 3: Deploy Your Application

StreamPark supports multiple deployment methods:

SQL Jobs

The fastest way to get started is with Flink SQL. Create a new application, select “Flink SQL” as the job type, and write your streaming query:

CREATE TABLE orders (
    order_id STRING,
    product_id STRING,
    amount DECIMAL(10, 2),
    order_time TIMESTAMP(3),
    WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
) WITH (
    'connector' = 'kafka',
    'topic' = 'orders',
    'properties.bootstrap.servers' = 'kafka:9092',
    'scan.startup.mode' = 'earliest-offset',
    'format' = 'json'
);

CREATE TABLE order_stats (
    window_start TIMESTAMP(3),
    window_end TIMESTAMP(3),
    total_orders BIGINT,
    total_amount DECIMAL(10, 2)
) WITH (
    'connector' = 'print'
);

INSERT INTO order_stats
SELECT
    TUMBLE_START(order_time, INTERVAL '1' MINUTE) as window_start,
    TUMBLE_END(order_time, INTERVAL '1' MINUTE) as window_end,
    COUNT(*) as total_orders,
    SUM(amount) as total_amount
FROM orders
GROUP BY TUMBLE(order_time, INTERVAL '1' MINUTE);

JAR Applications

For more complex applications, upload your Flink JAR file (a minimal example job is sketched after these steps):

  1. Navigate to Applications → Add Application
  2. Select Apache Flink as the execution mode
  3. Upload your compiled JAR
  4. Configure parallelism and resource allocation
  5. Click Submit
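
To give a sense of what goes into such a JAR, here is a minimal sketch of a Flink DataStream job in Java. The class name, sample data, and job name are illustrative placeholders; a real job would typically read from a connector such as Kafka.

// Minimal illustrative Flink DataStream job; names and data are placeholders.
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class OrderTaggingJob {
    public static void main(String[] args) throws Exception {
        // Parallelism and resources are set in the StreamPark UI at submission time.
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Illustrative bounded source; a real job would use a Kafka (or other) connector source.
        DataStream<String> orders = env.fromElements("order-1", "order-2", "order-3");

        // Simple transformation: tag each order id.
        DataStream<String> tagged = orders.map(new MapFunction<String, String>() {
            @Override
            public String map(String orderId) {
                return "processed: " + orderId;
            }
        });

        // Print to stdout, which surfaces in the task manager logs.
        tagged.print();

        // The job name appears in the Flink and StreamPark UIs.
        env.execute("order-tagging-job");
    }
}

Package the class into a JAR with your build tool (including any connector dependencies it needs), and it is ready to upload in the steps above.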

Step 4: Monitor Your Application

StreamPark provides real-time visibility into your streaming jobs:

  • Metrics Dashboard — Throughput, latency, backpressure
  • Log Viewer — Application logs with search and filtering
  • Checkpoint History — Track checkpoint durations and sizes (see the sketch after this list)
  • Alerting — Get notified when jobs fail or degrade
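
The checkpoint history only has data if the job itself has checkpointing enabled. For a JAR application that is typically done in code; the sketch below shows one way to configure it, with illustrative (not recommended) interval and timeout values. SQL jobs usually set the equivalent through Flink configuration such as execution.checkpointing.interval.

// Illustrative checkpointing setup for a DataStream job; the values are examples only.
import org.apache.flink.streaming.api.CheckpointingMode;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class CheckpointedJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Take a checkpoint every 60 seconds with exactly-once guarantees.
        env.enableCheckpointing(60_000L, CheckpointingMode.EXACTLY_ONCE);

        // Abort a checkpoint if it has not completed within 2 minutes.
        env.getCheckpointConfig().setCheckpointTimeout(120_000L);

        // Wait at least 30 seconds between the end of one checkpoint and the start of the next.
        env.getCheckpointConfig().setMinPauseBetweenCheckpoints(30_000L);

        // Placeholder pipeline so the job has something to run; replace with real sources and sinks.
        env.fromElements(1, 2, 3).print();

        env.execute("checkpointed-job");
    }
}

Once checkpoints are being taken, their durations and sizes show up in the Checkpoint History view.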

Common Use Cases

Real-Time Analytics

Process clickstream data, aggregate metrics, and power real-time dashboards.

Event-Driven Architectures

React to events as they happen with exactly-once processing guarantees.

ETL Pipelines

Transform and load data in real-time instead of batch.

Why Hosted Flink?

Managing Flink yourself means handling cluster provisioning, high availability, state backends, checkpoint storage, security, upgrades, and scaling.

With hosted Flink, all of this is handled for you. You write streaming applications; we run them reliably.

Next Steps

Ready to deploy your first Flink application?

  1. Sign up for StreamPark Hosting — Free trial available
  2. Read the StreamPark docs — Learn about advanced features
  3. Join the community — Get help and share your experience

Stream processing doesn’t have to be complicated. With hosted Flink, you can focus on building applications that deliver real-time value to your business.


StreamPark Team

Building the future of stream processing