These are the docs for Azure Native v1. We recommend using the latest version, Azure Native v2.
Azure Native v1 v1.104.0 published on Thursday, Jul 6, 2023 by Pulumi

azure-native.videoanalyzer.PipelineTopology


    Pipeline topology describes the processing steps to be applied when processing content for a particular outcome. The topology should be defined according to the scenario to be achieved and can be reused across many pipeline instances which share the same processing characteristics. For instance, a pipeline topology which captures content from an RTSP camera and archives the content can be reused across many different cameras, as long as the same processing is to be applied across all the cameras. Individual instance properties can be defined through the use of user-defined parameters, which allow a topology to be parameterized. This allows individual pipelines to refer to different values, such as individual cameras’ RTSP endpoints and credentials (a sketch of this binding follows the list below). Overall, a topology is composed of the following:

    • Parameters: list of user-defined parameters that can be referenced across the topology nodes.
    • Sources: list of one or more data source nodes, such as an RTSP source, which allow content to be ingested from cameras.
    • Processors: list of nodes which perform data analysis or transformations.
    • Sinks: list of one or more data sinks which allow data to be stored or exported to other destinations.

    API Version: 2021-11-01-preview.
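
    For instance, once such a topology exists, each pipeline instance binds its own values to the declared parameters. The TypeScript sketch below is illustrative only: it assumes the LivePipeline resource from the same videoanalyzer module (not documented on this page), and all names and values are placeholders.

    import * as azure_native from "@pulumi/azure-native";

    // Hypothetical pipeline instance reusing the topology defined in the Example Usage below.
    // Resource names, parameter names, and values are placeholders for illustration.
    const camera1 = new azure_native.videoanalyzer.LivePipeline("camera1", {
        accountName: "testaccount2",
        resourceGroupName: "testrg",
        livePipelineName: "camera1",
        topologyName: "pipelineTopology1",
        bitrateKbps: 500,
        parameters: [
            { name: "rtspUrlParameter", value: "rtsp://camera1.contoso.example/stream" },
            { name: "rtspPasswordParameter", value: "camera1-password" },
        ],
    });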

    Example Usage

    Create or update a pipeline topology with an Rtsp source and video sink.

    using System.Collections.Generic;
    using System.Linq;
    using Pulumi;
    using AzureNative = Pulumi.AzureNative;
    
    return await Deployment.RunAsync(() => 
    {
        var pipelineTopology = new AzureNative.VideoAnalyzer.PipelineTopology("pipelineTopology", new()
        {
            AccountName = "testaccount2",
            Description = "Pipeline Topology 1 Description",
            Kind = "Live",
            Parameters = new[]
            {
                new AzureNative.VideoAnalyzer.Inputs.ParameterDeclarationArgs
                {
                    Default = "rtsp://microsoft.com/video.mp4",
                    Description = "rtsp source url parameter",
                    Name = "rtspUrlParameter",
                    Type = "String",
                },
                new AzureNative.VideoAnalyzer.Inputs.ParameterDeclarationArgs
                {
                    Default = "password",
                    Description = "rtsp source password parameter",
                    Name = "rtspPasswordParameter",
                    Type = "SecretString",
                },
            },
            PipelineTopologyName = "pipelineTopology1",
            ResourceGroupName = "testrg",
            Sinks = new[]
            {
                new AzureNative.VideoAnalyzer.Inputs.VideoSinkArgs
                {
                    Inputs = new[]
                    {
                        new AzureNative.VideoAnalyzer.Inputs.NodeInputArgs
                        {
                            NodeName = "rtspSource",
                        },
                    },
                    Name = "videoSink",
                    Type = "#Microsoft.VideoAnalyzer.VideoSink",
                    VideoCreationProperties = new AzureNative.VideoAnalyzer.Inputs.VideoCreationPropertiesArgs
                    {
                        Description = "Parking lot south entrance",
                        SegmentLength = "PT30S",
                        Title = "Parking Lot (Camera 1)",
                    },
                    VideoName = "camera001",
                    VideoPublishingOptions = new AzureNative.VideoAnalyzer.Inputs.VideoPublishingOptionsArgs
                    {
                        DisableArchive = "false",
                        DisableRtspPublishing = "true",
                    },
                },
            },
            Sku = new AzureNative.VideoAnalyzer.Inputs.SkuArgs
            {
                Name = "Live_S1",
            },
            Sources = new[]
            {
                new AzureNative.VideoAnalyzer.Inputs.RtspSourceArgs
                {
                    Endpoint = new AzureNative.VideoAnalyzer.Inputs.UnsecuredEndpointArgs
                    {
                        Credentials = new AzureNative.VideoAnalyzer.Inputs.UsernamePasswordCredentialsArgs
                        {
                            Password = "${rtspPasswordParameter}",
                            Type = "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
                            Username = "username",
                        },
                        Type = "#Microsoft.VideoAnalyzer.UnsecuredEndpoint",
                        Url = "${rtspUrlParameter}",
                    },
                    Name = "rtspSource",
                    Transport = "Http",
                    Type = "#Microsoft.VideoAnalyzer.RtspSource",
                },
            },
        });
    
    });
    
    package main
    
    import (
    	videoanalyzer "github.com/pulumi/pulumi-azure-native-sdk/videoanalyzer"
    	"github.com/pulumi/pulumi/sdk/v3/go/pulumi"
    )
    
    func main() {
    	pulumi.Run(func(ctx *pulumi.Context) error {
    		_, err := videoanalyzer.NewPipelineTopology(ctx, "pipelineTopology", &videoanalyzer.PipelineTopologyArgs{
    			AccountName: pulumi.String("testaccount2"),
    			Description: pulumi.String("Pipeline Topology 1 Description"),
    			Kind:        pulumi.String("Live"),
    			Parameters: videoanalyzer.ParameterDeclarationArray{
    				&videoanalyzer.ParameterDeclarationArgs{
    					Default:     pulumi.String("rtsp://microsoft.com/video.mp4"),
    					Description: pulumi.String("rtsp source url parameter"),
    					Name:        pulumi.String("rtspUrlParameter"),
    					Type:        pulumi.String("String"),
    				},
    				&videoanalyzer.ParameterDeclarationArgs{
    					Default:     pulumi.String("password"),
    					Description: pulumi.String("rtsp source password parameter"),
    					Name:        pulumi.String("rtspPasswordParameter"),
    					Type:        pulumi.String("SecretString"),
    				},
    			},
    			PipelineTopologyName: pulumi.String("pipelineTopology1"),
    			ResourceGroupName:    pulumi.String("testrg"),
    			Sinks: videoanalyzer.VideoSinkArray{
    				&videoanalyzer.VideoSinkArgs{
    					Inputs: videoanalyzer.NodeInputArray{
    						&videoanalyzer.NodeInputArgs{
    							NodeName: pulumi.String("rtspSource"),
    						},
    					},
    					Name: pulumi.String("videoSink"),
    					Type: pulumi.String("#Microsoft.VideoAnalyzer.VideoSink"),
    					VideoCreationProperties: &videoanalyzer.VideoCreationPropertiesArgs{
    						Description:   pulumi.String("Parking lot south entrance"),
    						SegmentLength: pulumi.String("PT30S"),
    						Title:         pulumi.String("Parking Lot (Camera 1)"),
    					},
    					VideoName: pulumi.String("camera001"),
    					VideoPublishingOptions: &videoanalyzer.VideoPublishingOptionsArgs{
    						DisableArchive:        pulumi.String("false"),
    						DisableRtspPublishing: pulumi.String("true"),
    					},
    				},
    			},
    			Sku: &videoanalyzer.SkuArgs{
    				Name: pulumi.String("Live_S1"),
    			},
    			Sources: pulumi.Array{
    				videoanalyzer.RtspSourceArgs{
    					Endpoint: videoanalyzer.UnsecuredEndpointArgs{
    						Credentials: videoanalyzer.UsernamePasswordCredentialsArgs{
    							Password: pulumi.String("${rtspPasswordParameter}"),
    							Type:     pulumi.String("#Microsoft.VideoAnalyzer.UsernamePasswordCredentials"),
    							Username: pulumi.String("username"),
    						},
    						Type: pulumi.String("#Microsoft.VideoAnalyzer.UnsecuredEndpoint"),
    						Url:  pulumi.String("${rtspUrlParameter}"),
    					},
    					Name:      pulumi.String("rtspSource"),
    					Transport: pulumi.String("Http"),
    					Type:      pulumi.String("#Microsoft.VideoAnalyzer.RtspSource"),
    				},
    			},
    		})
    		if err != nil {
    			return err
    		}
    		return nil
    	})
    }
    
    package generated_program;
    
    import com.pulumi.Context;
    import com.pulumi.Pulumi;
    import com.pulumi.core.Output;
    import com.pulumi.azurenative.videoanalyzer.PipelineTopology;
    import com.pulumi.azurenative.videoanalyzer.PipelineTopologyArgs;
    import java.util.List;
    import java.util.ArrayList;
    import java.util.Map;
    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Paths;
    
    public class App {
        public static void main(String[] args) {
            Pulumi.run(App::stack);
        }
    
        public static void stack(Context ctx) {
            var pipelineTopology = new PipelineTopology("pipelineTopology", PipelineTopologyArgs.builder()        
                .accountName("testaccount2")
                .description("Pipeline Topology 1 Description")
                .kind("Live")
                .parameters(            
                    Map.ofEntries(
                        Map.entry("default", "rtsp://microsoft.com/video.mp4"),
                        Map.entry("description", "rtsp source url parameter"),
                        Map.entry("name", "rtspUrlParameter"),
                        Map.entry("type", "String")
                    ),
                    Map.ofEntries(
                        Map.entry("default", "password"),
                        Map.entry("description", "rtsp source password parameter"),
                        Map.entry("name", "rtspPasswordParameter"),
                        Map.entry("type", "SecretString")
                    ))
                .pipelineTopologyName("pipelineTopology1")
                .resourceGroupName("testrg")
                .sinks(Map.ofEntries(
                    Map.entry("inputs", Map.of("nodeName", "rtspSource")),
                    Map.entry("name", "videoSink"),
                    Map.entry("type", "#Microsoft.VideoAnalyzer.VideoSink"),
                    Map.entry("videoCreationProperties", Map.ofEntries(
                        Map.entry("description", "Parking lot south entrance"),
                        Map.entry("segmentLength", "PT30S"),
                        Map.entry("title", "Parking Lot (Camera 1)")
                    )),
                    Map.entry("videoName", "camera001"),
                    Map.entry("videoPublishingOptions", Map.ofEntries(
                        Map.entry("disableArchive", "false"),
                        Map.entry("disableRtspPublishing", "true")
                    ))
                ))
                .sku(Map.of("name", "Live_S1"))
                .sources(Map.ofEntries(
                    Map.entry("endpoint", Map.ofEntries(
                        Map.entry("credentials", Map.ofEntries(
                            Map.entry("password", "${rtspPasswordParameter}"),
                            Map.entry("type", "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials"),
                            Map.entry("username", "username")
                        )),
                        Map.entry("type", "#Microsoft.VideoAnalyzer.UnsecuredEndpoint"),
                        Map.entry("url", "${rtspUrlParameter}")
                    )),
                    Map.entry("name", "rtspSource"),
                    Map.entry("transport", "Http"),
                    Map.entry("type", "#Microsoft.VideoAnalyzer.RtspSource")
                ))
                .build());
    
        }
    }
    
    import pulumi
    import pulumi_azure_native as azure_native
    
    pipeline_topology = azure_native.videoanalyzer.PipelineTopology("pipelineTopology",
        account_name="testaccount2",
        description="Pipeline Topology 1 Description",
        kind="Live",
        parameters=[
            azure_native.videoanalyzer.ParameterDeclarationArgs(
                default="rtsp://microsoft.com/video.mp4",
                description="rtsp source url parameter",
                name="rtspUrlParameter",
                type="String",
            ),
            azure_native.videoanalyzer.ParameterDeclarationArgs(
                default="password",
                description="rtsp source password parameter",
                name="rtspPasswordParameter",
                type="SecretString",
            ),
        ],
        pipeline_topology_name="pipelineTopology1",
        resource_group_name="testrg",
    sinks=[azure_native.videoanalyzer.VideoSinkArgs(
            inputs=[azure_native.videoanalyzer.NodeInputArgs(
                node_name="rtspSource",
            )],
            name="videoSink",
            type="#Microsoft.VideoAnalyzer.VideoSink",
            video_creation_properties=azure_native.videoanalyzer.VideoCreationPropertiesArgs(
                description="Parking lot south entrance",
                segment_length="PT30S",
                title="Parking Lot (Camera 1)",
            ),
            video_name="camera001",
            video_publishing_options=azure_native.videoanalyzer.VideoPublishingOptionsArgs(
                disable_archive="false",
                disable_rtsp_publishing="true",
            ),
        )],
        sku=azure_native.videoanalyzer.SkuArgs(
            name="Live_S1",
        ),
        sources=[azure_native.videoanalyzer.RtspSourceArgs(
            endpoint=azure_native.videoanalyzer.UnsecuredEndpointArgs(
                credentials=azure_native.videoanalyzer.UsernamePasswordCredentialsArgs(
                    password="${rtspPasswordParameter}",
                    type="#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
                    username="username",
                ),
                type="#Microsoft.VideoAnalyzer.UnsecuredEndpoint",
                url="${rtspUrlParameter}",
            ),
            name="rtspSource",
            transport="Http",
            type="#Microsoft.VideoAnalyzer.RtspSource",
        )])
    
    import * as pulumi from "@pulumi/pulumi";
    import * as azure_native from "@pulumi/azure-native";
    
    const pipelineTopology = new azure_native.videoanalyzer.PipelineTopology("pipelineTopology", {
        accountName: "testaccount2",
        description: "Pipeline Topology 1 Description",
        kind: "Live",
        parameters: [
            {
                "default": "rtsp://microsoft.com/video.mp4",
                description: "rtsp source url parameter",
                name: "rtspUrlParameter",
                type: "String",
            },
            {
                "default": "password",
                description: "rtsp source password parameter",
                name: "rtspPasswordParameter",
                type: "SecretString",
            },
        ],
        pipelineTopologyName: "pipelineTopology1",
        resourceGroupName: "testrg",
        sinks: [{
            inputs: [{
                nodeName: "rtspSource",
            }],
            name: "videoSink",
            type: "#Microsoft.VideoAnalyzer.VideoSink",
            videoCreationProperties: {
                description: "Parking lot south entrance",
                segmentLength: "PT30S",
                title: "Parking Lot (Camera 1)",
            },
            videoName: "camera001",
            videoPublishingOptions: {
                disableArchive: "false",
                disableRtspPublishing: "true",
            },
        }],
        sku: {
            name: "Live_S1",
        },
        sources: [{
            endpoint: {
                credentials: {
                    password: "${rtspPasswordParameter}",
                    type: "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
                    username: "username",
                },
                type: "#Microsoft.VideoAnalyzer.UnsecuredEndpoint",
                url: "${rtspUrlParameter}",
            },
            name: "rtspSource",
            transport: "Http",
            type: "#Microsoft.VideoAnalyzer.RtspSource",
        }],
    });
    
    resources:
      pipelineTopology:
        type: azure-native:videoanalyzer:PipelineTopology
        properties:
          accountName: testaccount2
          description: Pipeline Topology 1 Description
          kind: Live
          parameters:
            - default: rtsp://microsoft.com/video.mp4
              description: rtsp source url parameter
              name: rtspUrlParameter
              type: String
            - default: password
              description: rtsp source password parameter
              name: rtspPasswordParameter
              type: SecretString
          pipelineTopologyName: pipelineTopology1
          resourceGroupName: testrg
          sinks:
            - inputs:
                - nodeName: rtspSource
              name: videoSink
              type: '#Microsoft.VideoAnalyzer.VideoSink'
              videoCreationProperties:
                description: Parking lot south entrance
                segmentLength: PT30S
                title: Parking Lot (Camera 1)
              videoName: camera001
              videoPublishingOptions:
                disableArchive: 'false'
                disableRtspPublishing: 'true'
          sku:
            name: Live_S1
          sources:
            - endpoint:
                credentials:
                  password: ${rtspPasswordParameter}
                  type: '#Microsoft.VideoAnalyzer.UsernamePasswordCredentials'
                  username: username
                type: '#Microsoft.VideoAnalyzer.UnsecuredEndpoint'
                url: ${rtspUrlParameter}
              name: rtspSource
              transport: Http
              type: '#Microsoft.VideoAnalyzer.RtspSource'
    

    Create PipelineTopology Resource

    Resources are created with functions called constructors. To learn more about declaring and configuring resources, see Resources.

    Constructor syntax

    new PipelineTopology(name: string, args: PipelineTopologyArgs, opts?: CustomResourceOptions);
    @overload
    def PipelineTopology(resource_name: str,
                         args: PipelineTopologyArgs,
                         opts: Optional[ResourceOptions] = None)
    
    @overload
    def PipelineTopology(resource_name: str,
                         opts: Optional[ResourceOptions] = None,
                         account_name: Optional[str] = None,
                         kind: Optional[Union[str, Kind]] = None,
                         resource_group_name: Optional[str] = None,
                         sinks: Optional[Sequence[VideoSinkArgs]] = None,
                         sku: Optional[SkuArgs] = None,
                         sources: Optional[Sequence[Union[RtspSourceArgs, VideoSourceArgs]]] = None,
                         description: Optional[str] = None,
                         parameters: Optional[Sequence[ParameterDeclarationArgs]] = None,
                         pipeline_topology_name: Optional[str] = None,
                         processors: Optional[Sequence[EncoderProcessorArgs]] = None)
    func NewPipelineTopology(ctx *Context, name string, args PipelineTopologyArgs, opts ...ResourceOption) (*PipelineTopology, error)
    public PipelineTopology(string name, PipelineTopologyArgs args, CustomResourceOptions? opts = null)
    public PipelineTopology(String name, PipelineTopologyArgs args)
    public PipelineTopology(String name, PipelineTopologyArgs args, CustomResourceOptions options)
    
    type: azure-native:videoanalyzer:PipelineTopology
    properties: # The arguments to resource properties.
    options: # Bag of options to control resource's behavior.
    
    

    Parameters

    name string
    The unique name of the resource.
    args PipelineTopologyArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    resource_name str
    The unique name of the resource.
    args PipelineTopologyArgs
    The arguments to resource properties.
    opts ResourceOptions
    Bag of options to control resource's behavior.
    ctx Context
    Context object for the current deployment.
    name string
    The unique name of the resource.
    args PipelineTopologyArgs
    The arguments to resource properties.
    opts ResourceOption
    Bag of options to control resource's behavior.
    name string
    The unique name of the resource.
    args PipelineTopologyArgs
    The arguments to resource properties.
    opts CustomResourceOptions
    Bag of options to control resource's behavior.
    name String
    The unique name of the resource.
    args PipelineTopologyArgs
    The arguments to resource properties.
    options CustomResourceOptions
    Bag of options to control resource's behavior.
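
    As a small, hedged illustration of the options bag, the TypeScript snippet below protects the resource from accidental deletion. The property values are placeholders and the topology nodes are left empty for brevity; see the Example Usage above for realistic sources and sinks.

    import * as azure_native from "@pulumi/azure-native";

    // Placeholder values; the empty sources/sinks arrays stand in for real topology nodes.
    const topology = new azure_native.videoanalyzer.PipelineTopology("example", {
        accountName: "testaccount2",
        resourceGroupName: "testrg",
        kind: "Live",
        sku: { name: "Live_S1" },
        sources: [],
        sinks: [],
    }, {
        protect: true, // opt-in safeguard against accidental deletion
    });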

    Constructor example

    The following reference example uses placeholder values for all input properties.

    var pipelineTopologyResource = new AzureNative.VideoAnalyzer.PipelineTopology("pipelineTopologyResource", new()
    {
        AccountName = "string",
        Kind = "string",
        ResourceGroupName = "string",
        Sinks = new[]
        {
            
            {
                { "inputs", new[]
                {
                    
                    {
                        { "nodeName", "string" },
                    },
                } },
                { "name", "string" },
                { "type", "#Microsoft.VideoAnalyzer.VideoSink" },
                { "videoName", "string" },
                { "videoCreationProperties", 
                {
                    { "description", "string" },
                    { "retentionPeriod", "string" },
                    { "segmentLength", "string" },
                    { "title", "string" },
                } },
                { "videoPublishingOptions", 
                {
                    { "disableArchive", "string" },
                    { "disableRtspPublishing", "string" },
                } },
            },
        },
        Sku = 
        {
            { "name", "string" },
        },
        Sources = new[]
        {
            
            {
                { "endpoint", 
                {
                    { "credentials", 
                    {
                        { "password", "string" },
                        { "type", "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials" },
                        { "username", "string" },
                    } },
                    { "type", "#Microsoft.VideoAnalyzer.TlsEndpoint" },
                    { "url", "string" },
                    { "trustedCertificates", 
                    {
                        { "certificates", new[]
                        {
                            "string",
                        } },
                        { "type", "#Microsoft.VideoAnalyzer.PemCertificateList" },
                    } },
                    { "tunnel", 
                    {
                        { "deviceId", "string" },
                        { "iotHubName", "string" },
                        { "type", "#Microsoft.VideoAnalyzer.SecureIotDeviceRemoteTunnel" },
                    } },
                    { "validationOptions", 
                    {
                        { "ignoreHostname", "string" },
                        { "ignoreSignature", "string" },
                    } },
                } },
                { "name", "string" },
                { "type", "#Microsoft.VideoAnalyzer.RtspSource" },
                { "transport", "string" },
            },
        },
        Description = "string",
        Parameters = new[]
        {
            
            {
                { "name", "string" },
                { "type", "string" },
                { "default", "string" },
                { "description", "string" },
            },
        },
        PipelineTopologyName = "string",
        Processors = new[]
        {
            
            {
                { "inputs", new[]
                {
                    
                    {
                        { "nodeName", "string" },
                    },
                } },
                { "name", "string" },
                { "preset", 
                {
                    { "type", "#Microsoft.VideoAnalyzer.EncoderCustomPreset" },
                    { "audioEncoder", 
                    {
                        { "type", "#Microsoft.VideoAnalyzer.AudioEncoderAac" },
                        { "bitrateKbps", "string" },
                    } },
                    { "videoEncoder", 
                    {
                        { "type", "#Microsoft.VideoAnalyzer.VideoEncoderH264" },
                        { "bitrateKbps", "string" },
                        { "frameRate", "string" },
                        { "scale", 
                        {
                            { "height", "string" },
                            { "mode", "string" },
                            { "width", "string" },
                        } },
                    } },
                } },
                { "type", "#Microsoft.VideoAnalyzer.EncoderProcessor" },
            },
        },
    });
    
    example, err := videoanalyzer.NewPipelineTopology(ctx, "pipelineTopologyResource", &videoanalyzer.PipelineTopologyArgs{
    	AccountName:       "string",
    	Kind:              "string",
    	ResourceGroupName: "string",
    	Sinks: []map[string]interface{}{
    		map[string]interface{}{
    			"inputs": []map[string]interface{}{
    				map[string]interface{}{
    					"nodeName": "string",
    				},
    			},
    			"name":      "string",
    			"type":      "#Microsoft.VideoAnalyzer.VideoSink",
    			"videoName": "string",
    			"videoCreationProperties": map[string]interface{}{
    				"description":     "string",
    				"retentionPeriod": "string",
    				"segmentLength":   "string",
    				"title":           "string",
    			},
    			"videoPublishingOptions": map[string]interface{}{
    				"disableArchive":        "string",
    				"disableRtspPublishing": "string",
    			},
    		},
    	},
    	Sku: map[string]interface{}{
    		"name": "string",
    	},
    	Sources: []map[string]interface{}{
    		map[string]interface{}{
    			"endpoint": map[string]interface{}{
    				"credentials": map[string]interface{}{
    					"password": "string",
    					"type":     "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
    					"username": "string",
    				},
    				"type": "#Microsoft.VideoAnalyzer.TlsEndpoint",
    				"url":  "string",
    				"trustedCertificates": map[string]interface{}{
    					"certificates": []string{
    						"string",
    					},
    					"type": "#Microsoft.VideoAnalyzer.PemCertificateList",
    				},
    				"tunnel": map[string]interface{}{
    					"deviceId":   "string",
    					"iotHubName": "string",
    					"type":       "#Microsoft.VideoAnalyzer.SecureIotDeviceRemoteTunnel",
    				},
    				"validationOptions": map[string]interface{}{
    					"ignoreHostname":  "string",
    					"ignoreSignature": "string",
    				},
    			},
    			"name":      "string",
    			"type":      "#Microsoft.VideoAnalyzer.RtspSource",
    			"transport": "string",
    		},
    	},
    	Description: "string",
    	Parameters: []map[string]interface{}{
    		map[string]interface{}{
    			"name":        "string",
    			"type":        "string",
    			"default":     "string",
    			"description": "string",
    		},
    	},
    	PipelineTopologyName: "string",
    	Processors: []map[string]interface{}{
    		map[string]interface{}{
    			"inputs": []map[string]interface{}{
    				map[string]interface{}{
    					"nodeName": "string",
    				},
    			},
    			"name": "string",
    			"preset": map[string]interface{}{
    				"type": "#Microsoft.VideoAnalyzer.EncoderCustomPreset",
    				"audioEncoder": map[string]interface{}{
    					"type":        "#Microsoft.VideoAnalyzer.AudioEncoderAac",
    					"bitrateKbps": "string",
    				},
    				"videoEncoder": map[string]interface{}{
    					"type":        "#Microsoft.VideoAnalyzer.VideoEncoderH264",
    					"bitrateKbps": "string",
    					"frameRate":   "string",
    					"scale": map[string]interface{}{
    						"height": "string",
    						"mode":   "string",
    						"width":  "string",
    					},
    				},
    			},
    			"type": "#Microsoft.VideoAnalyzer.EncoderProcessor",
    		},
    	},
    })
    
    var pipelineTopologyResource = new PipelineTopology("pipelineTopologyResource", PipelineTopologyArgs.builder()
        .accountName("string")
        .kind("string")
        .resourceGroupName("string")
        .sinks(%!v(PANIC=Format method: runtime error: invalid memory address or nil pointer dereference))
        .sku(%!v(PANIC=Format method: runtime error: invalid memory address or nil pointer dereference))
        .sources(%!v(PANIC=Format method: runtime error: invalid memory address or nil pointer dereference))
        .description("string")
        .parameters(%!v(PANIC=Format method: runtime error: invalid memory address or nil pointer dereference))
        .pipelineTopologyName("string")
        .processors(%!v(PANIC=Format method: runtime error: invalid memory address or nil pointer dereference))
        .build());
    
    pipeline_topology_resource = azure_native.videoanalyzer.PipelineTopology("pipelineTopologyResource",
        account_name="string",
        kind="string",
        resource_group_name="string",
        sinks=[{
            "inputs": [{
                "node_name": "string",
            }],
            "name": "string",
            "type": "#Microsoft.VideoAnalyzer.VideoSink",
            "video_name": "string",
            "video_creation_properties": {
                "description": "string",
                "retention_period": "string",
                "segment_length": "string",
                "title": "string",
            },
            "video_publishing_options": {
                "disable_archive": "string",
                "disable_rtsp_publishing": "string",
            },
        }],
        sku={
            "name": "string",
        },
        sources=[{
            "endpoint": {
                "credentials": {
                    "password": "string",
                    "type": "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
                    "username": "string",
                },
                "type": "#Microsoft.VideoAnalyzer.TlsEndpoint",
                "url": "string",
                "trusted_certificates": {
                    "certificates": ["string"],
                    "type": "#Microsoft.VideoAnalyzer.PemCertificateList",
                },
                "tunnel": {
                    "device_id": "string",
                    "iot_hub_name": "string",
                    "type": "#Microsoft.VideoAnalyzer.SecureIotDeviceRemoteTunnel",
                },
                "validation_options": {
                    "ignore_hostname": "string",
                    "ignore_signature": "string",
                },
            },
            "name": "string",
            "type": "#Microsoft.VideoAnalyzer.RtspSource",
            "transport": "string",
        }],
        description="string",
        parameters=[{
            "name": "string",
            "type": "string",
            "default": "string",
            "description": "string",
        }],
        pipeline_topology_name="string",
        processors=[{
            "inputs": [{
                "node_name": "string",
            }],
            "name": "string",
            "preset": {
                "type": "#Microsoft.VideoAnalyzer.EncoderCustomPreset",
                "audio_encoder": {
                    "type": "#Microsoft.VideoAnalyzer.AudioEncoderAac",
                    "bitrate_kbps": "string",
                },
                "video_encoder": {
                    "type": "#Microsoft.VideoAnalyzer.VideoEncoderH264",
                    "bitrate_kbps": "string",
                    "frame_rate": "string",
                    "scale": {
                        "height": "string",
                        "mode": "string",
                        "width": "string",
                    },
                },
            },
            "type": "#Microsoft.VideoAnalyzer.EncoderProcessor",
        }])
    
    const pipelineTopologyResource = new azure_native.videoanalyzer.PipelineTopology("pipelineTopologyResource", {
        accountName: "string",
        kind: "string",
        resourceGroupName: "string",
        sinks: [{
            inputs: [{
                nodeName: "string",
            }],
            name: "string",
            type: "#Microsoft.VideoAnalyzer.VideoSink",
            videoName: "string",
            videoCreationProperties: {
                description: "string",
                retentionPeriod: "string",
                segmentLength: "string",
                title: "string",
            },
            videoPublishingOptions: {
                disableArchive: "string",
                disableRtspPublishing: "string",
            },
        }],
        sku: {
            name: "string",
        },
        sources: [{
            endpoint: {
                credentials: {
                    password: "string",
                    type: "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
                    username: "string",
                },
                type: "#Microsoft.VideoAnalyzer.TlsEndpoint",
                url: "string",
                trustedCertificates: {
                    certificates: ["string"],
                    type: "#Microsoft.VideoAnalyzer.PemCertificateList",
                },
                tunnel: {
                    deviceId: "string",
                    iotHubName: "string",
                    type: "#Microsoft.VideoAnalyzer.SecureIotDeviceRemoteTunnel",
                },
                validationOptions: {
                    ignoreHostname: "string",
                    ignoreSignature: "string",
                },
            },
            name: "string",
            type: "#Microsoft.VideoAnalyzer.RtspSource",
            transport: "string",
        }],
        description: "string",
        parameters: [{
            name: "string",
            type: "string",
            "default": "string",
            description: "string",
        }],
        pipelineTopologyName: "string",
        processors: [{
            inputs: [{
                nodeName: "string",
            }],
            name: "string",
            preset: {
                type: "#Microsoft.VideoAnalyzer.EncoderCustomPreset",
                audioEncoder: {
                    type: "#Microsoft.VideoAnalyzer.AudioEncoderAac",
                    bitrateKbps: "string",
                },
                videoEncoder: {
                    type: "#Microsoft.VideoAnalyzer.VideoEncoderH264",
                    bitrateKbps: "string",
                    frameRate: "string",
                    scale: {
                        height: "string",
                        mode: "string",
                        width: "string",
                    },
                },
            },
            type: "#Microsoft.VideoAnalyzer.EncoderProcessor",
        }],
    });
    
    type: azure-native:videoanalyzer:PipelineTopology
    properties:
        accountName: string
        description: string
        kind: string
        parameters:
            - default: string
              description: string
              name: string
              type: string
        pipelineTopologyName: string
        processors:
            - inputs:
                - nodeName: string
              name: string
              preset:
                audioEncoder:
                    bitrateKbps: string
                    type: '#Microsoft.VideoAnalyzer.AudioEncoderAac'
                type: '#Microsoft.VideoAnalyzer.EncoderCustomPreset'
                videoEncoder:
                    bitrateKbps: string
                    frameRate: string
                    scale:
                        height: string
                        mode: string
                        width: string
                    type: '#Microsoft.VideoAnalyzer.VideoEncoderH264'
              type: '#Microsoft.VideoAnalyzer.EncoderProcessor'
        resourceGroupName: string
        sinks:
            - inputs:
                - nodeName: string
              name: string
              type: '#Microsoft.VideoAnalyzer.VideoSink'
              videoCreationProperties:
                description: string
                retentionPeriod: string
                segmentLength: string
                title: string
              videoName: string
              videoPublishingOptions:
                disableArchive: string
                disableRtspPublishing: string
        sku:
            name: string
        sources:
            - endpoint:
                credentials:
                    password: string
                    type: '#Microsoft.VideoAnalyzer.UsernamePasswordCredentials'
                    username: string
                trustedCertificates:
                    certificates:
                        - string
                    type: '#Microsoft.VideoAnalyzer.PemCertificateList'
                tunnel:
                    deviceId: string
                    iotHubName: string
                    type: '#Microsoft.VideoAnalyzer.SecureIotDeviceRemoteTunnel'
                type: '#Microsoft.VideoAnalyzer.TlsEndpoint'
                url: string
                validationOptions:
                    ignoreHostname: string
                    ignoreSignature: string
              name: string
              transport: string
              type: '#Microsoft.VideoAnalyzer.RtspSource'
    

    PipelineTopology Resource Properties

    To learn more about resource properties and how to use them, see Inputs and Outputs in the Architecture and Concepts docs.

    Inputs

    The PipelineTopology resource accepts the following input properties:

    AccountName string
    The Azure Video Analyzer account name.
    Kind string | Pulumi.AzureNative.VideoAnalyzer.Kind
    Topology kind.
    ResourceGroupName string
    The name of the resource group. The name is case insensitive.
    Sinks List<Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoSink>
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    Sku Pulumi.AzureNative.VideoAnalyzer.Inputs.Sku
    Describes the properties of a SKU.
    Sources List<Union<Pulumi.AzureNative.VideoAnalyzer.Inputs.RtspSource, Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoSourceArgs>>
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    Description string
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    Parameters List<Pulumi.AzureNative.VideoAnalyzer.Inputs.ParameterDeclaration>
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    PipelineTopologyName string
    Pipeline topology unique identifier.
    Processors List<Pulumi.AzureNative.VideoAnalyzer.Inputs.EncoderProcessor>
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.
    AccountName string
    The Azure Video Analyzer account name.
    Kind string | Kind
    Topology kind.
    ResourceGroupName string
    The name of the resource group. The name is case insensitive.
    Sinks []VideoSinkArgs
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    Sku SkuArgs
    Describes the properties of a SKU.
    Sources []interface{}
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    Description string
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    Parameters []ParameterDeclarationArgs
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    PipelineTopologyName string
    Pipeline topology unique identifier.
    Processors []EncoderProcessorArgs
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.
    accountName String
    The Azure Video Analyzer account name.
    kind String | Kind
    Topology kind.
    resourceGroupName String
    The name of the resource group. The name is case insensitive.
    sinks List<VideoSink>
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    sku Sku
    Describes the properties of a SKU.
    sources List<Either<RtspSource,VideoSourceArgs>>
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    description String
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    parameters List<ParameterDeclaration>
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    pipelineTopologyName String
    Pipeline topology unique identifier.
    processors List<EncoderProcessor>
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.
    accountName string
    The Azure Video Analyzer account name.
    kind string | Kind
    Topology kind.
    resourceGroupName string
    The name of the resource group. The name is case insensitive.
    sinks VideoSink[]
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    sku Sku
    Describes the properties of a SKU.
    sources (RtspSource | VideoSourceArgs)[]
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    description string
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    parameters ParameterDeclaration[]
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    pipelineTopologyName string
    Pipeline topology unique identifier.
    processors EncoderProcessor[]
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.
    account_name str
    The Azure Video Analyzer account name.
    kind str | Kind
    Topology kind.
    resource_group_name str
    The name of the resource group. The name is case insensitive.
    sinks Sequence[VideoSinkArgs]
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    sku SkuArgs
    Describes the properties of a SKU.
    sources Sequence[Union[RtspSourceArgs, VideoSourceArgs]]
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    description str
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    parameters Sequence[ParameterDeclarationArgs]
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    pipeline_topology_name str
    Pipeline topology unique identifier.
    processors Sequence[EncoderProcessorArgs]
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.
    accountName String
    The Azure Video Analyzer account name.
    kind String | "Live" | "Batch"
    Topology kind.
    resourceGroupName String
    The name of the resource group. The name is case insensitive.
    sinks List<Property Map>
    List of the topology sink nodes. Sink nodes allow pipeline data to be stored or exported.
    sku Property Map
    Describes the properties of a SKU.
    sources List<Property Map | Property Map>
    List of the topology source nodes. Source nodes enable external data to be ingested by the pipeline.
    description String
    An optional description of the pipeline topology. It is recommended that the expected use of the topology be described here.
    parameters List<Property Map>
    List of the topology parameter declarations. Parameters declared here can be referenced throughout the topology nodes through the use of "${PARAMETER_NAME}" string pattern. Parameters can have optional default values and can later be defined in individual instances of the pipeline.
    pipelineTopologyName String
    Pipeline topology unique identifier.
    processors List<Property Map>
    List of the topology processor nodes. Processor nodes enable pipeline data to be analyzed, processed or transformed.

    Outputs

    All input properties are implicitly available as output properties. Additionally, the PipelineTopology resource produces the following output properties:

    Id string
    The provider-assigned unique ID for this managed resource.
    Name string
    The name of the resource
    SystemData Pulumi.AzureNative.VideoAnalyzer.Outputs.SystemDataResponse
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    Type string
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
    Id string
    The provider-assigned unique ID for this managed resource.
    Name string
    The name of the resource
    SystemData SystemDataResponse
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    Type string
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
    id String
    The provider-assigned unique ID for this managed resource.
    name String
    The name of the resource
    systemData SystemDataResponse
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    type String
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
    id string
    The provider-assigned unique ID for this managed resource.
    name string
    The name of the resource
    systemData SystemDataResponse
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    type string
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
    id str
    The provider-assigned unique ID for this managed resource.
    name str
    The name of the resource
    system_data SystemDataResponse
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    type str
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
    id String
    The provider-assigned unique ID for this managed resource.
    name String
    The name of the resource
    systemData Property Map
    Azure Resource Manager metadata containing createdBy and modifiedBy information.
    type String
    The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts"
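
    For example, assuming the pipelineTopology resource declared in the TypeScript Example Usage above, these output properties can be exported from the program:

    // `pipelineTopology` is the resource declared in the Example Usage above.
    export const topologyId = pipelineTopology.id;     // provider-assigned ID
    export const topologyName = pipelineTopology.name; // resource name
    export const topologyType = pipelineTopology.type; // ARM resource type string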

    Supporting Types

    AudioEncoderAac, AudioEncoderAacArgs

    BitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    BitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps String
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrate_kbps str
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps String
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.

    AudioEncoderAacResponse, AudioEncoderAacResponseArgs

    BitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    BitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps String
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps string
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrate_kbps str
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.
    bitrateKbps String
    Bitrate, in kilobits per second or Kbps, at which audio should be encoded (2-channel stereo audio at a sampling rate of 48 kHz). Allowed values are 96, 112, 128, 160, 192, 224, and 256. If omitted, the bitrate of the input audio is used.

    EncoderCustomPreset, EncoderCustomPresetArgs

    AudioEncoder Pulumi.AzureNative.VideoAnalyzer.Inputs.AudioEncoderAac
    Describes a custom preset for encoding audio.
    VideoEncoder Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoEncoderH264
    Describes a custom preset for encoding video.
    AudioEncoder AudioEncoderAac
    Describes a custom preset for encoding audio.
    VideoEncoder VideoEncoderH264
    Describes a custom preset for encoding video.
    audioEncoder AudioEncoderAac
    Describes a custom preset for encoding audio.
    videoEncoder VideoEncoderH264
    Describes a custom preset for encoding video.
    audioEncoder AudioEncoderAac
    Describes a custom preset for encoding audio.
    videoEncoder VideoEncoderH264
    Describes a custom preset for encoding video.
    audio_encoder AudioEncoderAac
    Describes a custom preset for encoding audio.
    video_encoder VideoEncoderH264
    Describes a custom preset for encoding video.
    audioEncoder Property Map
    Describes a custom preset for encoding audio.
    videoEncoder Property Map
    Describes a custom preset for encoding video.

    EncoderCustomPresetResponse, EncoderCustomPresetResponseArgs

    AudioEncoder AudioEncoderAacResponse
    Describes a custom preset for encoding audio.
    VideoEncoder VideoEncoderH264Response
    Describes a custom preset for encoding video.
    audioEncoder AudioEncoderAacResponse
    Describes a custom preset for encoding audio.
    videoEncoder VideoEncoderH264Response
    Describes a custom preset for encoding video.
    audioEncoder AudioEncoderAacResponse
    Describes a custom preset for encoding audio.
    videoEncoder VideoEncoderH264Response
    Describes a custom preset for encoding video.
    audio_encoder AudioEncoderAacResponse
    Describes a custom preset for encoding audio.
    video_encoder VideoEncoderH264Response
    Describes a custom preset for encoding video.
    audioEncoder Property Map
    Describes a custom preset for encoding audio.
    videoEncoder Property Map
    Describes a custom preset for encoding video.

    EncoderProcessor, EncoderProcessorArgs

    Inputs List<Pulumi.AzureNative.VideoAnalyzer.Inputs.NodeInput>
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    Preset Pulumi.AzureNative.VideoAnalyzer.Inputs.EncoderCustomPreset | Pulumi.AzureNative.VideoAnalyzer.Inputs.EncoderSystemPreset
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    Inputs []NodeInput
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    Preset EncoderCustomPreset | EncoderSystemPreset
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs List<NodeInput>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    preset EncoderCustomPreset | EncoderSystemPreset
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs NodeInput[]
    An array of upstream node references within the topology to be used as inputs for this node.
    name string
    Node name. Must be unique within the topology.
    preset EncoderCustomPreset | EncoderSystemPreset
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs Sequence[NodeInput]
    An array of upstream node references within the topology to be used as inputs for this node.
    name str
    Node name. Must be unique within the topology.
    preset EncoderCustomPreset | EncoderSystemPreset
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs List<Property Map>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    preset Property Map | Property Map
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
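
    A hedged TypeScript sketch of how such a processor node could be added to the topology from the Example Usage: the "#Microsoft.VideoAnalyzer.EncoderSystemPreset" discriminator is an assumption that follows the naming pattern of the other node types on this page, and the preset name is taken from the EncoderSystemPresetType values below.

    // Hypothetical entry for the `processors` list; re-encodes the RTSP feed with a built-in preset.
    const encoderProcessor = {
        name: "encoderProcessor",
        type: "#Microsoft.VideoAnalyzer.EncoderProcessor",
        inputs: [{ nodeName: "rtspSource" }],
        preset: {
            type: "#Microsoft.VideoAnalyzer.EncoderSystemPreset", // assumed discriminator
            name: "SingleLayer_720p_H264_AAC",
        },
    };
    // The video sink would then take its input from the processor instead of the source:
    // inputs: [{ nodeName: "encoderProcessor" }]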

    EncoderProcessorResponse, EncoderProcessorResponseArgs

    Inputs List<Pulumi.AzureNative.VideoAnalyzer.Inputs.NodeInputResponse>
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    Preset Pulumi.AzureNative.VideoAnalyzer.Inputs.EncoderCustomPresetResponse | Pulumi.AzureNative.VideoAnalyzer.Inputs.EncoderSystemPresetResponse
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    Inputs []NodeInputResponse
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    Preset EncoderCustomPresetResponse | EncoderSystemPresetResponse
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs List<NodeInputResponse>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    preset EncoderCustomPresetResponse | EncoderSystemPresetResponse
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs NodeInputResponse[]
    An array of upstream node references within the topology to be used as inputs for this node.
    name string
    Node name. Must be unique within the topology.
    preset EncoderCustomPresetResponse | EncoderSystemPresetResponse
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs Sequence[NodeInputResponse]
    An array of upstream node references within the topology to be used as inputs for this node.
    name str
    Node name. Must be unique within the topology.
    preset EncoderCustomPresetResponse | EncoderSystemPresetResponse
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.
    inputs List<Property Map>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    preset Property Map | Property Map
    The encoder preset, which defines the recipe or instructions on how the input content should be processed.

    EncoderSystemPreset, EncoderSystemPresetArgs

    Name string | Pulumi.AzureNative.VideoAnalyzer.EncoderSystemPresetType
    Name of the built-in encoding preset.
    Name string | EncoderSystemPresetType
    Name of the built-in encoding preset.
    name String | EncoderSystemPresetType
    Name of the built-in encoding preset.
    name string | EncoderSystemPresetType
    Name of the built-in encoding preset.
    name str | EncoderSystemPresetType
    Name of the built-in encoding preset.

    EncoderSystemPresetResponse, EncoderSystemPresetResponseArgs

    Name string
    Name of the built-in encoding preset.
    Name string
    Name of the built-in encoding preset.
    name String
    Name of the built-in encoding preset.
    name string
    Name of the built-in encoding preset.
    name str
    Name of the built-in encoding preset.
    name String
    Name of the built-in encoding preset.

    EncoderSystemPresetType, EncoderSystemPresetTypeArgs

    SingleLayer_540p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_720p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_1080p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SingleLayer_2160p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    EncoderSystemPresetType_SingleLayer_540p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    EncoderSystemPresetType_SingleLayer_720p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    EncoderSystemPresetType_SingleLayer_1080p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    EncoderSystemPresetType_SingleLayer_2160p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SingleLayer_540p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_720p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_1080p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SingleLayer_2160p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SingleLayer_540p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_720p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SingleLayer_1080p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SingleLayer_2160p_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SINGLE_LAYER_540P_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SINGLE_LAYER_720P_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    SINGLE_LAYER_1080P_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    SINGLE_LAYER_2160P_H264_AAC
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    "SingleLayer_540p_H264_AAC"
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 540 pixels, and at a maximum bitrate of 2000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    "SingleLayer_720p_H264_AAC"
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 720 pixels, and at a maximum bitrate of 3500 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 96 Kbps.
    "SingleLayer_1080p_H264_AAC"
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 1080 pixels, and at a maximum bitrate of 6000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.
    "SingleLayer_2160p_H264_AAC"
    Produces an MP4 file where the video is encoded with H.264 codec at a picture height of 2160 pixels, and at a maximum bitrate of 16000 Kbps. Encoded video has the same average frame rate as the input. The aspect ratio of the input is preserved. If the input content has audio, then it is encoded with AAC-LC codec at 128 Kbps.

    Kind, KindArgs

    Live
    Live pipeline topology resource.
    Batch
    Batch pipeline topology resource.
    KindLive
    Live pipeline topology resource.
    KindBatch
    Batch pipeline topology resource.
    Live
    Live pipeline topology resource.
    Batch
    Batch pipeline topology resource.
    Live
    Live pipeline topology resource.
    Batch
    Batch pipeline topology resource.
    LIVE
    Live pipeline topology resource.
    BATCH
    Batch pipeline topology resource.
    "Live"
    Live pipeline topology resource.
    "Batch"
    Batch pipeline topology resource.

    NodeInput, NodeInputArgs

    NodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    NodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName String
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    node_name str
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName String
    The name of the upstream node in the pipeline whose output is used as input to the current node.
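
    A node input is nothing more than a by-name reference to the upstream node, so wiring the topology together is a matter of matching names. A minimal sketch in TypeScript (the node name "rtspSource" is a placeholder):

    // Sketch: reference the output of the node named "rtspSource" as an input.
    const upstream = { nodeName: "rtspSource" };
    // Used on a processor or sink as, e.g., inputs: [upstream].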

    NodeInputResponse, NodeInputResponseArgs

    NodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    NodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName String
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName string
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    node_name str
    The name of the upstream node in the pipeline whose output is used as input to the current node.
    nodeName String
    The name of the upstream node in the pipeline whose output is used as input to the current node.

    ParameterDeclaration, ParameterDeclarationArgs

    Name string
    Name of the parameter.
    Type string | Pulumi.AzureNative.VideoAnalyzer.ParameterType
    Type of the parameter.
    Default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    Description string
    Description of the parameter.
    Name string
    Name of the parameter.
    Type string | ParameterType
    Type of the parameter.
    Default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    Description string
    Description of the parameter.
    name String
    Name of the parameter.
    type String | ParameterType
    Type of the parameter.
    default_ String
    The default value for the parameter to be used if the pipeline does not specify a value.
    description String
    Description of the parameter.
    name string
    Name of the parameter.
    type string | ParameterType
    Type of the parameter.
    default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    description string
    Description of the parameter.
    name str
    Name of the parameter.
    type str | ParameterType
    Type of the parameter.
    default str
    The default value for the parameter to be used if the pipeline does not specify a value.
    description str
    Description of the parameter.
    name String
    Name of the parameter.
    type String | "String" | "SecretString" | "Int" | "Double" | "Bool"
    Type of the parameter.
    default String
    The default value for the parameter to be used if the pipeline does not specify a value.
    description String
    Description of the parameter.
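
    As a hedged sketch, the snippet below declares a SecretString parameter in TypeScript and notes how it would be referenced from a node. The "${...}" substitution form is an assumption here; the declaration fields follow the listing above.

    // Sketch: declare the secret once at the topology level, then reference it from nodes.
    const rtspPasswordParameter = {
        name: "rtspPasswordParameter",
        type: "SecretString",                         // keeps the value out of plain-text API responses
        description: "rtsp source password parameter",
        default: "placeholder-password",              // illustrative default; typically overridden per pipeline
    };
    // Referenced from a credentials object as, e.g., password: "${rtspPasswordParameter}".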

    ParameterDeclarationResponse, ParameterDeclarationResponseArgs

    Name string
    Name of the parameter.
    Type string
    Type of the parameter.
    Default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    Description string
    Description of the parameter.
    Name string
    Name of the parameter.
    Type string
    Type of the parameter.
    Default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    Description string
    Description of the parameter.
    name String
    Name of the parameter.
    type String
    Type of the parameter.
    default_ String
    The default value for the parameter to be used if the pipeline does not specify a value.
    description String
    Description of the parameter.
    name string
    Name of the parameter.
    type string
    Type of the parameter.
    default string
    The default value for the parameter to be used if the pipeline does not specify a value.
    description string
    Description of the parameter.
    name str
    Name of the parameter.
    type str
    Type of the parameter.
    default str
    The default value for the parameter to be used if the pipeline does not specify a value.
    description str
    Description of the parameter.
    name String
    Name of the parameter.
    type String
    Type of the parameter.
    default String
    The default value for the parameter to be used if the pipeline does not specify a value.
    description String
    Description of the parameter.

    ParameterType, ParameterTypeArgs

    String
    The parameter's value is a string.
    SecretString
    The parameter's value is a string that holds sensitive information.
    Int
    The parameter's value is a 32-bit signed integer.
    Double
    The parameter's value is a 64-bit double-precision floating point.
    Bool
    The parameter's value is a boolean value that is either true or false.
    ParameterTypeString
    The parameter's value is a string.
    ParameterTypeSecretString
    The parameter's value is a string that holds sensitive information.
    ParameterTypeInt
    The parameter's value is a 32-bit signed integer.
    ParameterTypeDouble
    The parameter's value is a 64-bit double-precision floating point.
    ParameterTypeBool
    The parameter's value is a boolean value that is either true or false.
    String
    The parameter's value is a string.
    SecretString
    The parameter's value is a string that holds sensitive information.
    Int
    The parameter's value is a 32-bit signed integer.
    Double
    The parameter's value is a 64-bit double-precision floating point.
    Bool
    The parameter's value is a boolean value that is either true or false.
    String
    The parameter's value is a string.
    SecretString
    The parameter's value is a string that holds sensitive information.
    Int
    The parameter's value is a 32-bit signed integer.
    Double
    The parameter's value is a 64-bit double-precision floating point.
    Bool
    The parameter's value is a boolean value that is either true or false.
    STRING
    The parameter's value is a string.
    SECRET_STRING
    The parameter's value is a string that holds sensitive information.
    INT
    The parameter's value is a 32-bit signed integer.
    DOUBLE
    The parameter's value is a 64-bit double-precision floating point.
    BOOL
    The parameter's value is a boolean value that is either true or false.
    "String"
    The parameter's value is a string.
    "SecretString"
    The parameter's value is a string that holds sensitive information.
    "Int"
    The parameter's value is a 32-bit signed integer.
    "Double"
    The parameter's value is a 64-bit double-precision floating point.
    "Bool"
    The parameter's value is a boolean value that is either true or false.

    PemCertificateList, PemCertificateListArgs

    Certificates List<string>
    PEM formatted public certificates. One certificate per entry.
    Certificates []string
    PEM formatted public certificates. One certificate per entry.
    certificates List<String>
    PEM formatted public certificates. One certificate per entry.
    certificates string[]
    PEM formatted public certificates. One certificate per entry.
    certificates Sequence[str]
    PEM formatted public certificates. One certificate per entry.
    certificates List<String>
    PEM formatted public certificates. One certificate per entry.

    PemCertificateListResponse, PemCertificateListResponseArgs

    Certificates List<string>
    PEM formatted public certificates. One certificate per entry.
    Certificates []string
    PEM formatted public certificates. One certificate per entry.
    certificates List<String>
    PEM formatted public certificates. One certificate per entry.
    certificates string[]
    PEM formatted public certificates. One certificate per entry.
    certificates Sequence[str]
    PEM formatted public certificates. One certificate per entry.
    certificates List<String>
    PEM formatted public certificates. One certificate per entry.

    RtspSource, RtspSourceArgs

    Endpoint Pulumi.AzureNative.VideoAnalyzer.Inputs.TlsEndpoint | Pulumi.AzureNative.VideoAnalyzer.Inputs.UnsecuredEndpoint
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    Name string
    Node name. Must be unique within the topology.
    Transport string | Pulumi.AzureNative.VideoAnalyzer.RtspTransport
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    Endpoint TlsEndpoint | UnsecuredEndpoint
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    Name string
    Node name. Must be unique within the topology.
    Transport string | RtspTransport
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpoint | UnsecuredEndpoint
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name String
    Node name. Must be unique within the topology.
    transport String | RtspTransport
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpoint | UnsecuredEndpoint
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name string
    Node name. Must be unique within the topology.
    transport string | RtspTransport
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpoint | UnsecuredEndpoint
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name str
    Node name. Must be unique within the topology.
    transport str | RtspTransport
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint Property Map | Property Map
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name String
    Node name. Must be unique within the topology.
    transport String | "Http" | "Tcp"
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
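
    A minimal TypeScript sketch of an RTSP source node follows. The "#Microsoft.VideoAnalyzer.*" discriminator strings and the "${...}" parameter references are assumptions (only the VideoSink discriminator appears verbatim earlier on this page), and the parameter names are placeholders.

    // Sketch: an RTSP source reading from a non-TLS endpoint over TCP.
    const rtspSource = {
        type: "#Microsoft.VideoAnalyzer.RtspSource",                          // assumed discriminator
        name: "rtspSource",                                                   // unique node name
        transport: "Tcp",                                                     // or "Http"; see RtspTransport below
        endpoint: {
            type: "#Microsoft.VideoAnalyzer.UnsecuredEndpoint",               // assumed discriminator
            url: "${rtspUrlParameter}",                                       // parameterized RTSP URL
            credentials: {
                type: "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials", // assumed discriminator
                username: "${rtspUsernameParameter}",                         // hypothetical parameter
                password: "${rtspPasswordParameter}",                         // SecretString parameter
            },
        },
    };
    // Passed to the PipelineTopology resource via its sources input, e.g. sources: [rtspSource].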

    RtspSourceResponse, RtspSourceResponseArgs

    Endpoint Pulumi.AzureNative.VideoAnalyzer.Inputs.TlsEndpointResponse | Pulumi.AzureNative.VideoAnalyzer.Inputs.UnsecuredEndpointResponse
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    Name string
    Node name. Must be unique within the topology.
    Transport string
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    Endpoint TlsEndpointResponse | UnsecuredEndpointResponse
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    Name string
    Node name. Must be unique within the topology.
    Transport string
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpointResponse | UnsecuredEndpointResponse
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name String
    Node name. Must be unique within the topology.
    transport String
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpointResponse | UnsecuredEndpointResponse
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name string
    Node name. Must be unique within the topology.
    transport string
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint TlsEndpointResponse | UnsecuredEndpointResponse
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name str
    Node name. Must be unique within the topology.
    transport str
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.
    endpoint Property Map | Property Map
    RTSP endpoint information for Video Analyzer to connect to. This contains the required information for Video Analyzer to connect to RTSP cameras and/or generic RTSP servers.
    name String
    Node name. Must be unique within the topology.
    transport String
    Network transport utilized by the RTSP and RTP exchange: TCP or HTTP. When using TCP, the RTP packets are interleaved on the TCP RTSP connection. When using HTTP, the RTSP messages are exchanged through long-lived HTTP connections, and the RTP packets are interleaved in the HTTP connections alongside the RTSP messages.

    RtspTransport, RtspTransportArgs

    Http
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    Tcp
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.
    RtspTransportHttp
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    RtspTransportTcp
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.
    Http
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    Tcp
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.
    Http
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    Tcp
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.
    HTTP
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    TCP
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.
    "Http"
    HTTP transport. RTSP messages are exchanged over long-running HTTP requests and RTP packets are interleaved within the HTTP channel.
    "Tcp"
    TCP transport. RTSP is used directly over TCP and RTP packets are interleaved within the TCP channel.

    SecureIotDeviceRemoteTunnel, SecureIotDeviceRemoteTunnelArgs

    DeviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    IotHubName string
    Name of the IoT Hub.
    DeviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    IotHubName string
    Name of the IoT Hub.
    deviceId String
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName String
    Name of the IoT Hub.
    deviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName string
    Name of the IoT Hub.
    device_id str
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iot_hub_name str
    Name of the IoT Hub.
    deviceId String
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName String
    Name of the IoT Hub.
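
    When the camera or RTSP server sits behind a firewall, the endpoint can be reached through an IoT Hub device tunnel. A small sketch in TypeScript; the hub and device names are placeholders:

    // Sketch: route the endpoint connection through an IoT Hub device tunnel.
    const tunnel = {
        iotHubName: "my-iot-hub",                  // hypothetical IoT Hub name
        deviceId: "behind-firewall-camera-01",     // case-sensitive IoT device id
    };
    // Attached to a TLS or unsecured endpoint as, e.g., tunnel: tunnel.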

    SecureIotDeviceRemoteTunnelResponse, SecureIotDeviceRemoteTunnelResponseArgs

    DeviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    IotHubName string
    Name of the IoT Hub.
    DeviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    IotHubName string
    Name of the IoT Hub.
    deviceId String
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName String
    Name of the IoT Hub.
    deviceId string
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName string
    Name of the IoT Hub.
    device_id str
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iot_hub_name str
    Name of the IoT Hub.
    deviceId String
    The IoT device id to use when establishing the remote tunnel. This string is case-sensitive.
    iotHubName String
    Name of the IoT Hub.

    Sku, SkuArgs

    Name string | Pulumi.AzureNative.VideoAnalyzer.SkuName
    The SKU name.
    Name string | SkuName
    The SKU name.
    name String | SkuName
    The SKU name.
    name string | SkuName
    The SKU name.
    name str | SkuName
    The SKU name.
    name String | "Live_S1" | "Batch_S1"
    The SKU name.
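
    The SKU is set on the PipelineTopology resource itself and should match the topology kind (Live_S1 for Live, Batch_S1 for Batch, per SkuName below). A minimal TypeScript sketch, assuming sku is a top-level input as the SkuArgs listing here suggests; the account, resource group, and node inputs are placeholders or omitted:

    import * as azure_native from "@pulumi/azure-native";

    // Sketch: a live topology paired with the Live_S1 SKU. Sources and sinks are left
    // empty here for brevity; see the RtspSource sketch above and the VideoSink example
    // earlier on this page.
    const liveTopology = new azure_native.videoanalyzer.PipelineTopology("liveTopology", {
        accountName: "testaccount2",
        resourceGroupName: "testrg",
        pipelineTopologyName: "pipelineTopology1",
        kind: "Live",
        sku: { name: "Live_S1" },   // a Batch topology would use { name: "Batch_S1" }
        sources: [],
        sinks: [],
    });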

    SkuName, SkuNameArgs

    Live_S1
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    Batch_S1
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.
    SkuName_Live_S1
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    SkuName_Batch_S1
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.
    Live_S1
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    Batch_S1
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.
    Live_S1
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    Batch_S1
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.
    LIVE_S1
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    BATCH_S1
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.
    "Live_S1"
    Represents the Live S1 SKU name. Using this SKU you can create live pipelines to capture, record, and stream live video from RTSP-capable cameras at bitrate settings from 0.5 Kbps to 3000 Kbps.
    "Batch_S1"
    Represents the Batch S1 SKU name. Using this SKU you can create pipeline jobs to process recorded content.

    SkuResponse, SkuResponseArgs

    Name string
    The SKU name.
    Tier string
    The SKU tier.
    Name string
    The SKU name.
    Tier string
    The SKU tier.
    name String
    The SKU name.
    tier String
    The SKU tier.
    name string
    The SKU name.
    tier string
    The SKU tier.
    name str
    The SKU name.
    tier str
    The SKU tier.
    name String
    The SKU name.
    tier String
    The SKU tier.

    SystemDataResponse, SystemDataResponseArgs

    CreatedAt string
    The timestamp of resource creation (UTC).
    CreatedBy string
    The identity that created the resource.
    CreatedByType string
    The type of identity that created the resource.
    LastModifiedAt string
    The timestamp of resource last modification (UTC).
    LastModifiedBy string
    The identity that last modified the resource.
    LastModifiedByType string
    The type of identity that last modified the resource.
    CreatedAt string
    The timestamp of resource creation (UTC).
    CreatedBy string
    The identity that created the resource.
    CreatedByType string
    The type of identity that created the resource.
    LastModifiedAt string
    The timestamp of resource last modification (UTC).
    LastModifiedBy string
    The identity that last modified the resource.
    LastModifiedByType string
    The type of identity that last modified the resource.
    createdAt String
    The timestamp of resource creation (UTC).
    createdBy String
    The identity that created the resource.
    createdByType String
    The type of identity that created the resource.
    lastModifiedAt String
    The timestamp of resource last modification (UTC).
    lastModifiedBy String
    The identity that last modified the resource.
    lastModifiedByType String
    The type of identity that last modified the resource.
    createdAt string
    The timestamp of resource creation (UTC).
    createdBy string
    The identity that created the resource.
    createdByType string
    The type of identity that created the resource.
    lastModifiedAt string
    The timestamp of resource last modification (UTC).
    lastModifiedBy string
    The identity that last modified the resource.
    lastModifiedByType string
    The type of identity that last modified the resource.
    created_at str
    The timestamp of resource creation (UTC).
    created_by str
    The identity that created the resource.
    created_by_type str
    The type of identity that created the resource.
    last_modified_at str
    The timestamp of resource last modification (UTC).
    last_modified_by str
    The identity that last modified the resource.
    last_modified_by_type str
    The type of identity that last modified the resource.
    createdAt String
    The timestamp of resource creation (UTC).
    createdBy String
    The identity that created the resource.
    createdByType String
    The type of identity that created the resource.
    lastModifiedAt String
    The timestamp of resource last modification (UTC).
    lastModifiedBy String
    The identity that last modified the resource.
    lastModifiedByType String
    The type of identity that last modified the resource.

    TlsEndpoint, TlsEndpointArgs

    Credentials Pulumi.AzureNative.VideoAnalyzer.Inputs.UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    TrustedCertificates Pulumi.AzureNative.VideoAnalyzer.Inputs.PemCertificateList
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    Tunnel Pulumi.AzureNative.VideoAnalyzer.Inputs.SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    ValidationOptions Pulumi.AzureNative.VideoAnalyzer.Inputs.TlsValidationOptions
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    Credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    TrustedCertificates PemCertificateList
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    Tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    ValidationOptions TlsValidationOptions
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates PemCertificateList
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions TlsValidationOptions
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url string
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates PemCertificateList
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions TlsValidationOptions
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url str
    The endpoint URL for Video Analyzer to connect to.
    trusted_certificates PemCertificateList
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validation_options TlsValidationOptions
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials Property Map
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates Property Map
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel Property Map
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions Property Map
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
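
    A hedged TypeScript sketch of a TLS endpoint that trusts a private CA and relaxes hostname validation follows. The "#Microsoft.VideoAnalyzer.*" discriminator strings, the URL, and the PEM body are assumptions or placeholders, not values taken from this page.

    // Sketch: TLS endpoint with a custom trusted certificate list and a relaxed hostname check.
    const tlsEndpoint = {
        type: "#Microsoft.VideoAnalyzer.TlsEndpoint",                         // assumed discriminator
        url: "rtsps://camera.internal.example:322/stream",                    // hypothetical RTSPS URL
        credentials: {
            type: "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",     // assumed discriminator
            username: "${rtspUsernameParameter}",
            password: "${rtspPasswordParameter}",
        },
        trustedCertificates: {
            type: "#Microsoft.VideoAnalyzer.PemCertificateList",              // assumed discriminator
            certificates: ["-----BEGIN CERTIFICATE-----\n...placeholder...\n-----END CERTIFICATE-----"],
        },
        validationOptions: {
            ignoreHostname: "true",     // note: these flags are strings, not booleans
            ignoreSignature: "false",
        },
    };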

    TlsEndpointResponse, TlsEndpointResponseArgs

    Credentials Pulumi.AzureNative.VideoAnalyzer.Inputs.UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    TrustedCertificates Pulumi.AzureNative.VideoAnalyzer.Inputs.PemCertificateListResponse
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    Tunnel Pulumi.AzureNative.VideoAnalyzer.Inputs.SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    ValidationOptions Pulumi.AzureNative.VideoAnalyzer.Inputs.TlsValidationOptionsResponse
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    Credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    TrustedCertificates PemCertificateListResponse
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    Tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    ValidationOptions TlsValidationOptionsResponse
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates PemCertificateListResponse
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions TlsValidationOptionsResponse
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url string
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates PemCertificateListResponse
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions TlsValidationOptionsResponse
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url str
    The endpoint URL for Video Analyzer to connect to.
    trusted_certificates PemCertificateListResponse
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validation_options TlsValidationOptionsResponse
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.
    credentials Property Map
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    trustedCertificates Property Map
    List of trusted certificate authorities when authenticating a TLS connection. A null list designates that Azure Video Analyzer's list of trusted authorities should be used.
    tunnel Property Map
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    validationOptions Property Map
    Validation options to use when authenticating a TLS connection. By default, strict validation is used.

    TlsValidationOptions, TlsValidationOptionsArgs

    IgnoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    IgnoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    IgnoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    IgnoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname String
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature String
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignore_hostname str
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignore_signature str
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname String
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature String
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.

    TlsValidationOptionsResponse, TlsValidationOptionsResponseArgs

    IgnoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    IgnoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    IgnoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    IgnoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname String
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature String
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname string
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature string
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignore_hostname str
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignore_signature str
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.
    ignoreHostname String
    When set to 'true', the certificate subject name validation is skipped. Default is 'false'.
    ignoreSignature String
    When set to 'true', the certificate chain trust validation is skipped. Default is 'false'.

    UnsecuredEndpoint, UnsecuredEndpointArgs

    Credentials Pulumi.AzureNative.VideoAnalyzer.Inputs.UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    Tunnel Pulumi.AzureNative.VideoAnalyzer.Inputs.SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    Credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    Tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url string
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentials
    Credentials to be presented to the endpoint.
    url str
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnel
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials Property Map
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    tunnel Property Map
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.

    UnsecuredEndpointResponse, UnsecuredEndpointResponseArgs

    Credentials Pulumi.AzureNative.VideoAnalyzer.Inputs.UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    Tunnel Pulumi.AzureNative.VideoAnalyzer.Inputs.SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    Credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    Url string
    The endpoint URL for Video Analyzer to connect to.
    Tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url string
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials UsernamePasswordCredentialsResponse
    Credentials to be presented to the endpoint.
    url str
    The endpoint URL for Video Analyzer to connect to.
    tunnel SecureIotDeviceRemoteTunnelResponse
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.
    credentials Property Map
    Credentials to be presented to the endpoint.
    url String
    The endpoint URL for Video Analyzer to connect to.
    tunnel Property Map
    Describes the tunnel through which Video Analyzer can connect to the endpoint URL. This is an optional property, typically used when the endpoint is behind a firewall.

    UsernamePasswordCredentials, UsernamePasswordCredentialsArgs

    Password string
    Password to be presented as part of the credentials. It is recommended that this value is parameterized as a secret string in order to prevent this value to be returned as part of the resource on API requests.
    Username string
    Username to be presented as part of the credentials.
    Password string
    Password to be presented as part of the credentials. It is recommended that this value is parameterized as a secret string in order to prevent this value to be returned as part of the resource on API requests.
    Username string
    Username to be presented as part of the credentials.
    password String
    Password to be presented as part of the credentials. It is recommended that this value is parameterized as a secret string in order to prevent this value to be returned as part of the resource on API requests.
    username String
    Username to be presented as part of the credentials.
    password string
    Password to be presented as part of the credentials. It is recommended that this value is parameterized as a secret string in order to prevent this value to be returned as part of the resource on API requests.
    username string
    Username to be presented as part of the credentials.
    password str
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username str
    Username to be presented as part of the credentials.
    password String
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username String
    Username to be presented as part of the credentials.
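
    Because the password should be supplied as a SecretString parameter rather than inlined, a minimal TypeScript sketch (parameter names are illustrative; the "type" discriminators are assumed to follow the #Microsoft.VideoAnalyzer.&lt;TypeName&gt; pattern) might look like:

    // A rough sketch with illustrative parameter names: the endpoint node carries the
    // credentials, and the password refers to a SecretString topology parameter via
    // the ${...} substitution syntax instead of an inline value.
    const rtspEndpoint = {
        type: "#Microsoft.VideoAnalyzer.UnsecuredEndpoint",   // endpoint discriminator (assumed)
        url: "${rtspUrl}",                                     // assumed String parameter
        credentials: {
            type: "#Microsoft.VideoAnalyzer.UsernamePasswordCredentials",
            username: "${rtspUsername}",                       // assumed String parameter
            password: "${rtspPassword}",                       // assumed SecretString parameter
        },
    };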

    UsernamePasswordCredentialsResponse, UsernamePasswordCredentialsResponseArgs

    Password string
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    Username string
    Username to be presented as part of the credentials.
    Password string
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    Username string
    Username to be presented as part of the credentials.
    password String
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username String
    Username to be presented as part of the credentials.
    password string
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username string
    Username to be presented as part of the credentials.
    password str
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username str
    Username to be presented as part of the credentials.
    password String
    Password to be presented as part of the credentials. It is recommended that this value be parameterized as a secret string in order to prevent it from being returned as part of the resource on API requests.
    username String
    Username to be presented as part of the credentials.

    VideoCreationProperties, VideoCreationPropertiesArgs

    Description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    RetentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    SegmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    Title string
    Optional title provided by the user. Value can be up to 256 characters long.
    Description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    RetentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    SegmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    Title string
    Optional title provided by the user. Value can be up to 256 characters long.
    description String
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod String
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength String
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title String
    Optional title provided by the user. Value can be up to 256 characters long.
    description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title string
    Optional title provided by the user. Value can be up to 256 characters long.
    description str
    Optional description provided by the user. Value can be up to 2048 characters long.
    retention_period str
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segment_length str
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title str
    Optional title provided by the user. Value can be up to 256 characters long.
    description String
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod String
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength String
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title String
    Optional title provided by the user. Value can be up to 256 characters long.
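
    A minimal TypeScript sketch of these creation properties (values are illustrative):

    // A rough sketch with illustrative values: optional properties applied when the
    // video sink has to create the video resource. Both durations are ISO8601 strings.
    const videoCreationProperties = {
        title: "Lobby camera",
        description: "Continuous recording of the lobby entrance",
        segmentLength: "PT30S",       // 30-second segments (the documented default)
        retentionPeriod: "P30D",      // keep archived content for 30 days
    };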

    VideoCreationPropertiesResponse, VideoCreationPropertiesResponseArgs

    Description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    RetentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    SegmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    Title string
    Optional title provided by the user. Value can be up to 256 characters long.
    Description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    RetentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    SegmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    Title string
    Optional title provided by the user. Value can be up to 256 characters long.
    description String
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod String
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength String
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title String
    Optional title provided by the user. Value can be up to 256 characters long.
    description string
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod string
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength string
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title string
    Optional title provided by the user. Value can be up to 256 characters long.
    description str
    Optional description provided by the user. Value can be up to 2048 characters long.
    retention_period str
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segment_length str
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title str
    Optional title provided by the user. Value can be up to 256 characters long.
    description String
    Optional description provided by the user. Value can be up to 2048 characters long.
    retentionPeriod String
    Video retention period indicates how long the video is kept in storage. Value must be specified in ISO8601 duration format (e.g. "P1D" equals 1 day) and can range from 1 day to 10 years, in 1-day increments. When absent (null), all video content is retained indefinitely. This property is only allowed for topologies where "kind" is set to "live".
    segmentLength String
    Segment length indicates the length of individual content files (segments) which are persisted to storage. Smaller segments provide lower archive playback latency but generate a larger volume of storage transactions. Larger segments reduce the number of storage transactions while increasing the archive playback latency. Value must be specified in ISO8601 duration format (e.g. "PT30S" equals 30 seconds) and can range from 30 seconds to 5 minutes, in 30-second increments. Changing this value after the initial call to create the video resource can lead to errors when uploading content to the archive. Default value is 30 seconds. This property is only allowed for topologies where "kind" is set to "live".
    title String
    Optional title provided by the user. Value can be up to 256 characters long.

    VideoEncoderH264, VideoEncoderH264Args

    BitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    FrameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    Scale Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoScale
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    BitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    FrameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    Scale VideoScale
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps String
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate String
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScale
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScale
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrate_kbps str
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frame_rate str
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScale
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps String
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate String
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale Property Map
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
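
    A minimal TypeScript sketch of an H.264 encoder configuration (values are illustrative; the "type" discriminator is assumed):

    // A rough sketch with illustrative values: an H.264 encoder configuration that
    // caps the bitrate, lowers the frame rate, and scales the output. Because only
    // the height is given, 'PreserveAspectRatio' lets the width be derived.
    const videoEncoder = {
        type: "#Microsoft.VideoAnalyzer.VideoEncoderH264",     // encoder discriminator (assumed)
        bitrateKbps: "2000",
        frameRate: "15",
        scale: {
            mode: "PreserveAspectRatio",
            height: "720",
        },
    };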

    VideoEncoderH264Response, VideoEncoderH264ResponseArgs

    BitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    FrameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    Scale Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoScaleResponse
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    BitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    FrameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    Scale VideoScaleResponse
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps String
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate String
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScaleResponse
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps string
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate string
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScaleResponse
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrate_kbps str
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frame_rate str
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale VideoScaleResponse
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.
    bitrateKbps String
    The maximum bitrate, in kilobits per second (Kbps), at which video should be encoded. If omitted, the encoder sets it automatically to try to match the quality of the input video.
    frameRate String
    The frame rate (in frames per second) of the encoded video. The value must be greater than zero, and less than or equal to 300. If omitted, the encoder uses the average frame rate of the input video.
    scale Property Map
    Describes the resolution of the encoded video. If omitted, the encoder uses the resolution of the input video.

    VideoPublishingOptions, VideoPublishingOptionsArgs

    DisableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    DisableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    DisableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    DisableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive String
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing String
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disable_archive str
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disable_rtsp_publishing str
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive String
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing String
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
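
    A minimal TypeScript sketch of an archive-only publishing configuration (values follow the descriptions above):

    // A rough sketch: an archive-only configuration. Both options are strings
    // ('true'/'false') rather than booleans, so they can also be supplied through
    // topology parameters.
    const videoPublishingOptions = {
        disableArchive: "false",         // keep recording content to the archive
        disableRtspPublishing: "true",   // do not publish a low-latency RTSP playback URL
    };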

    VideoPublishingOptionsResponse, VideoPublishingOptionsResponseArgs

    DisableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    DisableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    DisableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    DisableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive String
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing String
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive string
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing string
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disable_archive str
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disable_rtsp_publishing str
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.
    disableArchive String
    When set to 'true', content will not be archived or recorded. This is used, for example, when the topology is used only for low-latency video streaming. Default is 'false'. If set to 'true', then "disableRtspPublishing" must be set to 'false'.
    disableRtspPublishing String
    When set to 'true', the RTSP playback URL will not be published, disabling low-latency streaming. This is used, for example, when the topology is used only for archiving content. Default is 'false'. If set to 'true', then "disableArchive" must be set to 'false'.

    VideoScale, VideoScaleArgs

    Height string
    The desired output video height.
    Mode string | Pulumi.AzureNative.VideoAnalyzer.VideoScaleMode
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    Width string
    The desired output video width.
    Height string
    The desired output video height.
    Mode string | VideoScaleMode
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    Width string
    The desired output video width.
    height String
    The desired output video height.
    mode String | VideoScaleMode
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width String
    The desired output video width.
    height string
    The desired output video height.
    mode string | VideoScaleMode
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width string
    The desired output video width.
    height str
    The desired output video height.
    mode str | VideoScaleMode
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width str
    The desired output video width.
    height String
    The desired output video height.
    mode String | "Pad" | "PreserveAspectRatio" | "Stretch"
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width String
    The desired output video width.

    VideoScaleMode, VideoScaleModeArgs

    Pad
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    PreserveAspectRatio
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    Stretch
    Stretches the original video so it is resized to the specified dimensions.
    VideoScaleModePad
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    VideoScaleModePreserveAspectRatio
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    VideoScaleModeStretch
    Stretches the original video so it is resized to the specified dimensions.
    Pad
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    PreserveAspectRatio
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    Stretch
    Stretches the original video so it is resized to the specified dimensions.
    Pad
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    PreserveAspectRatio
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    Stretch
    Stretches the original video so it is resized to the specified dimensions.
    PAD
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    PRESERVE_ASPECT_RATIO
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    STRETCH
    Stretches the original video so it is resized to the specified dimensions.
    "Pad"
    Pads the video with black horizontal stripes (letterbox) or black vertical stripes (pillar-box) so the video is resized to the specified dimensions while not altering the content aspect ratio.
    "PreserveAspectRatio"
    Preserves the same aspect ratio as the input video. If only one video dimension is provided, the second dimension is calculated based on the input video aspect ratio. When two dimensions are provided, the video is resized to fit the most constraining dimension, considering the input video size and aspect ratio.
    "Stretch"
    Stretches the original video so it is resized to the specified dimensions.
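
    A minimal TypeScript sketch contrasting the modes (values are illustrative):

    // A rough sketch: 'Pad' and 'Stretch' require both dimensions, while
    // 'PreserveAspectRatio' derives the missing dimension from the input video.
    const letterboxed = { mode: "Pad", width: "1280", height: "720" };
    const keepAspect = { mode: "PreserveAspectRatio", height: "720" };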

    VideoScaleResponse, VideoScaleResponseArgs

    Height string
    The desired output video height.
    Mode string
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    Width string
    The desired output video width.
    Height string
    The desired output video height.
    Mode string
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    Width string
    The desired output video width.
    height String
    The desired output video height.
    mode String
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width String
    The desired output video width.
    height string
    The desired output video height.
    mode string
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width string
    The desired output video width.
    height str
    The desired output video height.
    mode str
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width str
    The desired output video width.
    height String
    The desired output video height.
    mode String
    Describes the video scaling mode to be applied. Default mode is 'Pad'. If the mode is 'Pad' or 'Stretch' then both width and height must be specified. Else if the mode is 'PreserveAspectRatio' then only one of width or height need be provided.
    width String
    The desired output video width.

    VideoSequenceAbsoluteTimeMarkers, VideoSequenceAbsoluteTimeMarkersArgs

    Ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    Ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges String
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges str
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges String
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
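
    A minimal TypeScript sketch of a time-marker object (values are illustrative; the "type" discriminator is assumed to follow the usual #Microsoft.VideoAnalyzer.&lt;TypeName&gt; pattern):

    // A rough sketch: 'ranges' is a JSON-encoded string of [start, end] UTC pairs,
    // so it can be assembled from structured data with JSON.stringify.
    const timeSequences = {
        type: "#Microsoft.VideoAnalyzer.VideoSequenceAbsoluteTimeMarkers",
        ranges: JSON.stringify([
            ["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"],
            ["2021-10-05T04:30:00Z", "2021-10-05T04:35:00Z"],
        ]),
    };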

    VideoSequenceAbsoluteTimeMarkersResponse, VideoSequenceAbsoluteTimeMarkersResponseArgs

    Ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    Ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges String
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges string
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges str
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.
    ranges String
    The sequence of datetime ranges. Example: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]'.

    VideoSink, VideoSinkArgs

    Inputs List<Pulumi.AzureNative.VideoAnalyzer.Inputs.NodeInput>
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    VideoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    VideoCreationProperties Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoCreationProperties
    Optional video properties to be used in case a new video resource needs to be created on the service.
    VideoPublishingOptions Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoPublishingOptions
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    Inputs []NodeInput
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    VideoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    VideoCreationProperties VideoCreationProperties
    Optional video properties to be used in case a new video resource needs to be created on the service.
    VideoPublishingOptions VideoPublishingOptions
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs List<NodeInput>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    videoName String
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties VideoCreationProperties
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions VideoPublishingOptions
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs NodeInput[]
    An array of upstream node references within the topology to be used as inputs for this node.
    name string
    Node name. Must be unique within the topology.
    videoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties VideoCreationProperties
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions VideoPublishingOptions
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs Sequence[NodeInput]
    An array of upstream node references within the topology to be used as inputs for this node.
    name str
    Node name. Must be unique within the topology.
    video_name str
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    video_creation_properties VideoCreationProperties
    Optional video properties to be used in case a new video resource needs to be created on the service.
    video_publishing_options VideoPublishingOptions
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs List<Property Map>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    videoName String
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties Property Map
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions Property Map
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
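
    A minimal TypeScript sketch of a video sink node (node and resource names are illustrative; the "type" discriminator is assumed):

    // A rough sketch with illustrative names: a video sink node that records the
    // output of an upstream node into a video resource named "camera001".
    const videoSinkNode = {
        type: "#Microsoft.VideoAnalyzer.VideoSink",
        name: "archiveSink",
        inputs: [{ nodeName: "rtspCamera" }],        // upstream node reference
        videoName: "camera001",
        videoCreationProperties: { title: "Camera 001", segmentLength: "PT30S" },
        videoPublishingOptions: { disableArchive: "false", disableRtspPublishing: "false" },
    };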

    VideoSinkResponse, VideoSinkResponseArgs

    Inputs List<Pulumi.AzureNative.VideoAnalyzer.Inputs.NodeInputResponse>
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    VideoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    VideoCreationProperties Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoCreationPropertiesResponse
    Optional video properties to be used in case a new video resource needs to be created on the service.
    VideoPublishingOptions Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoPublishingOptionsResponse
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    Inputs []NodeInputResponse
    An array of upstream node references within the topology to be used as inputs for this node.
    Name string
    Node name. Must be unique within the topology.
    VideoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    VideoCreationProperties VideoCreationPropertiesResponse
    Optional video properties to be used in case a new video resource needs to be created on the service.
    VideoPublishingOptions VideoPublishingOptionsResponse
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs List<NodeInputResponse>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    videoName String
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties VideoCreationPropertiesResponse
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions VideoPublishingOptionsResponse
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs NodeInputResponse[]
    An array of upstream node references within the topology to be used as inputs for this node.
    name string
    Node name. Must be unique within the topology.
    videoName string
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties VideoCreationPropertiesResponse
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions VideoPublishingOptionsResponse
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs Sequence[NodeInputResponse]
    An array of upstream node references within the topology to be used as inputs for this node.
    name str
    Node name. Must be unique within the topology.
    video_name str
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    video_creation_properties VideoCreationPropertiesResponse
    Optional video properties to be used in case a new video resource needs to be created on the service.
    video_publishing_options VideoPublishingOptionsResponse
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".
    inputs List<Property Map>
    An array of upstream node references within the topology to be used as inputs for this node.
    name String
    Node name. Must be unique within the topology.
    videoName String
    Name of a new or existing video resource used to capture and publish content. Note: if downstream of RTSP source, and if disableArchive is set to true, then no content is archived.
    videoCreationProperties Property Map
    Optional video properties to be used in case a new video resource needs to be created on the service.
    videoPublishingOptions Property Map
    Options to change how the video sink publishes content via the video resource. This property is only allowed for topologies where "kind" is set to "live".

    VideoSource, VideoSourceArgs

    Name string
    Node name. Must be unique within the topology.
    TimeSequences Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoSequenceAbsoluteTimeMarkers
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    VideoName string
    Name of the Video Analyzer video resource to be used as the source.
    Name string
    Node name. Must be unique within the topology.
    TimeSequences VideoSequenceAbsoluteTimeMarkers
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    VideoName string
    Name of the Video Analyzer video resource to be used as the source.
    name String
    Node name. Must be unique within the topology.
    timeSequences VideoSequenceAbsoluteTimeMarkers
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName String
    Name of the Video Analyzer video resource to be used as the source.
    name string
    Node name. Must be unique within the topology.
    timeSequences VideoSequenceAbsoluteTimeMarkers
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName string
    Name of the Video Analyzer video resource to be used as the source.
    name str
    Node name. Must be unique within the topology.
    time_sequences VideoSequenceAbsoluteTimeMarkers
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    video_name str
    Name of the Video Analyzer video resource to be used as the source.
    name String
    Node name. Must be unique within the topology.
    timeSequences Property Map
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName String
    Name of the Video Analyzer video resource to be used as the source.
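
    A minimal TypeScript sketch of a video source node (names are illustrative; the "type" discriminators are assumed):

    // A rough sketch with illustrative names: a video source node that reads
    // previously recorded content from an existing video resource, limited to a
    // specific time range.
    const videoSourceNode = {
        type: "#Microsoft.VideoAnalyzer.VideoSource",
        name: "archivedVideo",
        videoName: "camera001",
        timeSequences: {
            type: "#Microsoft.VideoAnalyzer.VideoSequenceAbsoluteTimeMarkers",
            ranges: '[["2021-10-05T03:30:00Z", "2021-10-05T03:40:00Z"]]',
        },
    };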

    VideoSourceResponse, VideoSourceResponseArgs

    Name string
    Node name. Must be unique within the topology.
    TimeSequences Pulumi.AzureNative.VideoAnalyzer.Inputs.VideoSequenceAbsoluteTimeMarkersResponse
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    VideoName string
    Name of the Video Analyzer video resource to be used as the source.
    Name string
    Node name. Must be unique within the topology.
    TimeSequences VideoSequenceAbsoluteTimeMarkersResponse
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    VideoName string
    Name of the Video Analyzer video resource to be used as the source.
    name String
    Node name. Must be unique within the topology.
    timeSequences VideoSequenceAbsoluteTimeMarkersResponse
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName String
    Name of the Video Analyzer video resource to be used as the source.
    name string
    Node name. Must be unique within the topology.
    timeSequences VideoSequenceAbsoluteTimeMarkersResponse
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName string
    Name of the Video Analyzer video resource to be used as the source.
    name str
    Node name. Must be unique within the topology.
    time_sequences VideoSequenceAbsoluteTimeMarkersResponse
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    video_name str
    Name of the Video Analyzer video resource to be used as the source.
    name String
    Node name. Must be unique within the topology.
    timeSequences Property Map
    Describes a sequence of datetime ranges. The video source only picks up recorded media within these ranges.
    videoName String
    Name of the Video Analyzer video resource to be used as the source.

    Import

    An existing resource can be imported using its type token, name, and identifier, e.g.

    $ pulumi import azure-native:videoanalyzer:PipelineTopology pipelineTopology1 /subscriptions/591e76c3-3e97-44db-879c-3e2b12961b62/resourceGroups/testrg/providers/Microsoft.Media/videoAnalyzers/testaccount2/pipelineTopologies/pipelineTopology1 
    

    To learn more about importing existing cloud resources, see Importing resources.

    Package Details

    Repository
    azure-native-v1 pulumi/pulumi-azure-native
    License
    Apache-2.0