+ Google Cloud Tools for PowerShell is a set of cmdlets for Windows
+ PowerShell that lets you manage Google Cloud Platform resources.
+
+
+
+
+
+
+
+ Google Compute Engine lets you create and run virtual machines on Google
+ infrastructure. Compute Engine offers the scale, performance, and value
+ that let you easily launch large compute clusters on Google's
+ infrastructure. There are no upfront investments, and you can run
+ thousands of virtual CPUs on a system designed to be fast and to offer
+ consistent performance.
+
+
+
Instances
+
+ Google Compute Engine VMs are referred to as instances. To create
+ an instance, you must first create an instance configuration.
+ This requires, at a minimum, a name, a machine type, and a boot disk
+ image or a preexisting boot disk.
+
+
+ Once you have your configuration object, you can pipe it to the
+ Add-GceInstance cmdlet to create the instance in a
+ particular project and zone. If your active gcloud configuration has a
+ project and zone, those parameters are optional.
+
+
+# Define the project name you want to create the instance in. If not set, the
+# command will use the current default project specified by gcloud config.
+$project = "<your-project-name>"
+
+# Define the configuration for an instance called "webserver-1"
+$config = New-GceInstanceConfig "webserver-1" -MachineType "n1-standard-4" `
+ -DiskImage (Get-GceImage -Family "windows-2012-r2")
+
+# Attempt to create the instance based on the configuration
+$config | Add-GceInstance -Project $project -Zone "us-central1-b"
+
Management
+
+ In addition to basic cmdlets to start, stop, or restart an instance,
+ you can set tags, disks, access configs, or metadata after creating
+ your VM with the Set-GceInstance cmdlet.
+
+
+$instance = "<your-instance-name>"
+
+# Fetch information about the instance
+Get-GceInstance $instance
+
+# Stop, start, restart the instance
+Stop-GceInstance $instance
+Start-GceInstance $instance
+Restart-GceInstance $instance
+
+# Add a new network access configuration to the instance
+[Google.Apis.Compute.v1.Data.AccessConfig] $newConfig = @{}
+$newConfig.Type = "ONE_TO_ONE_NAT"
+$newConfig.Name = "New NAT"
+
+Set-GceInstance $instance `
+    -NetworkInterface "nic0" `
+    -RemoveAccessConfig "External NAT" `
+    -NewAccessConfig $newConfig
+
+# Edit the metadata and tags on the instance
+Set-GceInstance $instance -AddMetadata @{"newKey" = "newValue"}
+Set-GceInstance $instance -RemoveMetadata "newKey"
+Set-GceInstance $instance -RemoveTag "beta" -AddTag "alpha"
+
+
+
+
+
+ Google Container Engine is a powerful cluster manager and orchestration
+ system for running your Docker containers. Container Engine schedules
+ your containers into the cluster and manages them automatically based
+ on the requirements you define (such as CPU and memory). It's built
+ on the open source Kubernetes system, giving you the flexibility to take
+ advantage of on-premises, hybrid, or public cloud infrastructure.
+
+
+
Container Clusters
+
+ You can create a cluster by first creating a NodeConfig
+ object with the New-GkeNodeConfig cmdlet.
+ After that, you can pass the NodeConfig object to the
+ Add-GkeCluster cmdlet. It will then create
+ a cluster whose node pools have their configurations set from the
+ NodeConfig object.
+
# Creates a Container Engine Node Config with image type CONTAINER_VM
+# and 20 GB disk size for each node.
+$nodeConfig = New-GkeNodeConfig -DiskSizeGb 20 `
+ -ImageType CONTAINER_VM
+
+# Creates a cluster named "my-cluster" in the default zone of the
+# default project using config $nodeConfig and network "my-network".
+Add-GkeCluster -NodeConfig $nodeConfig `
+ -ClusterName "my-cluster" `
+ -Network "my-network"
+
+ Instead of passing in the NodeConfig object, you can also
+ use the parameters provided by the Add-GkeCluster
+ cmdlet to create a cluster (a NodeConfig object will
+ be created internally by the cmdlet).
+
+
# Creates a cluster named "my-cluster" with description "my new cluster"
+# in the default zone of the default project using machine type
+# "n1-standard-4" for each Google Compute Engine instance in the cluster.
+# The cluster will use the subnetwork "my-subnetwork".
+# The cluster's nodes will have autoupgrade enabled.
+# The cluster will also autoscale its node pool to a maximum of 2 nodes.
+Add-GkeCluster -MachineType "n1-standard-4" `
+ -ClusterName "my-cluster" `
+ -Description "My new cluster" `
+ -Subnetwork "my-subnetwork" `
+ -EnableAutoUpgrade `
+ -MaximumNodesToScaleTo 2
+
+ You can update a cluster with the Set-GkeCluster
+ cmdlet. Only one property of the cluster can be updated at a time.
+
+
# Sets additional zones of cluster "my-cluster" in zone "asia-east1-a"
+# to zones "asia-east1-b" and "asia-east1-c". This means the cluster will
+# have nodes created in these zones. The primary zone
+# ("asia-east1-a" in this case) will be added to the
+# AdditionalZone array by the cmdlet.
+Set-GkeCluster -ClusterName "my-cluster" `
+ -Zone "asia-east1-a" `
+ -AdditionalZone "asia-east1-b", "asia-east1-c"
+
+ You can list available clusters with the Get-GkeCluster
+ cmdlet.
+
+
# Lists all container clusters in the default project.
+Get-GkeCluster
+
+# List all container clusters in zone "us-central1-a"
+# of the default project.
+Get-GkeCluster -Zone "us-central1-a"
+
+ You can remove a cluster with the Remove-GkeCluster
+ cmdlet.
+
+
# Removes the cluster "my-cluster" in zone "us-west1-b"
+# of the default project.
+Remove-GkeCluster -ClusterName "my-cluster" `
+ -Zone "us-west1-b"
+
+
Node Pools
+
+ A node pool is a subset of machines within a cluster that all have the
+ same configuration. While all nodes within a node pool are identical,
+ node pools let you create groups of machines within your cluster that
+ have different configurations. For example, you might create a pool of nodes
+ in your cluster that have local SSDs or larger instance sizes. Because
+ of this, node pools are useful for customizing the instance profile
+ in your cluster.
+
+
+ To add a node pool to your cluster, first create a
+ NodePool object with the New-GkeNodePool
+ cmdlet. You can then call the Add-GkeNodePool cmdlet
+ to add the NodePool object to a cluster.
+
+
# Creates a node pool named "my-nodepool" with image type
+# CONTAINER_VM for each node.
+$nodePool = New-GkeNodePool -NodePoolName "my-nodepool" `
+ -ImageType CONTAINER_VM
+
+# Adds the pool to cluster "my-cluster".
+Add-GkeNodePool -NodePool $nodePool -Cluster "my-cluster"
+
+ You can list all the node pools in a cluster with the Get-GkeNodePool
+ cmdlet.
+
+
# Lists all node pools in cluster "my-cluster" in the default project.
+Get-GkeNodePool -ClusterName "my-cluster"
+
+ You can remove a node pool from a cluster with the Remove-GkeCluster
+ cmdlet by specifying the -NodePoolName parameter.
+
+
# Removes the node pool "my-nodepool" in cluster "my-cluster"
+# in the zone "us-west1-b" of the default project.
+Remove-GkeCluster -ClusterName "my-cluster" `
+ -Zone "us-west1-b" `
+ -NodePoolName "my-nodepool"
+
+
+
+
+
+ Google Cloud Storage allows world-wide storage and retrieval of any
+ amount of data at any time. You can use Google Cloud Storage for a range
+ of scenarios including serving website content, storing data for archival
+ and disaster recovery, or distributing large data objects to users via
+ direct download.
+
+
Buckets
+
+ Google Cloud Storage data is grouped into "buckets".
+
+
+# List all buckets associated with a project
+$project = "<your-project-name>"
+Get-GcsBucket -Project $project
+
+# Create a new bucket in the project
+New-GcsBucket -Project $project -Name "<your-bucket-name>"
+
Objects
+
+ Each bucket contains "objects", which contain arbitrary data.
+
+
+$bucket = "<your-bucket-name>"
+
+# List all objects in a GCS bucket.
+Get-GcsObject -Bucket $bucket
+
+# Upload a file to the bucket into a "test" folder,
+# renaming it in passing from "test-file.png" to "test.png".
+# NOTE: This will fail unless you have permission to write to the bucket.
+Write-GcsObject -Bucket $bucket -File "test-file.png" -ObjectName "test/test.png"
+
+# Download a GCS object to disk.
+Read-GcsObject $bucket "object-name" -OutFile "output-file.png"
+
Cloud Storage PowerShell Provider
+
+ Cloud Tools for PowerShell includes a PowerShell provider for Google Cloud Storage.
+ This provider allows you to use commands like cd, dir, copy, and del to navigate
+ and manipulate your data in Cloud Storage as if the data were on a local file system.
+
+ To directly use the provider, you can start Cloud Tools for PowerShell using the shortcut
+ from the start menu. This will launch a PowerShell console with the provider loaded:
+
+
+# Navigate to Google Cloud Storage
+cd gs:\
+
+# Show the available buckets
+dir
+
+# Create a new bucket
+mkdir my-new-bucket
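Because the provider exposes Cloud Storage as a drive, the file commands mentioned above (copy, del) also work on objects. A minimal sketch, assuming the "my-new-bucket" bucket from the previous example exists and you have write access; the local path is hypothetical:

```powershell
# Navigate into the bucket created above.
cd gs:\my-new-bucket

# Copy a local file into the bucket as an object.
copy C:\temp\report.txt gs:\my-new-bucket\report.txt

# List the bucket's objects, then delete the one just created.
dir
del gs:\my-new-bucket\report.txt
```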
+
+ You can also make the provider available in any PowerShell session by importing
+ the Cloud Tools for PowerShell module via Import-Module GoogleCloud.
+
+
+
+
+
+
+ Google BigQuery is a versatile tool that solves the problem of storing and querying massive
+ datasets without having to worry about data formats, underlying resources, and other things
+ that distract you from your analysis.
+
+
+
Dataset
+
+ To use BigQuery in a Cloud project, first create a Dataset using the
+ New-BqDataset cmdlet. This will take in basic information and
+ create the resource server-side. Locally, a Dataset reference object is returned.
+ To get a reference object for an existing dataset, use Get-BqDataset.
+
+
+# Makes a new dataset with DatasetId "page_views".
+$dataset = New-BqDataset "page_views" -Name "Page Views" `
+ -Description "Page views from 2014 onwards"
+
+# Two ways to get a Dataset: by DatasetId and by Dataset object.
+$dataset = Get-BqDataset "page_views"
+$dataset = $dataset | Get-BqDataset
+
+
+ This object, $dataset, can be modified and passed into further cmdlets such as
+ Set-BqDataset to manipulate cloud resources. This cmdlet also
+ handles adding and removing labels with -SetLabel and -ClearLabel.
+
+
+ Labels are used to tag datasets with keywords and/or values so they can be filtered and searched later.
+ The Get-BqDataset cmdlet has a built-in -Filter flag that allows fine-grained
+ control when listing datasets for processing with other cmdlets.
+
+
+# Updates the Name field of $dataset.
+$dataset.Name = "PageView Data"
+$dataset = $dataset | Set-BqDataset
+
+# Adds the labels 'department' and 'purpose' to $dataset.
+$dataset = Set-BqDataset "page_views" -SetLabel `
+ @{"department" = "statistics"; "purpose" = "metrics"}
+
+# Filter Datasets by the department label.
+$stats = Get-BqDataset -Filter "department:statistics"
+
+
+ Datasets can be deleted by the Remove-BqDataset cmdlet. This
+ cmdlet supports ShouldProcess (the -WhatIf parameter) and will prompt for user
+ confirmation before deleting a non-empty Dataset. This safeguard can be bypassed with the
+ -Force parameter when scripting.
+
+
+# Deletes $dataset.
+$dataset | Remove-BqDataset
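The ShouldProcess safeguards described above can also be exercised explicitly. A short sketch, reusing the "page_views" dataset ID from the earlier examples:

```powershell
# Preview the deletion without actually performing it.
Get-BqDataset "page_views" | Remove-BqDataset -WhatIf

# Delete a non-empty dataset without a confirmation prompt (for scripting).
Get-BqDataset "page_views" | Remove-BqDataset -Force
```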
+
+
+
Table
+
+ Each Dataset has a number of Tables to hold data. Tables are
+ created with the New-BqTable cmdlet by passing in a TableId
+ and the Dataset where the table will reside. The Dataset can be passed in by object
+ or with the -DatasetId parameter. Get-BqTable and Set-BqTable
+ work the same way as the Get- and
+ Set- dataset cmdlets above.
+
+
+# Creates a new table in the dataset from above.
+$table = $dataset | New-BqTable "logs2014" `
+ -Description "Log data from Jan 2014 to Dec 2014 inclusive"
+
+# Gets a reference object for "page_views:logs2014".
+$table = Get-BqTable "logs2014" -DatasetId "page_views"
+
+# Modifies the Name attribute of logs2014.
+$table.Name = "Logs 2014"
+$table = $table | Set-BqTable
+
+
+ Tables can be deleted by the Remove-BqTable cmdlet. This
+ cmdlet supports ShouldProcess (the -WhatIf parameter) and will prompt for user
+ confirmation before deleting a Table that contains data. This safeguard can be bypassed with
+ the -Force parameter.
+
+
+# Deletes $table.
+$table | Remove-BqTable -Force
+
+
+
Schema
+
+ Tables need Schemas to describe the format of the data they contain. Schemas are created
+ with the New-BqSchema and Set-BqSchema
+ cmdlets. New-BqSchema can take the formats for rows as parameters directly or as a JSON array
+ of row descriptions. The results of New-BqSchema are always passed into Set-BqSchema,
+ which can either output a Schema object or assign the schema to an existing Table.
+
+
+# Assigns a Schema to $table
+$table = Get-BqTable "logs2014" -DatasetId "page_views"
+New-BqSchema "Page" "STRING" | New-BqSchema "Referrer" "STRING" |
+ New-BqSchema "Timestamp" "DATETIME" | Set-BqSchema $table
+
+# Creates a schema object to be used in multiple tables.
+$schema = New-BqSchema "Page" "STRING" | New-BqSchema "Referrer" "STRING" |
+ New-BqSchema "Timestamp" "DATETIME" | Set-BqSchema
+
+
+ Schema objects can be passed as parameters in Table creation if they are created
+ ahead of time.
+
+
+# Creates a new table with the Schema object from above.
+$table = $dataset | New-BqTable "logs2014" -Schema $schema
+
+
+
TableRow
+
+ Data is added to and removed from Tables in rows. These rows are accessible using the
+ Add-BqTableRow and Get-BqTableRow
+ cmdlets. Add-BqTableRow takes CSV, JSON, and AVRO files to import into BigQuery.
+
+
+# Ingests a CSV file and appends its rows onto the table 'page_views:logs2014'.
+$filename = "<path-to-your-csv-file>"
+$table = New-BqTable "logs2014" -DatasetId "page_views"
+$table | Add-BqTableRow CSV $filename -SkipLeadingRows 1 `
+                        -WriteMode WriteAppend
+
+# Returns a list of the rows in 'page_views:logs2014'.
+$list = Get-BqTable "logs2014" -DatasetId "page_views" | Get-BqTableRow
+
+
+
Jobs
+
+ There are four types of Jobs: Query, Load, Extract, and Copy. Query jobs run SQL-style
+ queries and output results to tables. Load jobs import Google Cloud Storage files into BigQuery.
+ Extract jobs export BigQuery tables to GCS. Copy jobs copy an existing table to another new or
+ existing table. Start-BqJob starts any of these kinds of jobs as an asynchronous
+ operation. Use the -PollUntilComplete flag to have the cmdlet block until the job is done.
+ Receive-BqJob will return the results of a query job once it is
+ finished. Get-BqJob will return a reference object detailing the
+ current state and statistics of the job. Stop-BqJob will send a
+ request to the server to stop a certain job, and then return immediately.
+
+ A note on formatting table names within query strings: the BigQuery format specifies that table names
+ should be surrounded by backticks (`), but the backtick is also PowerShell's escape character.
+ Because of this, each backtick must be escaped by adding a second backtick. See the example below.
+
+
+
+# Query Job: starts a query and outputs results into $table.
+Start-BqJob -Query "SELECT * FROM ``page_views.logs2014``" `
+ -Destination $table
+
+
+# Load Job: adds TableRows to $dest_table from the GCS file specified.
+$job = $dest_table | Start-BqJob `
+ -Load CSV "gs://page_views/server_logs_raw_2014.csv"
+
+
+# Extract Job: exports $src_table to a GCS file.
+$job = $src_table | Start-BqJob `
+ -Extract CSV "gs://page_views/logs2014.csv"
+
+
+# Copy Job: starts a copy job, cancels it, and polls until the job is completely done.
+$job = $table | Start-BqJob -Copy $dest_table
+$result = $job | Stop-BqJob
+while ($result.Status.State -ne "DONE") {
+ $result = $result | Get-BqJob
+}
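For comparison with the manual polling loop above, the -PollUntilComplete flag mentioned earlier makes the cmdlet block until the job finishes. A sketch, reusing the earlier query and assuming the results are fetched with Receive-BqJob:

```powershell
# Run the query synchronously: Start-BqJob returns only when the job is done.
$job = Start-BqJob -Query "SELECT * FROM ``page_views.logs2014``" `
                   -PollUntilComplete

# Fetch the finished query job's result rows.
$rows = $job | Receive-BqJob
```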
+
+
+
+
+
+
+ Google Cloud DNS is a high-performance, resilient, global Domain Name
+ System (DNS) service that publishes your domain names to the global DNS
+ in a cost-effective way. You can use Google Cloud DNS to publish your
+ zones and records in the DNS without the burden of managing your own DNS
+ servers and software.
+
+
Managed Zones
+
+ In Cloud DNS, a managed zone models a DNS zone and holds DNS records for
+ the same DNS name suffix (e.g., dnsexample.com.). You can add a zone
+ to your Google Cloud Console project using the Add-GcdManagedZone
+ cmdlet. Each zone in your project must have a unique name and a unique
+ DNS name to specify its associated DNS name suffix.
+
+
+$project = "<your-project-name>"
+
+# Create a managed zone for the DNS suffix dnsexample.com.
+$zone = "<your-zone-name>"
+$dnsSuffix = "<dnsexample.com.>"
+Add-GcdManagedZone -Project $project -Name $zone -DnsName $dnsSuffix
+
+# List all the managed zones in your project.
+Get-GcdManagedZone -Project $project
+
Resource Record Sets
+
+ ResourceRecordSets in Cloud
+ DNS are DNS records that you can create using the New-GcdResourceRecordSet
+ cmdlet and retrieve from a managed zone using the Get-GcdResourceRecordSet
+ cmdlet.
+
+
+ However, to actually add or remove records from a managed zone, you must send
+ a change request to the zone using the Add-GcdChange
cmdlet.
+
+
+# Create a new A-type resource record for "dnsexample.com." and point it to
+# an IPv4 address.
+$ipv4 = "107.1.23.134"
+$ARecord = New-GcdResourceRecordSet -Name $dnsSuffix -Rrdata $ipv4 -Type "A"
+
+# Add the record to your zone.
+Add-GcdChange -Project $project -Zone $zone -Add $ARecord
+
+# Retrieve the newly added A-type record.
+$ARecord = Get-GcdResourceRecordSet -Project $project -Zone $zone -Filter "A"
+
+# Remove the retrieved record from your zone.
+Add-GcdChange -Project $project -Zone $zone -Remove $ARecord
+
+
+
+
+
+
+ Google Cloud SQL lets you set up, maintain, manage, and administer
+ your relational MySQL databases on Google's Cloud Platform.
+
+
Instances
+
+ Google Cloud SQL instances hold all of your MySQL databases
+ and their relevant data. To create an instance, you must first
+ create a Cloud SQL instance configuration. This requires, at a
+ minimum, a name for your instance and a settings configuration,
+ which itself requires no parameters.
+
+
+ After the configuration object has been made, the Add-GcSqlInstance
+ cmdlet can be called to create that instance in a particular project.
+ If your active gcloud configuration has a project, the -Project parameter is optional.
+
+
+$setting = New-GcSqlSettingConfig
+$instance = New-GcSqlInstanceConfig `
+ "my-instance-name" -SettingConfig $setting
+
+$instance | Add-GcSqlInstance -Project $myProjectName
+
Importing Data
+
+ MySQL dump files and CSV files on either your local machine
+ or in a Google Cloud Storage bucket can be imported into your instance's
+ databases with the Import-GcSqlInstance cmdlet.
+
+
+Import-GcSqlInstance "my-instance-name" "C:\Users\User\file.csv" `
+ "destination-database" "destination-table"
+
+
+
+
+
+ Google Cloud Pub/Sub is a fully-managed real-time messaging service that
+ allows you to send and receive messages between independent applications.
+
+
Publisher
+
+ The publisher application creates and sends messages to a topic.
+ The New-GcpsTopic cmdlet can be called to create a topic
+ in a particular project. If your active gcloud configuration has a project, you don't have to
+ use the -Project parameter.
+
+
+# Creates topic "my-topic" in the default project.
+New-GcpsTopic -Topic "my-topic"
+
+ After the topic has been created, you can publish messages to it using the
+ Publish-GcpsMessage cmdlet.
+
+
+# Publishes the message with data "This is a test" to topic "my-topic".
+Publish-GcpsMessage -Data "This is a test" -Topic "my-topic"
+
+ To publish multiple messages to the same topic with a single request, you can use the
+ New-GcpsMessage cmdlet to create an array of messages
+ and pass that to the Publish-GcpsMessage cmdlet.
+
+
+# Creates two messages.
+$messageOne = New-GcpsMessage -Data "This is a test"
+$messageTwo = New-GcpsMessage -Data "Data" -Attributes @{"key" = "value"}
+
+# Publish the messages to topic "my-topic".
+Publish-GcpsMessage -Message @($messageOne, $messageTwo) -Topic "my-topic"
+
Subscriber
+
+ The subscriber application creates a subscription to a topic to receive messages from it.
+ The New-GcpsSubscription cmdlet can be called to create a subscription
+ to a particular topic. If your active gcloud configuration has a project, you don't have to
+ use the -Project parameter.
+
+
+ By default, the subscription created is a pull subscription, which means the subscriber will
+ pull the messages from the topic. You can create a push subscription (Pub/Sub will push messages
+ to the subscriber's chosen endpoint) with the -PushEndpoint parameter.
+
+
+# Creates pull subscription "pull-subscription" to topic "my-topic" in the default project.
+New-GcpsSubscription -Topic "my-topic" -Subscription "pull-subscription"
+
+# Creates push subscription "push-subscription" to topic "my-topic".
+New-GcpsSubscription -Topic "my-topic" `
+ -Subscription "push-subscription" `
+ -PushEndpoint "http://www.example.com"
+
+ To pull messages from a subscription, the Get-GcpsMessage cmdlet can
+ be used. By default, the cmdlet will block until at least one message is retrieved. To prevent blocking,
+ the switch -ReturnImmediately can be used. The cmdlet can also automatically send
+ an acknowledgement for every retrieved message if the switch -AutoAck is used. If not,
+ you will have to use the Send-GcpsAck cmdlet to send the acknowledgement.
+ Unacknowledged messages will become available for pulling again after the acknowledgement deadline of the message expires.
+
+
+# Pulls messages from subscription "my-subscription" and sends out acknowledgement automatically.
+Get-GcpsMessage -Subscription "my-subscription" -AutoAck
+
+# Pulls messages from subscription "my-subscription" and sends out acknowledgement with Send-GcpsAck.
+$messages = Get-GcpsMessage -Subscription "my-subscription"
+Send-GcpsAck -InputObject $messages
+
+
+
+
+
+ Stackdriver Logging allows you to store, search, analyze, monitor and alert on log data
+ and events from Google Cloud Platform and Amazon Web Services.
+
+
Logs and Log Entries
+
+ A log is a named collection of log entries within the project. A log entry records a status or an event.
+ The entry might be created by GCP services, AWS services, third-party applications, or your own applications.
+ The "message" the log entry carries is called the payload, and it can be a simple string or structured data.
+ Each log entry indicates where it came from by including the name of a monitored resource.
+
+
+ The New-GcLogEntry cmdlet can be used to create a log entry.
+ You will have to specify the log that the entry belongs to (if the log does not exist, it will
+ be created). To associate the entry with a monitored resource, you can use the -MonitoredResource parameter.
+ By default, the log entry is associated with the "global" resource. To create a monitored resource,
+ use the New-GcLogMonitoredResource cmdlet.
+
+
+# Creates a log entry in the log "my-log".
+New-GcLogEntry -LogName "my-log" -TextPayload "This is a log."
+
+# Creates a log entry associated with a Cloud SQL monitored resource.
+$resource = New-GcLogMonitoredResource -ResourceType "cloudsql_database" `
+ -Labels @{"project_id" = "my-project";
+ "database_id" = "id"}
+New-GcLogEntry -LogName "my-log" `
+ -TextPayload "This is a log." `
+ -MonitoredResource $resource
+
+ You can retrieve log entries with the Get-GcLogEntry cmdlet.
+
+
+# Gets all entries from log "my-log"
+Get-GcLogEntry -LogName "my-log"
+
+# Gets all entries associated with Google Compute Engine instances.
+Get-GcLogEntry -ResourceName "gce_instance"
+
Log Sinks
+
+ To export log entries, you can create log sinks with the New-GcLogSink cmdlet.
+ Stackdriver Logging will match incoming log entries against your sinks, and all log entries matching each sink
+ are then copied to the associated destination. Log entries that exist before the sink is created will not be exported.
+
+
+ Destinations for exported logs can be Google Cloud Storage Buckets, Google BigQuery Datasets
+ or Google Cloud Pub/Sub Topics.
+
+
+# Creates a log sink for log entries in the default project.
+# The entries will be sent to the GCS bucket "my-bucket".
+New-GcLogSink -Sink "my-sink" -GcsBucketDestination "my-bucket"
+
+# Creates a log sink for log entries in log "my-log".
+# The entries will be sent to the BigQuery data set "my_dataset".
+New-GcLogSink -Sink "my-sink" `
+ -LogName "my-log" `
+ -BigQueryDataSetDestination "my_dataset"
+
+# Creates a log sink for log entries that match the filter.
+# The entries will be sent to the Pub/Sub topic "my-topic".
+New-GcLogSink -Sink "my-sink" `
+ -Filter "textPayload = `"Testing`"" `
+ -PubSubTopicDestination "my-topic"
+
Log Metrics
+
+ You can create log metrics that count the number of log entries matching certain criteria
+ with the New-GcLogMetric cmdlet. These metrics can be used
+ to create charts and alerting policies in Stackdriver Monitoring.
+
+
+# Creates a metric for entries in log "my-log".
+New-GcLogMetric -Metric "my-metric" -LogName "my-log"
+
+# Creates a metric for entries associated with Google Compute Engine instances.
+New-GcLogMetric -Metric "my-metric" -ResourceType "gce_instance"
+
+# Creates a metric for entries that match the filter.
+New-GcLogMetric -Metric "my-metric" -Filter "textPayload = `"Testing`""
+
+
+
+
+
+ Google Cloud Identity & Access Management (IAM) lets you manage fine-grained access control
+ and visibility for centrally managing cloud resources.
+
+
IAM policy bindings
+
+ An IAM policy binding describes the access that an entity has to a cloud resource.
+ The Add-GcIamPolicyBinding cmdlet can be used to add an
+ IAM policy binding. You will have to specify the access level with the -Role parameter.
+ The entity that the role applies to is specified with either -User, -Group, -ServiceAccount,
+ or -Domain (which correspond to a Google account email address, a Google group email address,
+ a service account email address, and a domain, respectively). If the -Project parameter is not used,
+ the cmdlet will add the binding to resources in the default project.
+
+
+# Gives user test-user@google.com owner role in the project "my-project".
+Add-GcIamPolicyBinding -Role roles/owner -User test-user@google.com -Project "my-project"
+
+# Gives group test-group@google.com browser role in the default project.
+Add-GcIamPolicyBinding -Role roles/browser -Group test-group@google.com
+
+ You can view existing bindings with the Get-GcIamPolicyBinding cmdlet.
+ The cmdlet will use the default project if the -Project parameter is not used.
+
+
+# Gets all IAM policy bindings in the project "my-project".
+Get-GcIamPolicyBinding -Project "my-project"
+
+ You can remove existing bindings with the Remove-GcIamPolicyBinding cmdlet.
+ The cmdlet will not raise an error if the binding does not exist.
+
+
+# Removes the container admin role of the service account
+# service@project.iam.gserviceaccount.com in the default project.
+Remove-GcIamPolicyBinding -Role roles/container.admin -ServiceAccount service@project.iam.gserviceaccount.com
+
+# Removes the editor role of all users of the domain
+# example.com in the default project.
+Remove-GcIamPolicyBinding -Role roles/editor -Domain example.com
+
+
+
+
+
+ Google Cloud Project cmdlets let you manage your project.
+
+
Google Cloud Projects
+
+ The Get-GcpProject cmdlet lists all Google Cloud projects
+ that you have access to.
+
+
+# Lists all available Google Cloud projects.
+Get-GcpProject
+
+
+
+
+
+
+
All Resources
+
+