What is Terraform CDK
CDK for Terraform (CDKTF) is an extension from HashiCorp for Terraform. It basically allows us to utilize the power of a general-purpose programming language while building our Terraform configuration files (CDKTF synthesizes them to Terraform-compatible JSON). A beautiful depiction of how and where CDKTF fits into the Cloud-Terraform family can be found in the GitHub repository:

Why CDKTF with Python?
You can choose from a range of programming languages to write your CDKTF code in. Ginkgo Analytics is, as you might assume, a company focused on Analytics, Data & AI, so Python is our language of choice for most use cases.
A Python code base is also easier to maintain: Python is easy to learn, and its talent pool is larger than the one for C# or TypeScript.
CDKTF and AWS
As mentioned earlier, CDKTF is under active development, which means you should expect general alpha/beta behavior from this package (a nice way of saying it will have bugs). After playing around with CDKTF, I realized that it has a more stable foundation for AWS resources than for Google Cloud. This is why we are going to do a simple AWS CDKTF bootstrap.
Creating Cloud Resources with Code
First, let us make use of the built-in CDKTF templates, which provide most of the project file structure for us. You will need the cdktf CLI and pipenv installed.
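The cdktf CLI is distributed as an npm package, so a typical setup (assuming Node.js and pip are already available) looks like this:
npm install --global cdktf-cli
pip install pipenv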
Initializing a cdktf Python Template
$> cdktf init --template="python" --local
Creating a virtualenv for this project...
Pipfile: /Users/abk/Documents/ginkgo-analytics-articles/cdktf-aws/Pipfile
Using /usr/bin/python3 (3.8.9) to create virtualenv...
Creating virtual environment...created virtual environment CPython3.8.9.final.0-64 in 687ms
✔ Successfully created virtual environment!
Virtualenv location: /Users/abk/.local/share/virtualenvs/cdktf-aws-dY-BuV67
Pipfile.lock not found, creating...
Locking [dev-packages] dependencies...
Locking [packages] dependencies...
Updated Pipfile.lock (c0c5a6)!
Installing dependencies from Pipfile.lock (c0c5a6)...
🐍 ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 0/0 – 00:00:00
✔ Installation Succeeded
✔ Success!
Your folder structure should now look like the following picture:

Then activate the project's virtual environment:
pipenv shell
Getting AWS provider resources
If you haven’t used Terraform with the AWS provider before, you need to execute the next step; otherwise you can skip it.
Add the AWS provider and its modules to cdktf.json under the keys “terraformProviders” and “terraformModules”. In “terraformModules” you need to provide all the AWS modules that will be used in your Python code. You can see all up-to-date modules and their versions at https://github.com/terraform-aws-modules.
{
  "language": "python",
  "app": "pipenv run python main.py",
  "terraformProviders": ["hashicorp/aws@~> 3.67.0"],
  "terraformModules": [
    {
      "name": "AwsLambda",
      "source": "terraform-aws-modules/lambda/aws",
      "version": "2.34.0"
    }
  ],
  "codeMakerOutput": "imports",
  "context": {
    "excludeStackIdFromLogicalIds": "true",
    "allowSepCharsInLogicalIds": "true"
  }
}
Run
cdktf get
Generated python constructs in the output directory: imports
If everything worked correctly, you should see a new folder named “imports” in your project root. This folder contains the generated Python packages we will use in our stack classes to define the AWS resources we want to create.
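For the configuration above, the generated packages correspond to the entries in cdktf.json, roughly like this (exact contents depend on the provider and module versions):
imports/
├── aws/        # generated from the hashicorp/aws provider
└── AwsLambda/  # generated from the terraform-aws-modules/lambda module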

Defining AWS Resources in Python
Only now can we do the actual work of defining our cloud architecture. In general, HashiCorp decided to split environments (prod, dev, test) into stacks, which are essentially Python classes. Since you can re-initialize the provider configuration in each stack (such as the region), and each stack's outputs end up in separate files, this should work just fine for most projects. A minimal sketch of the pattern follows; the full main.py for this article comes right after.
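Here is a minimal sketch of the multi-stack pattern (the stack names and the region parameter are hypothetical, for illustration only):
from constructs import Construct
from cdktf import App, TerraformStack
from imports.aws import AwsProvider

class EnvStack(TerraformStack):
    def __init__(self, scope: Construct, ns: str, region: str):
        super().__init__(scope, ns)
        # each environment (stack) gets its own provider configuration
        AwsProvider(self, "Aws", region=region)

app = App()
EnvStack(app, "cdktf-aws-dev", region="us-west-1")
EnvStack(app, "cdktf-aws-prod", region="eu-central-1")
app.synth()
And here is the full main.py for our Lambda deployment: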

#!/usr/bin/env python
from constructs import Construct
from cdktf import App, TerraformStack, TerraformOutput
from imports.aws import AwsProvider
from imports.AwsLambda import AwsLambda

class MyStack(TerraformStack):
    def __init__(self, scope: Construct, ns: str):
        super().__init__(scope, ns)
        # define resources here
        AwsProvider(self, "Aws", region="us-west-1")
        # instantiate the terraform-aws-modules/lambda module generated by cdktf get
        ga_lambda = AwsLambda(self,
            "ginkgo-analytics",
            create=True,
            handler="lambda_function.lambda_handler",
            runtime="python3.8",
            function_name="ginkgo-analytics-cdktf",
            lambda_role="arn:aws:iam::806903271389:role/cdktf_lambda",
            source_path="/Users/abk/Documents/ginkgo-analytics-articles/cdktf-aws/lambda"
        )
        # debugging aid: list the attributes exposed by the generated construct
        print(dir(ga_lambda))
        # expose the module's outputs after deployment
        TerraformOutput(self, "lambda_obj", value=ga_lambda)

app = App()
MyStack(app, "cdktf-aws")
app.synth()
Similar to using modules (e.g. lambda) from certain cloud providers (e.g. AWS) in plain Terraform, here we use modules that are actually Python packages, like “aws” and “AwsLambda”. From there, you can consult either the generated Python package code or the Terraform module documentation to figure out which parameters each module needs. In our case, Lambda needs parameters such as runtime, function_name, source_path, etc. I did two things behind the scenes: created a Lambda role “cdktf_lambda” and created a folder with a file “lambda_function.py” that will eventually be uploaded by CDKTF:
def lambda_handler(event, context):
    return {"message": "Ginkgo Analytics Lambda deployed with CDKtf & AWS!"}
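As a quick sanity check, the handler can also be exercised locally before deploying, since it is just a plain Python function (no AWS runtime involved):
# local smoke test of the handler
print(lambda_handler({}, None))
# -> {'message': 'Ginkgo Analytics Lambda deployed with CDKtf & AWS!'}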
Executing and Evaluating Resources
Time to deploy and test our Lambda function. To convert the Python code into Terraform's JSON configuration format, you can run
cdktf synth
The command “cdktf deploy”, on the other hand, runs “cdktf synth” in the background and then deploys the resources. So in practice it’s enough to just run “cdktf deploy” and omit “cdktf synth”.
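If you do run “cdktf synth” on its own, the synthesized configuration lands in the cdktf.out directory, where it can be inspected like any hand-written Terraform JSON file (the exact file layout below may vary between cdktf versions):
cdktf synth
cat cdktf.out/cdk.tf.json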
(cdktf-aws) (base) ~/D/g/cdktf-aws ❯❯❯ cdktf deploy
Stack: cdktf-aws
Resources
 + MODULE ginkgo-analytics module.ginkgo-analytics.aws_cloudwatch_log_group.lambda[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.aws_iam_policy.logs[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.aws_iam_role.lambda[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.aws_iam_role_policy_attachment.logs[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.aws_lambda_function.this[0]
 ~ MODULE ginkgo-analytics module.ginkgo-analytics.data.aws_iam_policy_document.logs[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.local_file.archive_plan[0]
 + MODULE ginkgo-analytics module.ginkgo-analytics.null_resource.archive[0]

Diff: 7 to create, 0 to update, 0 to delete.

Do you want to perform these actions?
CDK for Terraform will perform the actions described above.
Only 'yes' will be accepted to approve.

Enter a value: yes
Great, our Lambda is deployed, active, and working as expected.
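One way to verify this from the terminal (assuming the AWS CLI is installed and configured for the same account) is to invoke the function directly:
aws lambda invoke --function-name ginkgo-analytics-cdktf --region us-west-1 response.json
cat response.json
# {"message": "Ginkgo Analytics Lambda deployed with CDKtf & AWS!"}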

Now it's your turn to try out CDKTF. Go ahead and implement an AWS API Gateway for this newly created Lambda function; a possible starting point is sketched below. This might help: https://github.com/terraform-aws-modules/terraform-aws-apigateway-v2.
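As a starting point, the API Gateway module can be registered in cdktf.json the same way as the Lambda module above (the name is our own choice and the version is a placeholder; pick a current release from the module repository):
{
  "name": "AwsApiGateway",
  "source": "terraform-aws-modules/apigateway-v2/aws",
  "version": "<current release>"
}
After another “cdktf get”, it becomes importable as imports.AwsApiGateway.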
Sources
https://learn.hashicorp.com/tutorials/terraform/cdktf-build-python?in=terraform/cdktf
https://github.com/terraform-aws-modules/terraform-aws-apigateway-v2