To install the provided module, the following prerequisites need to be satisfied:

- AWS CDK v2 CLI - https://docs.aws.amazon.com/cdk/v2/guide/cli.html
  - TLDR: `npm install -g aws-cdk`
- Docker - https://docs.docker.com/get-docker/
- Requirements from `requirements.txt` installed into your Python environment
- At least 2 accounts:
  - AWS Organizations Management account
  - AWS account for Deployment and Execution
    - Deployment and Execution can be separate AWS accounts if required.
- Modify `cdk.context.json` with the appropriate account information and variables, and commit the changes (a minimal sketch is shown below). At the bare minimum, the following 4 parameters need to be changed:
  - `enterprise_sso_management_account_id`: AWS Account ID of the AWS Organizations management account
  - `enterprise_sso_exec_account_id`: AWS Account ID where the application will run. Should NOT be the same as the AWS Organizations management account.
  - `enterprise_sso_deployment_account_id`: AWS Account ID that will have the AWS CodePipeline pipeline deployed to. Can be the same as `enterprise_sso_exec_account_id`.
  - `error_notifications_email`: Notification email for error messages

  Make sure to commit these changes to the local repository, or these changes will not propagate to AWS CodeCommit.
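  For illustration, a minimal sketch of these four entries (the account IDs and email are placeholders that mirror the bootstrap examples below; any other existing keys in `cdk.context.json` should be kept as-is, and the value types should follow the existing file):

  ```json
  {
    "enterprise_sso_management_account_id": "222222222222",
    "enterprise_sso_exec_account_id": "333333333333",
    "enterprise_sso_deployment_account_id": "111111111111",
    "error_notifications_email": "sso-errors@example.com"
  }
  ```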
- Set the `AWS_DEFAULT_REGION` environment variable to the desired value.
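  For example, to target a deployment in Ireland (the region value is illustrative):

  ```sh
  export AWS_DEFAULT_REGION=eu-west-1
  ```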
- Bootstrap all AWS accounts using the new bootstrap style; more information is available in the AWS CDK bootstrapping documentation (you can skip `--profile` if you are using environment variables to provide AWS access credentials). You can deploy this solution in multiple regions, or bootstrap only `us-east-1` and deploy everything to a single region. The EventBridge configuration and the support pipeline stacks related to AWS Organizations and AWS SSO will always be deployed to the `us-east-1` region, which therefore requires a bootstrap in that region.
  - Bootstrap the deployment account (`us-east-1`):

    ```sh
    env CDK_NEW_BOOTSTRAP=1 cdk bootstrap \
      --profile deployment_profile \
      --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
      aws://111111111111/us-east-1
    ```

  - If deploying to another region (set in the `AWS_DEFAULT_REGION` variable), bootstrap the deployment account for the additional region:

    ```sh
    env CDK_NEW_BOOTSTRAP=1 cdk bootstrap \
      --profile deployment_profile \
      --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
      aws://111111111111/$AWS_DEFAULT_REGION
    ```

  - Bootstrap the management account:

    ```sh
    env CDK_NEW_BOOTSTRAP=1 cdk bootstrap \
      --profile management_profile \
      --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
      --trust 111111111111 \
      aws://222222222222/us-east-1
    ```

  - If deploying to another region (set in the `AWS_DEFAULT_REGION` variable), bootstrap the management account for the additional region:

    ```sh
    env CDK_NEW_BOOTSTRAP=1 cdk bootstrap \
      --profile management_profile \
      --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
      --trust 111111111111 \
      aws://222222222222/$AWS_DEFAULT_REGION
    ```

  - Bootstrap the IAM account (can be skipped if using the 2-account model):

    ```sh
    env CDK_NEW_BOOTSTRAP=1 cdk bootstrap \
      --profile iam_profile \
      --cloudformation-execution-policies arn:aws:iam::aws:policy/AdministratorAccess \
      --trust 111111111111 \
      aws://333333333333/$AWS_DEFAULT_REGION
    ```
- Set up environment variables for accessing the deployment AWS account.
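  One possible way, assuming a named profile configured for the deployment account (the profile name mirrors the `deployment_profile` placeholder used above):

  ```sh
  export AWS_PROFILE=deployment_profile
  # or export temporary credentials directly:
  # export AWS_ACCESS_KEY_ID=...
  # export AWS_SECRET_ACCESS_KEY=...
  # export AWS_SESSION_TOKEN=...
  ```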
- For an initial deployment, the `initial_deployment.py` script can be used. It creates an AWS CodeCommit repository and pushes the code using settings from `cdk.context.json` (see the combined example below).
  - Install requirements from `initial-deploy-requirements.txt`.
  - Make sure changes to `cdk.context.json` are committed to the local repository.
  - Execute `python3 initial_deployment.py`.
    - The `--no-history` flag can be used to not preserve git history if desired.
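  Putting these sub-steps together, a typical invocation might look like this (the commit message is illustrative):

  ```sh
  pip install -r initial-deploy-requirements.txt
  git add cdk.context.json
  git commit -m "Configure account IDs and notification email"
  python3 initial_deployment.py --no-history
  ```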
- Once the repository exists and the code is pushed to it, execute `cdk deploy EnterpriseAWSSSOPipelineStack`. For all further changes, the newly created pipeline will be triggered by commits to the `main` branch.
- Now all the manual deployment steps have been completed. AWS CodePipeline will deploy everything else automatically. You can track the progress from the deployment AWS account by opening the AWS CodePipeline console.
This solution is event-driven, utilizing a custom AWS EventBridge event bus in the IAM account (the name of the bus is defined in the `target_event_bus_name` context variable in `cdk.context.json`). The following events are supported.

In order to manipulate AWS SSO assignments, the following event structure is used:
```json
{
  "source": "permissionEventSource",
  "detail": {
    "permissions": [
      {
        "ActionType": "Add", // Possible values: "Add" or "Remove"
        "PermissionFor": "OrganizationalUnit", // Possible values: "OrganizationalUnit" | "Account" | "Tag" | "Root"
        "OrganizationalUnitName": "OU_Name",
        "AccountNumber": 30010047,
        "Tag": "key=value",
        "GroupName": "GroupX",
        "UserName": "User Name",
        "PermissionSetName": "AWSReadOnlyAccess"
      }
    ]
  }
}
```
Depending on the type of user entity (user or group) and the permission abstraction, different fields are used. Examples:
```json
{
  "source": "permissionEventSource",
  "detail": {
    "permissions": [
      {
        "ActionType": "Add",
        "PermissionFor": "Account",
        "AccountNumber": 123456789012,
        "UserName": "User Name",
        "PermissionSetName": "AWSReadOnlyAccess"
      }
    ]
  }
}
```

```json
{
  "source": "permissionEventSource",
  "detail": {
    "permissions": [
      {
        "ActionType": "Add",
        "PermissionFor": "OrganizationalUnit",
        "OrganizationalUnitName": "OU_Name",
        "GroupName": "GroupX",
        "PermissionSetName": "AWSReadOnlyAccess"
      }
    ]
  }
}
```

```json
{
  "source": "permissionEventSource",
  "detail": {
    "permissions": [
      {
        "ActionType": "Remove",
        "PermissionFor": "Tag",
        "Tag": "key=value",
        "GroupName": "GroupX",
        "PermissionSetName": "AWSReadOnlyAccess"
      }
    ]
  }
}
```
The events above will create records in DynamoDB and trigger the corresponding action in AWS SSO. DynamoDB acts as the single source of truth for any subsequent actions. Having these records in DynamoDB allows automatic assignment and removal of AWS SSO permissions when moving accounts between OUs, as well as when creating new accounts in an OU.
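For illustration, such an event could be published to the custom bus with boto3. This is a minimal sketch: the bus name, the region of the client, and the `DetailType` value are assumptions and must match your actual deployment and rule configuration.

```python
import json

import boto3

# Per the README, the EventBridge configuration is always deployed to us-east-1.
events = boto3.client("events", region_name="us-east-1")

# One permission entry following the structure documented above.
permission = {
    "ActionType": "Add",
    "PermissionFor": "Account",
    "AccountNumber": 123456789012,  # placeholder target account
    "GroupName": "GroupX",
    "PermissionSetName": "AWSReadOnlyAccess",
}

response = events.put_events(
    Entries=[
        {
            # Assumption: replace with the value of target_event_bus_name
            # from cdk.context.json.
            "EventBusName": "enterprise-sso-event-bus",
            "Source": "permissionEventSource",
            # DetailType is not specified in this README; a descriptive
            # placeholder is used here.
            "DetailType": "PermissionChange",
            "Detail": json.dumps({"permissions": [permission]}),
        }
    ],
)
print(response["FailedEntryCount"])
```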
- Nested OUs are not currently supported
- Testing is currently limited
- Support for ResourceTagged AWS Organizations events is removed for now due to multiple processing options
Python tests are executed using the pytest package.

Make sure that the folder `./src/layers/` is added to the `PYTHONPATH` variable, as the tests depend on code found in the Lambda layers.

To execute the tests, pass the test file or directory to pytest:

```sh
python3 -m pytest -v src
```

This will load mock classes from the layers folder and run the complete test suite.
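For example, in a POSIX shell from the repository root (a sketch; adjust the syntax for your platform):

```sh
export PYTHONPATH="${PYTHONPATH}:./src/layers/"
python3 -m pytest -v src
```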
This project is best executed from a virtualenv.

To manually create a virtualenv on macOS and Linux:

```sh
python3 -m venv .venv
```

After the init process completes and the virtualenv is created, you can use the following step to activate your virtualenv:

```sh
source .venv/bin/activate
```

If you are on a Windows platform, you would activate the virtualenv like this:

```sh
% .venv\Scripts\activate.bat
```

Once the virtualenv is activated, you can install the required dependencies:

```sh
pip install -r requirements.txt
```

At this point you can synthesize the CloudFormation template for this code:

```sh
cdk synth
```
Useful commands:

- `cdk ls` - list all stacks in the app
- `cdk synth` - emits the synthesized CloudFormation template
- `cdk deploy` - deploy this stack to your default AWS account/region
- `cdk diff` - compare deployed stack with current state
- `cdk docs` - open CDK documentation
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.