mirror of https://github.com/bitwarden/directory-connector synced 2026-01-10 20:43:52 +00:00

Compare commits


1 Commits

Author SHA1 Message Date
Thomas Rittson
b830c8be27 [PM-26454] Undo removal of core-js to fix dynamic import errors (#890)
* Undo removal of core-js to fix dynamic import errors

* chore: update package-lock with npm install

---------

Co-authored-by: Vincent Salucci <vincesalucci21@gmail.com>
2025-10-02 11:13:45 -05:00
76 changed files with 7034 additions and 7163 deletions

View File

@@ -1,203 +0,0 @@
# Bitwarden Directory Connector
## Project Overview
Directory Connector is a TypeScript application that synchronizes users and groups from directory services to Bitwarden organizations. It provides both a desktop GUI (built with Angular and Electron) and a CLI tool (bwdc).
**Supported Directory Services:**
- LDAP (Lightweight Directory Access Protocol) - includes Active Directory and general LDAP servers
- Microsoft Entra ID (formerly Azure Active Directory)
- Google Workspace
- Okta
- OneLogin
**Technologies:**
- TypeScript
- Angular (GUI)
- Electron (Desktop wrapper)
- Node
- Jest for testing
## Code Architecture & Structure
### Directory Organization
```
src/
├── abstractions/ # Interface definitions (e.g., IDirectoryService)
├── services/ # Business logic implementations for directory services, sync, auth
├── models/ # Data models (UserEntry, GroupEntry, etc.)
├── commands/ # CLI command implementations
├── app/ # Angular GUI components
└── utils/ # Test utilities and fixtures
src-cli/ # CLI-specific code (imports common code from src/)
jslib/ # Legacy folder structure (mix of deprecated/unused and current code - new code should not be added here)
```
### Key Architectural Patterns
1. **Abstractions = Interfaces**: All interfaces are defined in `/abstractions`
2. **Services = Business Logic**: Implementations live in `/services`
3. **Directory Service Pattern**: Each directory provider implements `IDirectoryService` interface
4. **Separation of Concerns**: GUI (Angular app) and CLI (commands) share the same service layer
## Development Conventions
### Code Organization
**File Naming:**
- kebab-case for files: `ldap-directory.service.ts`
- Descriptive names that reflect purpose
**Class/Function Naming:**
- PascalCase for classes and interfaces
- camelCase for functions and variables
- Descriptive names that indicate purpose
**File Structure:**
- Keep files focused on single responsibility
- Create new service files for distinct directory integrations
- Separate models into individual files when complex
### TypeScript Conventions
**Import Patterns:**
- Use path aliases (`@/`) for project imports
- `@/` - project root
- `@/jslib/` - jslib folder
- ESLint enforces alphabetized import ordering with newlines between groups
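For illustration, an import block following these conventions might look like the sketch below (module paths are taken from elsewhere in this repo, but the exact grouping shown is only an example):
```typescript
// Illustrative ordering: Node built-ins, external packages, @/jslib imports,
// then @/src imports, alphabetized within each group and separated by blank lines.
import * as path from "path";

import { firstValueFrom } from "rxjs";

import { Utils } from "@/jslib/common/src/misc/utils";

import { SyncService } from "@/src/services/sync.service";
```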
**Type Safety:**
- Avoid `any` types - use proper typing or `unknown` with type guards
- Prefer interfaces for contracts, types for unions/intersections
- Use strict null checks - handle `null` and `undefined` explicitly
- Leverage TypeScript's type inference where appropriate
**Configuration:**
- Use configuration files or environment variables
- Never hardcode URLs or configuration values
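As a minimal sketch of this guidance (the variable name below is hypothetical, not an existing setting of this project), configuration can be centralized and sourced from the environment rather than hardcoded at call sites:
```typescript
// Hypothetical example: BWDC_API_URL is an illustrative variable name,
// not a documented setting of this project.
interface ApiConfig {
  apiUrl: string;
}

function loadApiConfig(env: NodeJS.ProcessEnv = process.env): ApiConfig {
  const apiUrl = env.BWDC_API_URL;
  if (!apiUrl) {
    throw new Error("BWDC_API_URL is not set");
  }
  return { apiUrl };
}
```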
## Security Best Practices
**Credential Handling:**
- Never log directory service credentials, API keys, or tokens
- Use secure storage mechanisms for sensitive data
- Credentials should never be hardcoded
- Store credentials encrypted, never in plain text
**Sensitive Data:**
- User and group data from directories should be handled securely
- Avoid exposing sensitive information in error messages
- Sanitize data before logging
- Be cautious with data persistence
**Input Validation:**
- Validate and sanitize data from external directory services
- Check for injection vulnerabilities (LDAP injection, etc.)
- Validate configuration inputs from users
**API Security:**
- Ensure authentication flows are implemented correctly
- Verify SSL/TLS is used for all external connections
- Check for secure token storage and refresh mechanisms
## Error Handling
**Best Practices:**
1. **Try-catch for async operations** - Always wrap external API calls
2. **Meaningful error messages** - Provide context for debugging
3. **Error propagation** - Don't swallow errors silently
4. **User-facing errors** - Separate user messages from developer logs
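A short sketch of these practices (the function and logger shapes are placeholders, not the project's actual classes):
```typescript
// Placeholder shapes only: wrap the external call, keep developer context in
// the log, and surface a sanitized, user-facing message instead of the raw failure.
async function fetchDirectoryUsers(
  queryDirectory: () => Promise<unknown[]>,
  logService: { error: (message: string) => void },
): Promise<unknown[]> {
  try {
    return await queryDirectory();
  } catch (e) {
    logService.error(`Directory user query failed: ${e instanceof Error ? e.message : String(e)}`);
    throw new Error("Unable to retrieve users from the directory. Check the sync configuration.");
  }
}
```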
## Performance Best Practices
**Large Dataset Handling:**
- Use pagination for large user/group lists
- Avoid loading entire datasets into memory at once
- Consider streaming or batch processing for large operations
**API Rate Limiting:**
- Respect rate limits for Microsoft Graph API, Google Admin SDK, etc.
- Consider batching large API calls where necessary
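One way to batch work against rate-limited APIs, as a generic sketch rather than the project's implementation:
```typescript
// Generic helper: process items in fixed-size chunks so a single sync does not
// issue an unbounded burst of API calls or hold derived data for every item at once.
async function processInBatches<T>(
  items: T[],
  batchSize: number,
  handler: (batch: T[]) => Promise<void>,
): Promise<void> {
  for (let i = 0; i < items.length; i += batchSize) {
    await handler(items.slice(i, i + batchSize));
  }
}
```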
**Memory Management:**
- Close connections and clean up resources
- Remove event listeners when components are destroyed
- Be cautious with caching large datasets
## Testing
**Framework:**
- Jest with jest-preset-angular
- jest-mock-extended for type-safe mocks with `mock<Type>()`
**Test Organization:**
- Tests colocated with source files
- `*.spec.ts` - Unit tests for individual components/services
- `*.integration.spec.ts` - Integration tests against live directory services
- Test helpers located in `utils/` directory
**Test Naming:**
- Descriptive, human-readable test names
- Example: `'should return empty array when no users exist in directory'`
**Test Coverage:**
- New features must include tests
- Bug fixes should include regression tests
- Changes to core sync logic or directory-specific logic require integration tests
**Testing Approach:**
- **Unit tests**: Mock external API calls using jest-mock-extended
- **Integration tests**: Use live directory services (Docker containers or configured cloud services)
- Focus on critical paths (authentication, sync, data transformation)
- Test error scenarios and edge cases (empty results, malformed data, connection failures), not just happy paths
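For example, a unit test following these conventions might look like the sketch below (the `DirectoryClient` interface is a stand-in, not a real project type):
```typescript
import { mock } from "jest-mock-extended";

// Stand-in interface for whatever external client the service under test wraps.
interface DirectoryClient {
  listUsers(): Promise<{ primaryEmail: string }[]>;
}

describe("directory sync", () => {
  it("should return empty array when no users exist in directory", async () => {
    const client = mock<DirectoryClient>();
    client.listUsers.mockResolvedValue([]);

    const users = await client.listUsers();

    expect(users).toEqual([]);
  });
});
```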
## Directory Service Patterns
### IDirectoryService Interface
All directory services implement this core interface with methods:
- `getUsers()` - Retrieve users from the directory and transform them into standard objects
- `getGroups()` - Retrieve groups from the directory and transform them into standard objects
- Connection and authentication handling
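A rough TypeScript sketch of this contract (method signatures and entry shapes are approximations, not the actual source):
```typescript
// Approximate shapes only; the real interface and models live in
// src/abstractions/ and src/models/.
interface UserEntry {
  externalId: string;
  email: string;
  disabled: boolean;
  deleted: boolean;
}

interface GroupEntry {
  externalId: string;
  name: string;
  userMemberExternalIds: Set<string>;
}

interface IDirectoryService {
  getUsers(): Promise<UserEntry[]>;
  getGroups(): Promise<GroupEntry[]>;
}
```
Each provider-specific service then wires its own connection and authentication handling around these methods.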
### Service-Specific Implementations
Each directory service has unique authentication and query patterns:
- **LDAP**: Direct LDAP queries, bind authentication
- **Microsoft Entra ID**: Microsoft Graph API, OAuth tokens
- **Google Workspace**: Google Admin SDK, service account credentials
- **Okta/OneLogin**: REST APIs with API tokens
## References
- [Architectural Decision Records (ADRs)](https://contributing.bitwarden.com/architecture/adr/)
- [Contributing Guidelines](https://contributing.bitwarden.com/contributing/)
- [Code Style](https://contributing.bitwarden.com/contributing/code-style/)
- [Security Whitepaper](https://bitwarden.com/help/bitwarden-security-white-paper/)
- [Security Definitions](https://contributing.bitwarden.com/architecture/security/definitions)

10
.eslintignore Normal file
View File

@@ -0,0 +1,10 @@
dist
build
build-cli
webpack.cli.js
webpack.main.js
webpack.renderer.js
**/node_modules
**/jest.config.js

95
.eslintrc.json Normal file
View File

@@ -0,0 +1,95 @@
{
"root": true,
"env": {
"browser": true,
"node": true
},
"overrides": [
{
"files": ["*.ts", "*.js"],
"plugins": ["@typescript-eslint", "rxjs", "rxjs-angular", "import"],
"parser": "@typescript-eslint/parser",
"parserOptions": {
"project": ["./tsconfig.eslint.json"],
"sourceType": "module",
"ecmaVersion": 2020
},
"extends": [
"eslint:recommended",
"plugin:@typescript-eslint/recommended",
"plugin:import/recommended",
"plugin:import/typescript",
"prettier",
"plugin:rxjs/recommended"
],
"settings": {
"import/parsers": {
"@typescript-eslint/parser": [".ts"]
},
"import/resolver": {
"typescript": {
"alwaysTryTypes": true
}
}
},
"rules": {
"@typescript-eslint/explicit-member-accessibility": [
"error",
{ "accessibility": "no-public" }
],
"@typescript-eslint/no-explicit-any": "off", // TODO: This should be re-enabled
"@typescript-eslint/no-misused-promises": ["error", { "checksVoidReturn": false }],
"@typescript-eslint/no-this-alias": ["error", { "allowedNames": ["self"] }],
"@typescript-eslint/no-unused-vars": ["error", { "args": "none" }],
"no-console": "error",
"import/no-unresolved": "off", // TODO: Look into turning off once each package is an actual package.
"import/order": [
"error",
{
"alphabetize": {
"order": "asc"
},
"newlines-between": "always",
"pathGroups": [
{
"pattern": "@/jslib/**/*",
"group": "external",
"position": "after"
},
{
"pattern": "@/src/**/*",
"group": "parent",
"position": "before"
}
],
"pathGroupsExcludedImportTypes": ["builtin"]
}
],
"rxjs-angular/prefer-takeuntil": "error",
"rxjs/no-exposed-subjects": ["error", { "allowProtected": true }],
"no-restricted-syntax": [
"error",
{
"message": "Calling `svgIcon` directly is not allowed",
"selector": "CallExpression[callee.name='svgIcon']"
},
{
"message": "Accessing FormGroup using `get` is not allowed, use `.value` instead",
"selector": "ChainExpression[expression.object.callee.property.name='get'][expression.property.name='value']"
}
],
"curly": ["error", "all"],
"import/namespace": ["off"], // This doesn't resolve namespace imports correctly, but TS will throw for this anyway
"no-restricted-imports": ["error", { "patterns": ["src/**/*"] }]
}
},
{
"files": ["*.html"],
"parser": "@angular-eslint/template-parser",
"plugins": ["@angular-eslint/template"],
"rules": {
"@angular-eslint/template/button-has-type": "error"
}
}
]
}

11
.github/CODEOWNERS vendored
View File

@@ -6,14 +6,3 @@
# Default file owners.
* @bitwarden/team-admin-console-dev
# Docker-related files
**/Dockerfile @bitwarden/team-appsec @bitwarden/dept-bre
**/*.dockerignore @bitwarden/team-appsec @bitwarden/dept-bre
**/entrypoint.sh @bitwarden/team-appsec @bitwarden/dept-bre
**/docker-compose.yml @bitwarden/team-appsec @bitwarden/dept-bre
# Claude related files
.claude/ @bitwarden/team-ai-sme
.github/workflows/respond.yml @bitwarden/team-ai-sme
.github/workflows/review-code.yml @bitwarden/team-ai-sme

View File

@@ -1,14 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Feature Requests
url: https://community.bitwarden.com/c/feature-requests/
about: Request new features using the Community Forums. Please search existing feature requests before making a new one.
- name: Bitwarden Community Forums
url: https://community.bitwarden.com
about: Please visit the community forums for general community discussion, support and the development roadmap.
- name: Customer Support
url: https://bitwarden.com/contact/
about: Please contact our customer support for account issues and general customer support.
- name: Security Issues
url: https://hackerone.com/bitwarden
about: We use HackerOne to manage security disclosures.

View File

@@ -1,111 +0,0 @@
name: Directory Connector Bug Report
description: File a bug report
title: "[DC] "
labels: ["bug"]
type: bug
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
Please do not submit feature requests. The [Community Forums](https://community.bitwarden.com) has a section for submitting, voting for, and discussing product feature requests.
- type: textarea
id: reproduce
attributes:
label: Steps To Reproduce
description: How can we reproduce the behavior?
value: |
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. Click on '...'
validations:
required: true
- type: textarea
id: expected
attributes:
label: Expected Result
description: A clear and concise description of what you expected to happen.
validations:
required: true
- type: textarea
id: actual
attributes:
label: Actual Result
description: A clear and concise description of what is happening.
validations:
required: true
- type: textarea
id: screenshots
attributes:
label: Screenshots or Videos
description: If applicable, add screenshots and/or a short video to help explain your problem.
- type: textarea
id: additional-context
attributes:
label: Additional Context
description: Add any other context about the problem here.
- type: dropdown
id: os
attributes:
label: Operating System
description: What operating system(s) are you seeing the problem on?
multiple: true
options:
- Windows
- macOS
- Linux
- Other operating system (please specify in "Additional Context" section)
validations:
required: true
- type: input
id: os-version
attributes:
label: Operating System Version
description: What version of the operating system(s) are you seeing the problem on?
validations:
required: true
- type: dropdown
id: directories
attributes:
label: Directory Service
description: What directory service(s) are you seeing the problem on?
multiple: true
options:
- LDAP - Active Directory
- Another LDAP implementation (please specify in "Additional Context" section)
- Microsoft Entra ID
- Google Workspace
- Okta Universal Directory
- OneLogin
- Other directory service (please specify in "Additional Context" section)
validations:
required: true
- type: dropdown
id: application-type
attributes:
label: Application Type
description: Which Directory Connector application(s) are you seeing the problem on?
multiple: true
options:
- GUI (the desktop application)
- CLI (the bwdc command line application)
validations:
required: true
- type: input
id: version
attributes:
label: Build Version
description: What version of our software are you running?
validations:
required: true
- type: checkboxes
id: issue-tracking-info
attributes:
label: Issue Tracking Info
description: |
Make sure to acknowledge the following before submitting your report!
options:
- label: I understand that work is tracked outside of Github. A PR will be linked to this issue should one be opened to address it, but Bitwarden doesn't use fields like "assigned", "milestone", or "project" to track progress.
required: true

View File

@@ -8,6 +8,12 @@
matchManagers: ["github-actions"],
matchUpdateTypes: ["minor", "patch"],
},
{
groupName: "Google Libraries",
matchPackagePatterns: ["google-auth-library", "googleapis"],
matchManagers: ["npm"],
groupSlug: "google-libraries",
},
],
ignoreDeps: [
// yao-pkg is used to create a single executable application bundle for the CLI.
@@ -15,10 +21,5 @@
// This must be manually vetted by our appsec team before upgrading.
// It is excluded from renovate to avoid accidentally upgrading to a non-vetted version.
"@yao-pkg/pkg",
// googleapis uses ESM after 149.0.0 so we are not upgrading it until we have ESM support.
// They release new versions every couple of weeks so ignoring it at the dependency dashboard
// level is not sufficient.
// FIXME: remove and upgrade when we have ESM support.
"googleapis",
],
}

View File

@@ -23,22 +23,20 @@ jobs:
node_version: ${{ steps.retrieve-node-version.outputs.node_version }}
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Package Version
id: retrieve-version
run: |
PKG_VERSION=$(jq -r .version package.json)
echo "package_version=$PKG_VERSION" >> "$GITHUB_OUTPUT"
echo "package_version=$PKG_VERSION" >> $GITHUB_OUTPUT
- name: Get Node Version
id: retrieve-node-version
run: |
NODE_NVMRC=$(cat .nvmrc)
NODE_VERSION=${NODE_NVMRC/v/''}
echo "node_version=$NODE_VERSION" >> "$GITHUB_OUTPUT"
echo "node_version=$NODE_VERSION" >> $GITHUB_OUTPUT
linux-cli:
name: Build Linux CLI
@@ -51,12 +49,10 @@ jobs:
contents: read
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -65,7 +61,7 @@ jobs:
- name: Update NPM
run: |
npm install -g node-gyp
node-gyp install "$(node -v)"
node-gyp install $(node -v)
- name: Keytar
run: |
@@ -76,8 +72,8 @@ jobs:
keytarUrl="https://github.com/atom/node-keytar/releases/download/v$keytarVersion/$keytarTarGz"
mkdir -p ./keytar/linux
wget "$keytarUrl" -O "./keytar/linux/$keytarTarGz"
tar -xvf "./keytar/linux/$keytarTarGz" -C ./keytar/linux
wget $keytarUrl -O ./keytar/linux/$keytarTarGz
tar -xvf ./keytar/linux/$keytarTarGz -C ./keytar/linux
- name: Install
run: npm install
@@ -86,19 +82,19 @@ jobs:
run: npm run dist:cli:lin
- name: Zip
run: zip -j "dist-cli/bwdc-linux-$_PACKAGE_VERSION.zip" "dist-cli/linux/bwdc" "keytar/linux/build/Release/keytar.node"
run: zip -j dist-cli/bwdc-linux-$_PACKAGE_VERSION.zip dist-cli/linux/bwdc keytar/linux/build/Release/keytar.node
- name: Version Test
run: |
sudo apt-get update
sudo apt install libsecret-1-0 dbus-x11 gnome-keyring
eval "$(dbus-launch --sh-syntax)"
eval $(dbus-launch --sh-syntax)
eval "$(echo -n "" | /usr/bin/gnome-keyring-daemon --login)"
eval "$(/usr/bin/gnome-keyring-daemon --components=secrets --start)"
eval $(echo -n "" | /usr/bin/gnome-keyring-daemon --login)
eval $(/usr/bin/gnome-keyring-daemon --components=secrets --start)
mkdir -p test/linux
unzip "./dist-cli/bwdc-linux-$_PACKAGE_VERSION.zip" -d ./test/linux
unzip ./dist-cli/bwdc-linux-$_PACKAGE_VERSION.zip -d ./test/linux
testVersion=$(./test/linux/bwdc -v)
@@ -111,7 +107,7 @@ jobs:
fi
- name: Upload Linux Zip to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: bwdc-linux-${{ env._PACKAGE_VERSION }}.zip
path: ./dist-cli/bwdc-linux-${{ env._PACKAGE_VERSION }}.zip
@@ -120,7 +116,7 @@ jobs:
macos-cli:
name: Build Mac CLI
runs-on: macos-15-intel
runs-on: macos-13
needs: setup
permissions:
contents: read
@@ -129,12 +125,10 @@ jobs:
_NODE_VERSION: ${{ needs.setup.outputs.node_version }}
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -143,7 +137,7 @@ jobs:
- name: Update NPM
run: |
npm install -g node-gyp
node-gyp install "$(node -v)"
node-gyp install $(node -v)
- name: Keytar
run: |
@@ -154,8 +148,8 @@ jobs:
keytarUrl="https://github.com/atom/node-keytar/releases/download/v$keytarVersion/$keytarTarGz"
mkdir -p ./keytar/macos
wget "$keytarUrl" -O "./keytar/macos/$keytarTarGz"
tar -xvf "./keytar/macos/$keytarTarGz" -C ./keytar/macos
wget $keytarUrl -O ./keytar/macos/$keytarTarGz
tar -xvf ./keytar/macos/$keytarTarGz -C ./keytar/macos
- name: Install
run: npm install
@@ -164,12 +158,12 @@ jobs:
run: npm run dist:cli:mac
- name: Zip
run: zip -j "dist-cli/bwdc-macos-$_PACKAGE_VERSION.zip" "dist-cli/macos/bwdc" "keytar/macos/build/Release/keytar.node"
run: zip -j dist-cli/bwdc-macos-$_PACKAGE_VERSION.zip dist-cli/macos/bwdc keytar/macos/build/Release/keytar.node
- name: Version Test
run: |
mkdir -p test/macos
unzip "./dist-cli/bwdc-macos-$_PACKAGE_VERSION.zip" -d ./test/macos
unzip ./dist-cli/bwdc-macos-$_PACKAGE_VERSION.zip -d ./test/macos
testVersion=$(./test/macos/bwdc -v)
@@ -182,7 +176,7 @@ jobs:
fi
- name: Upload Mac Zip to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: bwdc-macos-${{ env._PACKAGE_VERSION }}.zip
path: ./dist-cli/bwdc-macos-${{ env._PACKAGE_VERSION }}.zip
@@ -200,16 +194,14 @@ jobs:
_NODE_VERSION: ${{ needs.setup.outputs.node_version }}
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Setup Windows builder
run: |
choco install checksum --no-progress
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -249,7 +241,7 @@ jobs:
- name: Version Test
shell: pwsh
run: |
Expand-Archive -Path "dist-cli\bwdc-windows-$env:_PACKAGE_VERSION.zip" -DestinationPath "test\windows"
Expand-Archive -Path "dist-cli\bwdc-windows-${{ env._PACKAGE_VERSION }}.zip" -DestinationPath "test\windows"
$testVersion = Invoke-Expression '& .\test\windows\bwdc.exe -v'
echo "version: ${env:_PACKAGE_VERSION}"
echo "testVersion: $testVersion"
@@ -258,7 +250,7 @@ jobs:
}
- name: Upload Windows Zip to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: bwdc-windows-${{ env._PACKAGE_VERSION }}.zip
path: ./dist-cli/bwdc-windows-${{ env._PACKAGE_VERSION }}.zip
@@ -279,12 +271,10 @@ jobs:
HUSKY: 0
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -338,28 +328,28 @@ jobs:
SIGNING_CERT_NAME: ${{ steps.retrieve-secrets.outputs.code-signing-cert-name }}
- name: Upload Portable Executable to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-Portable-${{ env._PACKAGE_VERSION }}.exe
path: ./dist/Bitwarden-Connector-Portable-${{ env._PACKAGE_VERSION }}.exe
if-no-files-found: error
- name: Upload Installer Executable to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-Installer-${{ env._PACKAGE_VERSION }}.exe
path: ./dist/Bitwarden-Connector-Installer-${{ env._PACKAGE_VERSION }}.exe
if-no-files-found: error
- name: Upload Installer Executable Blockmap to GitHub
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-Installer-${{ env._PACKAGE_VERSION }}.exe.blockmap
path: ./dist/Bitwarden-Connector-Installer-${{ env._PACKAGE_VERSION }}.exe.blockmap
if-no-files-found: error
- name: Upload latest auto-update artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: latest.yml
path: ./dist/latest.yml
@@ -379,12 +369,10 @@ jobs:
HUSKY: 0
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -393,7 +381,7 @@ jobs:
- name: Update NPM
run: |
npm install -g node-gyp
node-gyp install "$(node -v)"
node-gyp install $(node -v)
- name: Set up environment
run: |
@@ -411,14 +399,14 @@ jobs:
run: npm run dist:lin
- name: Upload AppImage
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-${{ env._PACKAGE_VERSION }}-x86_64.AppImage
path: ./dist/Bitwarden-Connector-${{ env._PACKAGE_VERSION }}-x86_64.AppImage
if-no-files-found: error
- name: Upload latest auto-update artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: latest-linux.yml
path: ./dist/latest-linux.yml
@@ -427,7 +415,7 @@ jobs:
macos-gui:
name: Build MacOS GUI
runs-on: macos-15-intel
runs-on: macos-13
needs: setup
permissions:
contents: read
@@ -439,12 +427,10 @@ jobs:
HUSKY: 0
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -453,7 +439,7 @@ jobs:
- name: Update NPM
run: |
npm install -g node-gyp
node-gyp install "$(node -v)"
node-gyp install $(node -v)
- name: Print environment
run: |
@@ -478,16 +464,16 @@ jobs:
- name: Get certificates
run: |
mkdir -p "$HOME/certificates"
mkdir -p $HOME/certificates
az keyvault secret show --id https://bitwarden-ci.vault.azure.net/certificates/devid-app-cert |
jq -r .value | base64 -d > "$HOME/certificates/devid-app-cert.p12"
jq -r .value | base64 -d > $HOME/certificates/devid-app-cert.p12
az keyvault secret show --id https://bitwarden-ci.vault.azure.net/certificates/devid-installer-cert |
jq -r .value | base64 -d > "$HOME/certificates/devid-installer-cert.p12"
jq -r .value | base64 -d > $HOME/certificates/devid-installer-cert.p12
az keyvault secret show --id https://bitwarden-ci.vault.azure.net/certificates/macdev-cert |
jq -r .value | base64 -d > "$HOME/certificates/macdev-cert.p12"
jq -r .value | base64 -d > $HOME/certificates/macdev-cert.p12
- name: Log out from Azure
uses: bitwarden/gh-actions/azure-logout@main
@@ -496,9 +482,9 @@ jobs:
env:
KEYCHAIN_PASSWORD: ${{ steps.get-kv-secrets.outputs.KEYCHAIN-PASSWORD }}
run: |
security create-keychain -p "$KEYCHAIN_PASSWORD" build.keychain
security create-keychain -p $KEYCHAIN_PASSWORD build.keychain
security default-keychain -s build.keychain
security unlock-keychain -p "$KEYCHAIN_PASSWORD" build.keychain
security unlock-keychain -p $KEYCHAIN_PASSWORD build.keychain
security set-keychain-settings -lut 1200 build.keychain
security import "$HOME/certificates/devid-app-cert.p12" -k build.keychain -P "" \
@@ -510,12 +496,12 @@ jobs:
security import "$HOME/certificates/macdev-cert.p12" -k build.keychain -P "" \
-T /usr/bin/codesign -T /usr/bin/security -T /usr/bin/productbuild
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k "$KEYCHAIN_PASSWORD" build.keychain
security set-key-partition-list -S apple-tool:,apple:,codesign: -s -k $KEYCHAIN_PASSWORD build.keychain
- name: Load package version
run: |
$rootPath = $env:GITHUB_WORKSPACE;
$packageVersion = (Get-Content -Raw -Path "$rootPath\package.json" | ConvertFrom-Json).version;
$packageVersion = (Get-Content -Raw -Path $rootPath\package.json | ConvertFrom-Json).version;
Write-Output "Setting package version to $packageVersion";
Write-Output "PACKAGE_VERSION=$packageVersion" | Out-File -FilePath $env:GITHUB_ENV -Encoding utf8 -Append;
@@ -525,12 +511,10 @@ jobs:
run: npm install
- name: Set up private auth key
env:
_APP_STORE_CONNECT_AUTH_KEY: ${{ steps.get-kv-secrets.outputs.APP-STORE-CONNECT-AUTH-KEY }}
run: |
mkdir ~/private_keys
cat << EOF > ~/private_keys/AuthKey_UFD296548T.p8
${_APP_STORE_CONNECT_AUTH_KEY}
${{ steps.get-kv-secrets.outputs.APP-STORE-CONNECT-AUTH-KEY }}
EOF
- name: Build application
@@ -542,28 +526,28 @@ jobs:
CSC_FOR_PULL_REQUEST: true
- name: Upload .zip artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-${{ env._PACKAGE_VERSION }}-mac.zip
path: ./dist/Bitwarden-Connector-${{ env._PACKAGE_VERSION }}-mac.zip
if-no-files-found: error
- name: Upload .dmg artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-${{ env._PACKAGE_VERSION }}.dmg
path: ./dist/Bitwarden-Connector-${{ env._PACKAGE_VERSION }}.dmg
if-no-files-found: error
- name: Upload .dmg Blockmap artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: Bitwarden-Connector-${{ env._PACKAGE_VERSION }}.dmg.blockmap
path: ./dist/Bitwarden-Connector-${{ env._PACKAGE_VERSION }}.dmg.blockmap
if-no-files-found: error
- name: Upload latest auto-update artifact
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # v6.0.0
uses: actions/upload-artifact@65c4c4a1ddee5b72f698fdd19549f0f0fb45cf08 # v4.6.0
with:
name: latest-mac.yml
path: ./dist/latest-mac.yml

View File

@@ -2,36 +2,25 @@ name: Integration Testing
on:
workflow_dispatch:
# Integration tests are slow, so only run them if relevant files have changed.
# This is done at the workflow level and at the job level.
# Make sure these triggers stay consistent with the 'changed-files' job.
push:
branches:
- 'main'
- 'rc'
- "main"
paths:
- ".github/workflows/integration-test.yml" # this file
- "docker-compose.yml" # any change to Docker configuration
- "package.json" # dependencies
- "utils/**" # any change to test fixtures
- "src/services/sync.service.ts" # core sync service used by all directory services
- "src/services/directory-services/ldap-directory.service*" # LDAP directory service
- "src/services/directory-services/gsuite-directory.service*" # Google Workspace directory service
# Add directory services here as we add test coverage
- "src/services/ldap-directory.service*" # we only have integration for LDAP testing at the moment
- "./openldap/**/*" # any change to test fixtures
- "./docker-compose.yml" # any change to Docker configuration
- "./package.json" # dependencies
pull_request:
paths:
- ".github/workflows/integration-test.yml" # this file
- "docker-compose.yml" # any change to Docker configuration
- "package.json" # dependencies
- "utils/**" # any change to test fixtures
- "src/services/sync.service.ts" # core sync service used by all directory services
- "src/services/directory-services/ldap-directory.service*" # LDAP directory service
- "src/services/directory-services/gsuite-directory.service*" # Google Workspace directory service
# Add directory services here as we add test coverage
- "src/services/ldap-directory.service*" # we only have integration for LDAP testing at the moment
- "./openldap/**/*" # any change to test fixtures
- "./docker-compose.yml" # any change to Docker configuration
- "./package.json" # dependencies
permissions:
contents: read
checks: write # required by dorny/test-reporter to upload its results
id-token: write # required to use OIDC to login to Azure Key Vault
jobs:
testing:
name: Run tests
@@ -40,19 +29,17 @@ jobs:
steps:
- name: Check out repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Node version
id: retrieve-node-version
run: |
NODE_NVMRC=$(cat .nvmrc)
NODE_VERSION=${NODE_NVMRC/v/''}
echo "node_version=$NODE_VERSION" >> "$GITHUB_OUTPUT"
echo "node_version=$NODE_VERSION" >> $GITHUB_OUTPUT
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -61,86 +48,28 @@ jobs:
- name: Install Node dependencies
run: npm ci
# Get secrets from Azure Key Vault
- name: Azure Login
uses: bitwarden/gh-actions/azure-login@main
with:
subscription_id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
tenant_id: ${{ secrets.AZURE_TENANT_ID }}
client_id: ${{ secrets.AZURE_CLIENT_ID }}
- name: Get KV Secrets
id: get-kv-secrets
uses: bitwarden/gh-actions/get-keyvault-secrets@main
with:
keyvault: gh-directory-connector
secrets: "GOOGLE-ADMIN-USER,GOOGLE-CLIENT-EMAIL,GOOGLE-DOMAIN,GOOGLE-PRIVATE-KEY"
- name: Azure Logout
uses: bitwarden/gh-actions/azure-logout@main
# Only run relevant tests depending on what files have changed.
# This should be kept consistent with the workflow level triggers.
# Note: docker-compose.yml is only used for ldap for now
- name: Get changed files
id: changed-files
uses: dorny/paths-filter@de90cc6fb38fc0963ad72b210f1f284cd68cea36 # v3.0.2
with:
list-files: shell
token: ${{ secrets.GITHUB_TOKEN }}
# Add directory services here as we add test coverage
filters: |
common:
- '.github/workflows/integration-test.yml'
- 'utils/**'
- 'package.json'
- 'src/services/sync.service.ts'
ldap:
- 'docker-compose.yml'
- 'src/services/directory-services/ldap-directory.service*'
google:
- 'src/services/directory-services/gsuite-directory.service*'
# LDAP
- name: Setup LDAP integration tests
if: steps.changed-files.outputs.common == 'true' || steps.changed-files.outputs.ldap == 'true'
- name: Install mkcert
run: |
sudo apt-get update
sudo apt-get -y install mkcert
npm run test:integration:setup
- name: Run LDAP integration tests
if: steps.changed-files.outputs.common == 'true' || steps.changed-files.outputs.ldap == 'true'
env:
JEST_JUNIT_UNIQUE_OUTPUT_NAME: "true" # avoids junit outputs from clashing
run: npx jest ldap-directory.service.integration.spec.ts --coverage --coverageDirectory=coverage-ldap
- name: Setup integration tests
run: npm run test:integration:setup
# Google Workspace
- name: Run Google Workspace integration tests
if: steps.changed-files.outputs.common == 'true' || steps.changed-files.outputs.google == 'true'
env:
GOOGLE_DOMAIN: ${{ steps.get-kv-secrets.outputs.GOOGLE-DOMAIN }}
GOOGLE_ADMIN_USER: ${{ steps.get-kv-secrets.outputs.GOOGLE-ADMIN-USER }}
GOOGLE_CLIENT_EMAIL: ${{ steps.get-kv-secrets.outputs.GOOGLE-CLIENT-EMAIL }}
GOOGLE_PRIVATE_KEY: ${{ steps.get-kv-secrets.outputs.GOOGLE-PRIVATE-KEY }}
JEST_JUNIT_UNIQUE_OUTPUT_NAME: "true" # avoids junit outputs from clashing
run: |
npx jest gsuite-directory.service.integration.spec.ts --coverage --coverageDirectory=coverage-google
- name: Run integration tests
run: npm run test:integration --coverage
- name: Report test results
id: report
uses: dorny/test-reporter@fe45e9537387dac839af0d33ba56eed8e24189e8 # v2.3.0
# This will skip the job if it's a pull request from a fork, because that won't have permission to upload test results.
# PRs from the repository and all other events are OK.
if: (github.event_name == 'push' || github.event_name == 'workflow_dispatch' || github.event.pull_request.head.repo.full_name == github.repository) && !cancelled()
uses: dorny/test-reporter@dc3a92680fcc15842eef52e8c4606ea7ce6bd3f3 # v2.1.1
if: ${{ github.event.pull_request.head.repo.full_name == github.repository && !cancelled() }}
with:
name: Test Results
path: "junit.xml*"
path: "junit.xml"
reporter: jest-junit
fail-on-error: true
- name: Upload coverage to codecov.io
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
uses: codecov/codecov-action@5a605bd92782ce0810fa3b8acc235c921b497052 # v5.2.0
- name: Upload results to codecov.io
uses: codecov/test-results-action@0fa95f0e1eeaafde2c782583b36b28ad0d8c77d3 # v1.2.1
uses: codecov/test-results-action@4e79e65778be1cecd5df25e14af1eafb6df80ea9 # v1.0.2

View File

@@ -26,9 +26,7 @@ jobs:
release_version: ${{ steps.version.outputs.version }}
steps:
- name: Checkout repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Branch check
if: ${{ inputs.release_type != 'Dry Run' }}
@@ -75,7 +73,7 @@ jobs:
- name: Create release
if: ${{ inputs.release_type != 'Dry Run' }}
uses: ncipollo/release-action@b7eabc95ff50cbeeedec83973935c8f306dfcd0b # v1.20.0
uses: ncipollo/release-action@cdcc88a9acf3ca41c16c37bb7d21b9ad48560d87 # v1.15.0
env:
PKG_VERSION: ${{ needs.setup.outputs.release_version }}
with:

View File

@@ -1,28 +0,0 @@
name: Respond
on:
issue_comment:
types: [created]
pull_request_review_comment:
types: [created]
issues:
types: [opened, assigned]
pull_request_review:
types: [submitted]
permissions: {}
jobs:
respond:
name: Respond
uses: bitwarden/gh-actions/.github/workflows/_respond.yml@main
secrets:
AZURE_SUBSCRIPTION_ID: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
permissions:
actions: read
contents: write
id-token: write
issues: write
pull-requests: write

View File

@@ -1,21 +0,0 @@
name: Code Review
on:
pull_request:
types: [opened, synchronize, reopened]
permissions: {}
jobs:
review:
name: Review
uses: bitwarden/gh-actions/.github/workflows/_review-code.yml@main
secrets:
AZURE_SUBSCRIPTION_ID: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
AZURE_TENANT_ID: ${{ secrets.AZURE_TENANT_ID }}
AZURE_CLIENT_ID: ${{ secrets.AZURE_CLIENT_ID }}
permissions:
actions: read
contents: read
id-token: write
pull-requests: write

View File

@@ -22,19 +22,17 @@ jobs:
steps:
- name: Check out repo
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
with:
persist-credentials: false
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Get Node version
id: retrieve-node-version
run: |
NODE_NVMRC=$(cat .nvmrc)
NODE_VERSION=${NODE_NVMRC/v/''}
echo "node_version=$NODE_VERSION" >> "$GITHUB_OUTPUT"
echo "node_version=$NODE_VERSION" >> $GITHUB_OUTPUT
- name: Set up Node
uses: actions/setup-node@395ad3262231945c25e8478fd5baf05154b1d79f # v6.1.0
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
with:
cache: 'npm'
cache-dependency-path: '**/package-lock.json'
@@ -53,10 +51,8 @@ jobs:
run: npm run test --coverage
- name: Report test results
uses: dorny/test-reporter@fe45e9537387dac839af0d33ba56eed8e24189e8 # v2.3.0
# This will skip the job if it's a pull request from a fork, because that won't have permission to upload test results.
# PRs from the repository and all other events are OK.
if: (github.event_name == 'push' || github.event_name == 'workflow_dispatch' || github.event.pull_request.head.repo.full_name == github.repository) && !cancelled()
uses: dorny/test-reporter@dc3a92680fcc15842eef52e8c4606ea7ce6bd3f3 # v2.1.1
if: ${{ github.event.pull_request.head.repo.full_name == github.repository && !cancelled() }}
with:
name: Test Results
path: "junit.xml"
@@ -64,7 +60,7 @@ jobs:
fail-on-error: true
- name: Upload coverage to codecov.io
uses: codecov/codecov-action@671740ac38dd9b0130fbe1cec585b89eea48d3de # v5.5.2
uses: codecov/codecov-action@5a605bd92782ce0810fa3b8acc235c921b497052 # v5.2.0
- name: Upload results to codecov.io
uses: codecov/test-results-action@0fa95f0e1eeaafde2c782583b36b28ad0d8c77d3 # v1.2.1
uses: codecov/test-results-action@4e79e65778be1cecd5df25e14af1eafb6df80ea9 # v1.0.2

View File

@@ -42,18 +42,16 @@ jobs:
uses: bitwarden/gh-actions/azure-logout@main
- name: Generate GH App token
uses: actions/create-github-app-token@29824e69f54612133e76f7eaac726eef6c875baf # v2.2.1
uses: actions/create-github-app-token@df432ceedc7162793a195dd1713ff69aefc7379e # v2.0.6
id: app-token
with:
app-id: ${{ steps.get-kv-secrets.outputs.BW-GHAPP-ID }}
private-key: ${{ steps.get-kv-secrets.outputs.BW-GHAPP-KEY }}
permission-contents: write
- name: Checkout Branch
uses: actions/checkout@8e8c483db84b4bee98b60c0593521ed34d9990e8 # v6.0.1
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
token: ${{ steps.app-token.outputs.token }}
persist-credentials: true
- name: Setup git
run: |
@@ -64,7 +62,7 @@ jobs:
id: current-version
run: |
CURRENT_VERSION=$(cat package.json | jq -r '.version')
echo "version=$CURRENT_VERSION" >> "$GITHUB_OUTPUT"
echo "version=$CURRENT_VERSION" >> $GITHUB_OUTPUT
- name: Verify input version
if: ${{ inputs.version_number_override != '' }}
@@ -79,7 +77,8 @@ jobs:
fi
# Check if version is newer.
if printf '%s\n' "${CURRENT_VERSION}" "${NEW_VERSION}" | sort -C -V; then
printf '%s\n' "${CURRENT_VERSION}" "${NEW_VERSION}" | sort -C -V
if [ $? -eq 0 ]; then
echo "Version check successful."
else
echo "Version check failed."
@@ -111,34 +110,26 @@ jobs:
- name: Set final version output
id: set-final-version-output
env:
_BUMP_VERSION_OVERRIDE_OUTCOME: ${{ steps.bump-version-override.outcome }}
_INPUT_VERSION_NUMBER_OVERRIDE: ${{ inputs.version_number_override }}
_BUMP_VERSION_AUTOMATIC_OUTCOME: ${{ steps.bump-version-automatic.outcome }}
_CALCULATE_NEXT_VERSION: ${{ steps.calculate-next-version.outputs.version }}
run: |
if [[ "$_BUMP_VERSION_OVERRIDE_OUTCOME" == "success" ]]; then
echo "version=$_INPUT_VERSION_NUMBER_OVERRIDE" >> "$GITHUB_OUTPUT"
elif [[ "$_BUMP_VERSION_AUTOMATIC_OUTCOME" == "success" ]]; then
echo "version=$_CALCULATE_NEXT_VERSION" >> "$GITHUB_OUTPUT"
if [[ "${{ steps.bump-version-override.outcome }}" == "success" ]]; then
echo "version=${{ inputs.version_number_override }}" >> $GITHUB_OUTPUT
elif [[ "${{ steps.bump-version-automatic.outcome }}" == "success" ]]; then
echo "version=${{ steps.calculate-next-version.outputs.version }}" >> $GITHUB_OUTPUT
fi
- name: Check if version changed
id: version-changed
run: |
if [ -n "$(git status --porcelain)" ]; then
echo "changes_to_commit=TRUE" >> "$GITHUB_OUTPUT"
echo "changes_to_commit=TRUE" >> $GITHUB_OUTPUT
else
echo "changes_to_commit=FALSE" >> "$GITHUB_OUTPUT"
echo "changes_to_commit=FALSE" >> $GITHUB_OUTPUT
echo "No changes to commit!";
fi
- name: Commit files
if: ${{ steps.version-changed.outputs.changes_to_commit == 'TRUE' }}
env:
_VERSION: ${{ steps.set-final-version-output.outputs.version }}
run: git commit -m "Bumped version to $_VERSION" -a
run: git commit -m "Bumped version to ${{ steps.set-final-version-output.outputs.version }}" -a
- name: Push changes
if: ${{ steps.version-changed.outputs.changes_to_commit == 'TRUE' }}

7
.gitignore vendored
View File

@@ -2,9 +2,6 @@
.DS_Store
Thumbs.db
# Environment variables used for tests
.env
# IDEs and editors
.idea/
.project
@@ -33,8 +30,8 @@ build-cli
.angular/cache
# Testing
coverage*
junit.xml*
coverage
junit.xml
# Misc
*.crx

View File

@@ -1,6 +1,6 @@
services:
open-ldap:
image: bitnamilegacy/openldap:latest
image: bitnami/openldap:latest
hostname: openldap
environment:
- LDAP_ADMIN_USERNAME=admin
@@ -11,8 +11,8 @@ services:
- LDAP_TLS_KEY_FILE=/certs/openldap-key.pem
- LDAP_TLS_CA_FILE=/certs/rootCA.pem
volumes:
- "./utils/openldap/ldifs:/ldifs"
- "./utils/openldap/certs:/certs"
- "./openldap/ldifs:/ldifs"
- "./openldap/certs:/certs"
ports:
- "1389:1389"
- "1636:1636"

View File

@@ -1,300 +0,0 @@
# Google Workspace Directory Integration
This document provides technical documentation for the Google Workspace (formerly G Suite) directory integration in Bitwarden Directory Connector.
## Overview
The Google Workspace integration synchronizes users and groups from Google Workspace to Bitwarden organizations using the Google Admin SDK Directory API. The service uses a service account with domain-wide delegation to authenticate and access directory data.
## Architecture
### Service Location
- **Implementation**: `src/services/directory-services/gsuite-directory.service.ts`
- **Configuration Model**: `src/models/gsuiteConfiguration.ts`
- **Integration Tests**: `src/services/directory-services/gsuite-directory.service.integration.spec.ts`
### Authentication Flow
The Google Workspace integration uses **OAuth 2.0 with Service Accounts** and domain-wide delegation:
1. A service account is created in Google Cloud Console
2. The service account is granted domain-wide delegation authority
3. The service account is authorized for specific OAuth scopes in Google Workspace Admin Console
4. The Directory Connector uses the service account's private key to generate JWT tokens
5. JWT tokens are exchanged for access tokens to call the Admin SDK APIs
### Required OAuth Scopes
The service account must be granted the following OAuth 2.0 scopes:
```
https://www.googleapis.com/auth/admin.directory.user.readonly
https://www.googleapis.com/auth/admin.directory.group.readonly
https://www.googleapis.com/auth/admin.directory.group.member.readonly
```
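Putting the flow and scopes together, a minimal sketch with the googleapis client might look like this (the exact wiring inside Directory Connector may differ; values shown are placeholders):
```typescript
import { google } from "googleapis";

async function listWorkspaceUsers() {
  // The JWT client performs the token generation/exchange described above.
  const auth = new google.auth.JWT({
    email: "directory-connector@my-project.iam.gserviceaccount.com", // clientEmail
    key: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n", // privateKey
    subject: "admin@example.com", // adminUser, impersonated via domain-wide delegation
    scopes: [
      "https://www.googleapis.com/auth/admin.directory.user.readonly",
      "https://www.googleapis.com/auth/admin.directory.group.readonly",
      "https://www.googleapis.com/auth/admin.directory.group.member.readonly",
    ],
  });

  const admin = google.admin({ version: "directory_v1", auth });
  const res = await admin.users.list({ domain: "example.com", maxResults: 500 });
  return res.data.users ?? [];
}
```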
## Configuration
### Required Fields
| Field | Description |
| ------------- | --------------------------------------------------------------------------------------- |
| `clientEmail` | Service account email address (e.g., `service-account@project.iam.gserviceaccount.com`) |
| `privateKey` | Service account private key in PEM format |
| `adminUser` | Admin user email to impersonate for domain-wide delegation |
| `domain` | Primary domain of the Google Workspace organization |
### Optional Fields
| Field | Description |
| ---------- | ---------------------------------------------------------- |
| `customer` | Customer ID for multi-domain organizations (rarely needed) |
### Example Configuration
```typescript
{
clientEmail: "directory-connector@my-project.iam.gserviceaccount.com",
privateKey: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
adminUser: "admin@example.com",
domain: "example.com",
customer: "" // Usually not required
}
```
## Setup Instructions
### 1. Create a Service Account
1. Go to [Google Cloud Console](https://console.cloud.google.com)
2. Create or select a project
3. Navigate to **IAM & Admin** > **Service Accounts**
4. Click **Create Service Account**
5. Enter a name and description
6. Click **Create and Continue**
7. Skip granting roles (not needed for this use case)
8. Click **Done**
### 2. Generate Service Account Key
1. Click on the newly created service account
2. Navigate to the **Keys** tab
3. Click **Add Key** > **Create new key**
4. Select **JSON** format
5. Click **Create** and download the key file
6. Extract `client_email` and `private_key` from the JSON file
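For example, the two values can be pulled straight from the downloaded key file (a hypothetical helper; the file name is a placeholder):
```typescript
import { readFileSync } from "fs";

// Hypothetical helper: the downloaded key file is JSON; client_email and
// private_key are the fields Directory Connector needs.
const key = JSON.parse(readFileSync("service-account-key.json", "utf8"));
const clientEmail: string = key.client_email;
const privateKey: string = key.private_key; // keep the \n line breaks intact
```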
### 3. Enable Domain-Wide Delegation
1. In the service account details, click **Show Advanced Settings**
2. Under **Domain-wide delegation**, click **Enable Google Workspace Domain-wide Delegation**
3. Note the **Client ID** (numeric ID)
### 4. Authorize the Service Account in Google Workspace
1. Go to [Google Workspace Admin Console](https://admin.google.com)
2. Navigate to **Security** > **API Controls** > **Domain-wide Delegation**
3. Click **Add new**
4. Enter the **Client ID** from step 3
5. Enter the following OAuth scopes (comma-separated):
```
https://www.googleapis.com/auth/admin.directory.user.readonly,
https://www.googleapis.com/auth/admin.directory.group.readonly,
https://www.googleapis.com/auth/admin.directory.group.member.readonly
```
6. Click **Authorize**
### 5. Configure Directory Connector
Use the extracted values to configure the Directory Connector:
- **Client Email**: From `client_email` in the JSON key file
- **Private Key**: From `private_key` in the JSON key file (keep the `\n` line breaks)
- **Admin User**: Email of a super admin user in your Google Workspace domain
- **Domain**: Your primary Google Workspace domain
## Sync Behavior
### User Synchronization
The service synchronizes the following user attributes:
| Google Workspace Field | Bitwarden Field | Notes |
| ------------------------- | --------------------------- | ----------------------------------------- |
| `id` | `referenceId`, `externalId` | User's unique Google ID |
| `primaryEmail` | `email` | Normalized to lowercase |
| `suspended` OR `archived` | `disabled` | User is disabled if suspended or archived |
| Deleted status | `deleted` | Set to true for deleted users |
**Special Behavior:**
- The service queries both **active users** and **deleted users** separately
- Suspended and archived users are included but marked as disabled
- Deleted users are included with the `deleted` flag set to true
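A hedged sketch of the mapping in the table above (the returned shape is approximate, not the project's actual `UserEntry` model):
```typescript
import { admin_directory_v1 } from "googleapis";

// Approximate mapping only; field names mirror the table above.
function toUserEntry(user: admin_directory_v1.Schema$User, deleted = false) {
  return {
    referenceId: user.id ?? undefined,
    externalId: user.id ?? undefined,
    email: user.primaryEmail?.toLowerCase(), // normalized to lowercase
    disabled: Boolean(user.suspended) || Boolean(user.archived),
    deleted,
  };
}
```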
### Group Synchronization
The service synchronizes the following group attributes:
| Google Workspace Field | Bitwarden Field | Notes |
| ----------------------- | --------------------------- | ------------------------ |
| `id` | `referenceId`, `externalId` | Group's unique Google ID |
| `name` | `name` | Group display name |
| Members (type=USER) | `userMemberExternalIds` | Individual user members |
| Members (type=GROUP) | `groupMemberReferenceIds` | Nested group members |
| Members (type=CUSTOMER) | `userMemberExternalIds` | All domain users |
**Member Types:**
- **USER**: Individual user accounts (only ACTIVE status users are synced)
- **GROUP**: Nested groups (allows group hierarchy)
- **CUSTOMER**: Special member type that includes all users in the domain
### Filtering
#### User Filter Examples
```
exclude:testuser1@bwrox.dev | testuser1@bwrox.dev # Exclude multiple users
|orgUnitPath='/Integration testing' # Users in Integration testing Organizational unit (OU)
exclude:testuser1@bwrox.dev | orgUnitPath='/Integration testing' # Combined filter: get users in OU excluding provided user
|email:testuser* # Users with email starting with "testuser"
```
#### Group Filter Examples
An important note for group filters: they implicitly sync only users that belong to at least one group. For example,
in the integration test data, `admin@bwrox.dev` is not a member of any group, so the example filter below also
implicitly excludes `admin@bwrox.dev`. This matters when the filter is paired with an empty user filter: the query may
read as "sync everyone not in Integration Test Group A," but in practice it means "only sync members of groups other
than Integration Test Group A."
```
exclude:Integration Test Group A # Get all users in groups excluding the provided group.
```
### User AND Group Filter Examples
```
```
**Filter Syntax:**
- Prefix with `|` for custom filters
- Use `:` for pattern matching (supports `*` wildcard)
- Combine multiple conditions with spaces (AND logic)
### Pagination
The service automatically handles pagination for all API calls:
- Users API (active and deleted)
- Groups API
- Group Members API
Each API call processes all pages using the `nextPageToken` mechanism until no more results are available.
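A minimal sketch of that loop for the Users API (assuming an authenticated `admin_directory_v1` client as in the earlier example; this is not the connector's actual implementation):
```typescript
import { admin_directory_v1 } from "googleapis";

async function listAllUsers(
  admin: admin_directory_v1.Admin,
  domain: string,
): Promise<admin_directory_v1.Schema$User[]> {
  const users: admin_directory_v1.Schema$User[] = [];
  let pageToken: string | undefined;
  do {
    // Each request returns one page plus an optional nextPageToken.
    const res = await admin.users.list({ domain, maxResults: 500, pageToken });
    users.push(...(res.data.users ?? []));
    pageToken = res.data.nextPageToken ?? undefined;
  } while (pageToken);
  return users;
}
```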
## Error Handling
### Common Errors
| Error | Cause | Resolution |
| ---------------------- | ------------------------------------- | ---------------------------------------------------------- |
| "dirConfigIncomplete" | Missing required configuration fields | Verify all required fields are provided |
| "authenticationFailed" | Invalid credentials or unauthorized | Check service account key and domain-wide delegation setup |
| API returns 401/403 | Missing OAuth scopes | Verify scopes are authorized in Admin Console |
| API returns 404 | Invalid domain or customer ID | Check domain configuration |
### Security Considerations
The service implements the following security measures:
1. **Credential sanitization**: Error messages do not expose private keys or sensitive credentials
2. **Secure authentication**: Uses OAuth 2.0 with JWT tokens, not API keys
3. **Read-only access**: Only requires read-only scopes for directory data
4. **No credential logging**: Service account credentials are not logged
## Testing
### Integration Tests
Integration tests are located in `src/services/directory-services/gsuite-directory.service.integration.spec.ts`.
**Test Coverage:**
- Basic sync (users and groups)
- Sync with filters
- Users-only sync
- Groups-only sync
- User filtering scenarios
- Group filtering scenarios
- Disabled users handling
- Group membership scenarios
- Error handling
**Running Integration Tests:**
Integration tests require live Google Workspace credentials:
1. Create a `.env` file in the `utils/` folder with:
```
GOOGLE_ADMIN_USER=admin@example.com
GOOGLE_CLIENT_EMAIL=service-account@project.iam.gserviceaccount.com
GOOGLE_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
GOOGLE_DOMAIN=example.com
```
2. Run tests:
```bash
# Run all integration tests (includes LDAP, Google Workspace, etc.)
npm run test:integration
# Run only Google Workspace integration tests
npx jest gsuite-directory.service.integration.spec.ts
```
**Test Data:**
The integration tests expect specific test data in Google Workspace:
- **Users**: 5 test users in organizational unit `/Integration testing`
- testuser1@bwrox.dev (in Group A)
- testuser2@bwrox.dev (in Groups A & B)
- testuser3@bwrox.dev (in Group B)
- testuser4@bwrox.dev (no groups)
- testuser5@bwrox.dev (disabled)
- **Groups**: 2 test groups with name pattern `Integration*`
- Integration Test Group A
- Integration Test Group B
## API Reference
### Google Admin SDK APIs Used
- **Users API**: `admin.users.list()`
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/users/list)
- **Groups API**: `admin.groups.list()`
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/groups/list)
- **Members API**: `admin.members.list()`
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/members/list)
### Rate Limits
Google Workspace Directory API rate limits:
- Default: 2,400 queries per minute per user, per Google Cloud Project
The service does not implement rate limiting logic; it relies on API error responses.
## Resources
- [Google Admin SDK Directory API Guide](https://developers.google.com/admin-sdk/directory/v1/guides)
- [Service Account Authentication](https://developers.google.com/identity/protocols/oauth2/service-account)
- [Domain-wide Delegation](https://support.google.com/a/answer/162106)
- [Google Workspace Admin Console](https://admin.google.com)
- [Bitwarden Directory Connector Documentation](https://bitwarden.com/help/directory-sync/)

View File

@@ -4,7 +4,7 @@
},
"productName": "Bitwarden Directory Connector",
"appId": "com.bitwarden.directory-connector",
"copyright": "Copyright © 2015-2026 Bitwarden Inc.",
"copyright": "Copyright © 2015-2022 Bitwarden Inc.",
"directories": {
"buildResources": "resources",
"output": "dist",

View File

@@ -1,149 +0,0 @@
// @ts-check
import eslint from "@eslint/js";
import tsParser from "@typescript-eslint/parser";
import tsPlugin from "@typescript-eslint/eslint-plugin";
import prettierConfig from "eslint-config-prettier";
import importPlugin from "eslint-plugin-import";
import rxjsX from "eslint-plugin-rxjs-x";
import rxjsAngularX from "eslint-plugin-rxjs-angular-x";
import angularEslint from "@angular-eslint/eslint-plugin-template";
import angularParser from "@angular-eslint/template-parser";
import globals from "globals";
export default [
// Global ignores (replaces .eslintignore)
{
ignores: [
"dist/**",
"dist-cli/**",
"build/**",
"build-cli/**",
"coverage/**",
"**/*.cjs",
"eslint.config.mjs",
"scripts/**/*.js",
"**/node_modules/**",
],
},
// Base config for all JavaScript/TypeScript files
{
files: ["**/*.ts", "**/*.js"],
languageOptions: {
ecmaVersion: 2020,
sourceType: "module",
parser: tsParser,
parserOptions: {
project: ["./tsconfig.eslint.json"],
},
globals: {
...globals.browser,
...globals.node,
},
},
plugins: {
"@typescript-eslint": tsPlugin,
import: importPlugin,
"rxjs-x": rxjsX,
"rxjs-angular-x": rxjsAngularX,
},
settings: {
"import/parsers": {
"@typescript-eslint/parser": [".ts"],
},
"import/resolver": {
typescript: {
alwaysTryTypes: true,
},
},
},
rules: {
// ESLint recommended rules
...eslint.configs.recommended.rules,
// TypeScript ESLint recommended rules
...tsPlugin.configs.recommended.rules,
// Import plugin recommended rules
...importPlugin.flatConfigs.recommended.rules,
// RxJS recommended rules
...rxjsX.configs.recommended.rules,
// Custom project rules
"@typescript-eslint/explicit-member-accessibility": ["error", { accessibility: "no-public" }],
"@typescript-eslint/no-explicit-any": "off", // TODO: This should be re-enabled
"@typescript-eslint/no-misused-promises": ["error", { checksVoidReturn: false }],
"@typescript-eslint/no-this-alias": ["error", { allowedNames: ["self"] }],
"@typescript-eslint/no-unused-vars": ["error", { args: "none" }],
"no-console": "error",
"import/no-unresolved": "off", // TODO: Look into turning on once each package is an actual package.
"import/order": [
"error",
{
alphabetize: {
order: "asc",
},
"newlines-between": "always",
pathGroups: [
{
pattern: "@/jslib/**/*",
group: "external",
position: "after",
},
{
pattern: "@/src/**/*",
group: "parent",
position: "before",
},
],
pathGroupsExcludedImportTypes: ["builtin"],
},
],
"rxjs-angular-x/prefer-takeuntil": "error",
"rxjs-x/no-exposed-subjects": ["error", { allowProtected: true }],
"no-restricted-syntax": [
"error",
{
message: "Calling `svgIcon` directly is not allowed",
selector: "CallExpression[callee.name='svgIcon']",
},
{
message: "Accessing FormGroup using `get` is not allowed, use `.value` instead",
selector:
"ChainExpression[expression.object.callee.property.name='get'][expression.property.name='value']",
},
],
curly: ["error", "all"],
"import/namespace": ["off"], // This doesn't resolve namespace imports correctly, but TS will throw for this anyway
"no-restricted-imports": ["error", { patterns: ["src/**/*"] }],
},
},
// Jest test files (includes any test-related files)
{
files: ["**/*.spec.ts", "**/test.setup.ts", "**/spec/**/*.ts", "**/utils/**/*fixtures*.ts"],
languageOptions: {
globals: {
...globals.jest,
},
},
},
// Angular HTML templates
{
files: ["**/*.html"],
languageOptions: {
parser: angularParser,
},
plugins: {
"@angular-eslint/template": angularEslint,
},
rules: {
"@angular-eslint/template/button-has-type": "error",
},
},
// Prettier config (must be last to override other configs)
prettierConfig,
];


@@ -26,6 +26,7 @@ module.exports = {
modulePaths: [compilerOptions.baseUrl],
moduleNameMapper: pathsToModuleNameMapper(compilerOptions.paths, { prefix: "<rootDir>/" }),
setupFilesAfterEnv: ["<rootDir>/test.setup.ts"],
// Workaround for a memory leak that crashes tests in CI:
// https://github.com/facebook/jest/issues/9430#issuecomment-1149882002
// Also anecdotally improves performance when run locally


@@ -1,4 +1,4 @@
import { InjectOptions, Injector, ProviderToken } from "@angular/core";
import { InjectFlags, InjectOptions, Injector, ProviderToken } from "@angular/core";
export class ModalInjector implements Injector {
constructor(
@@ -12,7 +12,8 @@ export class ModalInjector implements Injector {
options: InjectOptions & { optional?: false },
): T;
get<T>(token: ProviderToken<T>, notFoundValue: null, options: InjectOptions): T;
get<T>(token: ProviderToken<T>, notFoundValue?: T, options?: InjectOptions): T;
get<T>(token: ProviderToken<T>, notFoundValue?: T, options?: InjectOptions | InjectFlags): T;
get<T>(token: ProviderToken<T>, notFoundValue?: T, flags?: InjectFlags): T;
get(token: any, notFoundValue?: any): any;
get(token: any, notFoundValue?: any, flags?: any): any {
return this._additionalTokens.get(token) ?? this._parentInjector.get<any>(token, notFoundValue);


@@ -1,4 +1,5 @@
import { lastValueFrom, Observable, Subject } from "rxjs";
import { Observable, Subject } from "rxjs";
import { first } from "rxjs/operators";
export class ModalRef {
onCreated: Observable<HTMLElement>; // Modal added to the DOM.
@@ -44,6 +45,6 @@ export class ModalRef {
}
onClosedPromise(): Promise<any> {
return lastValueFrom(this.onClosed);
return this.onClosed.pipe(first()).toPromise();
}
}


@@ -1,5 +1,5 @@
import { Directive, ElementRef, Input, NgZone } from "@angular/core";
import { take } from "rxjs";
import { take } from "rxjs/operators";
import { Utils } from "@/jslib/common/src/misc/utils";


@@ -9,7 +9,7 @@ import {
Type,
ViewContainerRef,
} from "@angular/core";
import { first, firstValueFrom } from "rxjs";
import { first } from "rxjs/operators";
import { DynamicModalComponent } from "../components/modal/dynamic-modal.component";
import { ModalInjector } from "../components/modal/modal-injector";
@@ -58,7 +58,7 @@ export class ModalService {
viewContainerRef.insert(modalComponentRef.hostView);
await firstValueFrom(modalRef.onCreated);
await modalRef.onCreated.pipe(first()).toPromise();
return [modalRef, modalComponentRef.instance.componentRef.instance];
}


@@ -8,12 +8,15 @@ declare let console: any;
export function interceptConsole(interceptions: any): object {
console = {
log: function () {
// eslint-disable-next-line
interceptions.log = arguments;
},
warn: function () {
// eslint-disable-next-line
interceptions.warn = arguments;
},
error: function () {
// eslint-disable-next-line
interceptions.error = arguments;
},
};


@@ -33,5 +33,5 @@ export function makeStaticByteArray(length: number, start = 0) {
for (let i = 0; i < length; i++) {
arr[i] = start + i;
}
return arr.buffer;
return arr;
}


@@ -26,4 +26,9 @@ export class NodeUtils {
.on("error", (err) => reject(err));
});
}
// https://stackoverflow.com/a/31394257
static bufferToArrayBuffer(buf: Buffer): ArrayBuffer {
return buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength);
}
}


@@ -1,11 +1,9 @@
/* eslint-disable no-useless-escape */
import * as url from "url";
import { I18nService } from "../abstractions/i18n.service";
import * as tldjs from "tldjs";
const nodeURL = typeof window === "undefined" ? url : null;
const nodeURL = typeof window === "undefined" ? require("url") : null;
export class Utils {
static inited = false;
@@ -36,7 +34,7 @@ export class Utils {
Utils.global = Utils.isNode && !Utils.isBrowser ? global : window;
}
static fromB64ToArray(str: string): Uint8Array<ArrayBuffer> {
static fromB64ToArray(str: string): Uint8Array {
if (Utils.isNode) {
return new Uint8Array(Buffer.from(str, "base64"));
} else {
@@ -49,11 +47,11 @@ export class Utils {
}
}
static fromUrlB64ToArray(str: string): Uint8Array<ArrayBuffer> {
static fromUrlB64ToArray(str: string): Uint8Array {
return Utils.fromB64ToArray(Utils.fromUrlB64ToB64(str));
}
static fromHexToArray(str: string): Uint8Array<ArrayBuffer> {
static fromHexToArray(str: string): Uint8Array {
if (Utils.isNode) {
return new Uint8Array(Buffer.from(str, "hex"));
} else {
@@ -65,7 +63,7 @@ export class Utils {
}
}
static fromUtf8ToArray(str: string): Uint8Array<ArrayBuffer> {
static fromUtf8ToArray(str: string): Uint8Array {
if (Utils.isNode) {
return new Uint8Array(Buffer.from(str, "utf8"));
} else {
@@ -78,7 +76,7 @@ export class Utils {
}
}
static fromByteStringToArray(str: string): Uint8Array<ArrayBuffer> {
static fromByteStringToArray(str: string): Uint8Array {
const arr = new Uint8Array(str.length);
for (let i = 0; i < str.length; i++) {
arr[i] = str.charCodeAt(i);
@@ -99,8 +97,8 @@ export class Utils {
}
}
static fromBufferToUrlB64(buffer: Uint8Array<ArrayBuffer>): string {
return Utils.fromB64toUrlB64(Utils.fromBufferToB64(buffer.buffer));
static fromBufferToUrlB64(buffer: ArrayBuffer): string {
return Utils.fromB64toUrlB64(Utils.fromBufferToB64(buffer));
}
static fromB64toUrlB64(b64Str: string) {
@@ -249,7 +247,7 @@ export class Utils {
const urlDomain =
tldjs != null && tldjs.getDomain != null ? tldjs.getDomain(url.hostname) : null;
return urlDomain != null ? urlDomain : url.hostname;
} catch {
} catch (e) {
// Invalid domain, try another approach below.
}
}
@@ -397,7 +395,7 @@ export class Utils {
anchor.href = uriString;
return anchor as any;
}
} catch {
} catch (e) {
// Ignore error
}


@@ -53,7 +53,7 @@ export class EncString {
try {
this.encryptionType = parseInt(headerPieces[0], null);
encPieces = headerPieces[1].split("|");
} catch {
} catch (e) {
return;
}
} else {
@@ -114,7 +114,7 @@ export class EncString {
key = await cryptoService.getOrgKey(orgId);
}
this.decryptedValue = await cryptoService.decryptToUtf8(this, key);
} catch {
} catch (e) {
this.decryptedValue = "[error: cannot decrypt]";
}
return this.decryptedValue;


@@ -1,4 +1,5 @@
import { ClientType } from "../../../enums/clientType";
import { Utils } from "../../../misc/utils";
import { CaptchaProtectedRequest } from "../captchaProtectedRequest";
import { DeviceRequest } from "../deviceRequest";
@@ -29,4 +30,5 @@ export class PasswordTokenRequest extends TokenRequest implements CaptchaProtect
return obj;
}
}


@@ -12,6 +12,7 @@ export abstract class TokenRequest {
this.device = device != null ? device : null;
}
// eslint-disable-next-line
alterIdentityTokenHeaders(headers: Headers) {
// Implemented in subclass if required
}


@@ -335,11 +335,9 @@ export class CryptoService implements CryptoServiceAbstraction {
}
async clearStoredKey(keySuffix: KeySuffixOptions) {
if (keySuffix === KeySuffixOptions.Auto) {
await this.stateService.setCryptoMasterKeyAuto(null);
} else {
await this.stateService.setCryptoMasterKeyBiometric(null);
}
keySuffix === KeySuffixOptions.Auto
? await this.stateService.setCryptoMasterKeyAuto(null)
: await this.stateService.setCryptoMasterKeyBiometric(null);
}
async clearKeyHash(userId?: string): Promise<any> {
@@ -636,9 +634,9 @@ export class CryptoService implements CryptoServiceAbstraction {
const encBytes = new Uint8Array(encBuf);
const encType = encBytes[0];
let ctBytes: Uint8Array<ArrayBuffer> = null;
let ivBytes: Uint8Array<ArrayBuffer> = null;
let macBytes: Uint8Array<ArrayBuffer> = null;
let ctBytes: Uint8Array = null;
let ivBytes: Uint8Array = null;
let macBytes: Uint8Array = null;
switch (encType) {
case EncryptionType.AesCbc128_HmacSha256_B64:
@@ -719,7 +717,7 @@ export class CryptoService implements CryptoServiceAbstraction {
const privateKey = await this.decryptToBytes(new EncString(encPrivateKey), encKey);
await this.cryptoFunctionService.rsaExtractPublicKey(privateKey);
} catch {
} catch (e) {
return false;
}


@@ -38,7 +38,8 @@ const partialKeys = {
export class StateService<
TGlobalState extends GlobalState = GlobalState,
TAccount extends Account = Account,
> implements StateServiceAbstraction<TAccount> {
> implements StateServiceAbstraction<TAccount>
{
protected accountsSubject = new BehaviorSubject<{ [userId: string]: TAccount }>({});
accounts$ = this.accountsSubject.asObservable();


@@ -1,14 +1,6 @@
import * as path from "path";
import {
app,
BrowserWindow,
Menu,
MenuItemConstructorOptions,
NativeImage,
nativeImage,
Tray,
} from "electron";
import { app, BrowserWindow, Menu, MenuItemConstructorOptions, nativeImage, Tray } from "electron";
import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { StateService } from "@/jslib/common/src/abstractions/state.service";
@@ -20,8 +12,8 @@ export class TrayMain {
private appName: string;
private tray: Tray;
private icon: string | NativeImage;
private pressedIcon: NativeImage;
private icon: string | Electron.NativeImage;
private pressedIcon: Electron.NativeImage;
constructor(
private windowMain: WindowMain,


@@ -1,7 +1,7 @@
import * as path from "path";
import * as url from "url";
import { app, BrowserWindow, Rectangle, screen } from "electron";
import { app, BrowserWindow, screen } from "electron";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { StateService } from "@/jslib/common/src/abstractions/state.service";
@@ -14,7 +14,7 @@ export class WindowMain {
win: BrowserWindow;
isQuitting = false;
private windowStateChangeTimer: ReturnType<typeof setTimeout>;
private windowStateChangeTimer: NodeJS.Timeout;
private windowStates: { [key: string]: any } = {};
private enableAlwaysOnTop = false;
@@ -37,6 +37,7 @@ export class WindowMain {
app.quit();
return;
} else {
// eslint-disable-next-line
app.on("second-instance", (event, argv, workingDirectory) => {
// Someone tried to run a second instance, we should focus our window.
if (this.win != null) {
@@ -240,7 +241,7 @@ export class WindowMain {
const state = await this.stateService.getWindow();
const isValid = state != null && (this.stateHasBounds(state) || state.isMaximized);
let displayBounds: Rectangle = null;
let displayBounds: Electron.Rectangle = null;
if (!isValid) {
state.width = defaultWidth;
state.height = defaultHeight;


@@ -94,7 +94,7 @@ describe("NodeCrypto Function Service", () => {
it("should fail with prk too small", async () => {
const cryptoFunctionService = new NodeCryptoFunctionService();
const f = cryptoFunctionService.hkdfExpand(
Utils.fromB64ToArray(prk16Byte).buffer,
Utils.fromB64ToArray(prk16Byte),
"info",
32,
"sha256",
@@ -105,7 +105,7 @@ describe("NodeCrypto Function Service", () => {
it("should fail with outputByteSize is too large", async () => {
const cryptoFunctionService = new NodeCryptoFunctionService();
const f = cryptoFunctionService.hkdfExpand(
Utils.fromB64ToArray(prk32Byte).buffer,
Utils.fromB64ToArray(prk32Byte),
"info",
8161,
"sha256",
@@ -341,7 +341,7 @@ function testHkdf(
utf8Key: string,
unicodeKey: string,
) {
const ikm = Utils.fromB64ToArray("criAmKtfzxanbgea5/kelQ==").buffer;
const ikm = Utils.fromB64ToArray("criAmKtfzxanbgea5/kelQ==");
const regularSalt = "salt";
const utf8Salt = "üser_salt";
@@ -393,7 +393,7 @@ function testHkdfExpand(
it("should create valid " + algorithm + " " + outputByteSize + " byte okm", async () => {
const cryptoFunctionService = new NodeCryptoFunctionService();
const okm = await cryptoFunctionService.hkdfExpand(
Utils.fromB64ToArray(b64prk).buffer,
Utils.fromB64ToArray(b64prk),
info,
outputByteSize,
algorithm,


@@ -1,6 +1,6 @@
import { Jsonify } from "type-fest";
import { GroupEntry } from "@/src/models/groupEntry";
import { GroupEntry } from "../src/models/groupEntry";
// These must match the ldap server seed data in directory.ldif
const data: Jsonify<GroupEntry>[] = [
@@ -35,29 +35,6 @@ const data: Jsonify<GroupEntry>[] = [
externalId: "cn=Cleaners,ou=Janitorial,dc=bitwarden,dc=com",
name: "Cleaners",
},
{
userMemberExternalIds: [
"cn=Painterson Miki,ou=Product Development,dc=bitwarden,dc=com",
"cn=Virgina Pichocki,ou=Product Development,dc=bitwarden,dc=com",
"cn=Steffen Carsten,ou=Product Development,dc=bitwarden,dc=com",
],
groupMemberReferenceIds: [],
users: [],
referenceId: "cn=DevOps Team,dc=bitwarden,dc=com",
externalId: "cn=DevOps Team,dc=bitwarden,dc=com",
name: "DevOps Team",
},
{
userMemberExternalIds: [
"cn=Angus Merizzi,ou=Management,dc=bitwarden,dc=com",
"cn=Grissel Currer,ou=Management,dc=bitwarden,dc=com",
],
groupMemberReferenceIds: [],
users: [],
referenceId: "cn=Security Team,dc=bitwarden,dc=com",
externalId: "cn=Security Team,dc=bitwarden,dc=com",
name: "Security Team",
},
];
export const groupFixtures = data.map((g) => GroupEntry.fromJSON(g));


@@ -688,27 +688,4 @@ mobile: +1 804 319-5569
pager: +1 804 815-3661
roomNumber: 9273
manager: cn=Inga Schnirer,ou=Product Testing,dc=bitwarden, dc=com
secretary: cn=Keven Gilleland,ou=Administrative,dc=bitwarden, dc=com
# DevOps Team and Security Team identify their members by the member uid attribute,
# instead of the member Dn attribute.
# These test that group membership by uid works correctly.
dn: cn=DevOps Team,dc=bitwarden,dc=com
changetype: add
cn: DevOps Team
gidnumber: 800
memberuid: mikip
memberuid: pichockv
memberuid: carstens
objectclass: posixGroup
objectclass: top
dn: cn=Security Team,dc=bitwarden,dc=com
changetype: add
cn: Security Team
gidnumber: 900
memberuid: merizzia
memberuid: currerg
objectclass: posixGroup
objectclass: top
secretary: cn=Keven Gilleland,ou=Administrative,dc=bitwarden, dc=com

openldap/mkcert.sh (new executable file, 10 lines added)

@@ -0,0 +1,10 @@
if ! [ -x "$(command -v mkcert)" ]; then
echo 'Error: mkcert is not installed. Install mkcert first and then re-run this script.'
echo 'e.g. brew install mkcert'
exit 1
fi
mkcert -install
mkdir -p ./openldap/certs
cp "$(mkcert -CAROOT)/rootCA.pem" ./openldap/certs/rootCA.pem
mkcert -key-file ./openldap/certs/openldap-key.pem -cert-file ./openldap/certs/openldap.pem localhost openldap


@@ -1,6 +1,6 @@
import { Jsonify } from "type-fest";
import { UserEntry } from "@/src/models/userEntry";
import { UserEntry } from "../src/models/userEntry";
// These must match the ldap server seed data in directory.ldif
const data: Jsonify<UserEntry>[] = [

package-lock.json (generated, 11901 lines): file diff suppressed because it is too large


@@ -2,7 +2,7 @@
"name": "@bitwarden/directory-connector",
"productName": "Bitwarden Directory Connector",
"description": "Sync your user directory to your Bitwarden organization.",
"version": "2025.12.0",
"version": "2025.9.0",
"keywords": [
"bitwarden",
"password",
@@ -31,14 +31,14 @@
"lint": "eslint . && prettier --check .",
"lint:fix": "eslint . --fix",
"build": "concurrently -n Main,Rend -c yellow,cyan \"npm run build:main\" \"npm run build:renderer\"",
"build:main": "webpack --config webpack.main.cjs",
"build:renderer": "webpack --config webpack.renderer.cjs",
"build:renderer:watch": "webpack --config webpack.renderer.cjs --watch",
"build:main": "webpack --config webpack.main.js",
"build:renderer": "webpack --config webpack.renderer.js",
"build:renderer:watch": "webpack --config webpack.renderer.js --watch",
"build:dist": "npm run reset && npm run rebuild && npm run build",
"build:cli": "webpack --config webpack.cli.cjs",
"build:cli:watch": "webpack --config webpack.cli.cjs --watch",
"build:cli:prod": "cross-env NODE_ENV=production webpack --config webpack.cli.cjs",
"build:cli:prod:watch": "cross-env NODE_ENV=production webpack --config webpack.cli.cjs --watch",
"build:cli": "webpack --config webpack.cli.js",
"build:cli:watch": "webpack --config webpack.cli.js --watch",
"build:cli:prod": "cross-env NODE_ENV=production webpack --config webpack.cli.js",
"build:cli:prod:watch": "cross-env NODE_ENV=production webpack --config webpack.cli.js --watch",
"electron": "npm run build:main && concurrently -k -n Main,Rend -c yellow,cyan \"electron --inspect=5858 ./build --watch\" \"npm run build:renderer:watch\"",
"electron:ignore": "npm run build:main && concurrently -k -n Main,Rend -c yellow,cyan \"electron --inspect=5858 --ignore-certificate-errors ./build --watch\" \"npm run build:renderer:watch\"",
"clean:dist": "rimraf --glob ./dist/*",
@@ -69,30 +69,29 @@
"test:watch:all": "jest --watchAll --testPathIgnorePatterns=.integration.spec.ts",
"test:integration": "jest .integration.spec.ts",
"test:integration:watch": "jest .integration.spec.ts --watch",
"test:integration:setup": "sh ./utils/openldap/mkcert.sh && docker compose up -d",
"test:integration:setup": "sh ./openldap/mkcert.sh && docker compose up -d",
"test:types": "npx tsc --noEmit"
},
"devDependencies": {
"@angular-devkit/build-angular": "20.3.3",
"@angular-eslint/eslint-plugin-template": "20.7.0",
"@angular-eslint/template-parser": "20.7.0",
"@angular/compiler-cli": "20.3.15",
"@angular-devkit/build-angular": "19.2.15",
"@angular-eslint/eslint-plugin-template": "19.8.0",
"@angular-eslint/template-parser": "19.8.0",
"@angular/compiler-cli": "19.2.14",
"@electron/notarize": "2.5.0",
"@electron/rebuild": "4.0.1",
"@fluffy-spoon/substitute": "1.208.0",
"@microsoft/microsoft-graph-types": "2.43.1",
"@ngtools/webpack": "20.3.3",
"@microsoft/microsoft-graph-types": "2.40.0",
"@ngtools/webpack": "19.2.14",
"@types/inquirer": "8.2.10",
"@types/jest": "29.5.14",
"@types/lowdb": "1.0.15",
"@types/node": "22.19.2",
"@types/node": "22.18.1",
"@types/node-fetch": "2.6.12",
"@types/node-forge": "1.3.11",
"@types/proper-lockfile": "4.1.4",
"@types/semver": "7.7.1",
"@types/tldjs": "2.3.4",
"@typescript-eslint/eslint-plugin": "8.50.0",
"@typescript-eslint/parser": "8.50.0",
"@typescript-eslint/eslint-plugin": "8.43.0",
"@typescript-eslint/parser": "8.43.0",
"@yao-pkg/pkg": "5.16.1",
"clean-webpack-plugin": "4.0.0",
"concurrently": "9.2.0",
@@ -100,73 +99,75 @@
"cross-env": "7.0.3",
"css-loader": "7.1.2",
"dotenv": "17.2.0",
"electron": "39.2.1",
"electron": "38.1.0",
"electron-builder": "24.13.3",
"electron-log": "5.4.1",
"electron-reload": "2.0.0-alpha.1",
"electron-store": "8.2.0",
"electron-updater": "6.6.2",
"eslint": "9.39.1",
"eslint": "8.57.1",
"eslint-config-prettier": "10.1.5",
"eslint-import-resolver-typescript": "4.4.4",
"eslint-plugin-import": "2.32.0",
"eslint-plugin-rxjs-angular-x": "0.1.0",
"eslint-plugin-rxjs-x": "0.8.3",
"eslint-plugin-rxjs": "5.0.3",
"eslint-plugin-rxjs-angular": "2.0.1",
"form-data": "4.0.4",
"glob": "13.0.0",
"html-loader": "5.1.0",
"html-webpack-plugin": "5.6.3",
"husky": "9.1.7",
"jest": "29.7.0",
"jest-junit": "16.0.0",
"jest-mock-extended": "4.0.0",
"jest-mock-extended": "3.0.7",
"jest-preset-angular": "14.6.0",
"lint-staged": "16.2.6",
"lint-staged": "16.1.2",
"mini-css-extract-plugin": "2.9.2",
"minimatch": "5.1.2",
"node-forge": "1.3.2",
"node-abi": "3.77.0",
"node-forge": "1.3.1",
"node-loader": "2.1.0",
"prettier": "3.7.4",
"rimraf": "6.1.0",
"prettier": "3.6.2",
"rimraf": "6.0.1",
"rxjs": "7.8.2",
"sass": "1.97.1",
"sass": "1.92.1",
"sass-loader": "16.0.5",
"ts-jest": "29.4.1",
"ts-loader": "9.5.2",
"tsconfig-paths-webpack-plugin": "4.2.0",
"type-fest": "5.3.0",
"typescript": "5.9.3",
"webpack": "5.104.1",
"type-fest": "4.41.0",
"typescript": "5.8.3",
"webpack": "5.101.0",
"webpack-cli": "6.0.1",
"webpack-merge": "6.0.1",
"webpack-node-externals": "3.0.0",
"zone.js": "0.15.1"
},
"dependencies": {
"@angular/animations": "20.3.15",
"@angular/cdk": "20.2.14",
"@angular/cli": "20.3.3",
"@angular/common": "20.3.15",
"@angular/compiler": "20.3.15",
"@angular/core": "20.3.15",
"@angular/forms": "20.3.15",
"@angular/platform-browser": "20.3.15",
"@angular/platform-browser-dynamic": "20.3.15",
"@angular/router": "20.3.15",
"@angular/animations": "19.2.14",
"@angular/cdk": "19.2.14",
"@angular/cli": "19.2.14",
"@angular/common": "19.2.14",
"@angular/compiler": "19.2.14",
"@angular/core": "19.2.14",
"@angular/forms": "19.2.14",
"@angular/platform-browser": "19.2.14",
"@angular/platform-browser-dynamic": "19.2.14",
"@angular/router": "19.2.14",
"@microsoft/microsoft-graph-client": "3.0.7",
"big-integer": "1.6.52",
"bootstrap": "5.3.7",
"browser-hrtime": "1.1.8",
"chalk": "4.1.2",
"commander": "14.0.0",
"core-js": "3.44.0",
"form-data": "4.0.4",
"googleapis": "149.0.0",
"google-auth-library": "10.3.0",
"googleapis": "153.0.0",
"googleapis-common": "8.0.0",
"https-proxy-agent": "7.0.6",
"inquirer": "8.2.6",
"keytar": "7.9.0",
"ldapts": "8.0.1",
"lowdb": "1.0.0",
"ngx-toastr": "19.1.0",
"ngx-toastr": "19.0.0",
"node-fetch": "2.7.0",
"parse5": "8.0.0",
"proper-lockfile": "4.1.2",


@@ -1,5 +1,5 @@
import { DirectoryType } from "@/src/enums/directoryType";
import { IDirectoryService } from "@/src/services/directory-services/directory.service";
import { IDirectoryService } from "@/src/services/directory.service";
export abstract class DirectoryFactoryService {
abstract createService(type: DirectoryType): IDirectoryService;


@@ -23,7 +23,7 @@ import { EnvironmentComponent } from "./environment.component";
// The only subscription in this component is closed from a child component, confusing eslint.
// https://github.com/cartant/eslint-plugin-rxjs-angular/blob/main/docs/rules/prefer-takeuntil.md
//
// eslint-disable-next-line rxjs-angular-x/prefer-takeuntil
// eslint-disable-next-line rxjs-angular/prefer-takeuntil
export class ApiKeyComponent {
@ViewChild("environment", { read: ViewContainerRef, static: true })
environmentModal: ViewContainerRef;
@@ -100,7 +100,7 @@ export class ApiKeyComponent {
this.environmentModal,
);
// eslint-disable-next-line rxjs-angular-x/prefer-takeuntil
// eslint-disable-next-line rxjs-angular/prefer-takeuntil
childComponent.onSaved.pipe(takeUntil(modalRef.onClosed)).subscribe(() => {
modalRef.close();
});


@@ -1,3 +1,6 @@
// core-js is required for bwdc cli which appears to require these pollyfills for dynamic imports
// see https://github.com/bitwarden/directory-connector/issues/878
import "core-js/stable";
import "zone.js";
import { NgModule } from "@angular/core";


@@ -3,7 +3,8 @@ import { platformBrowserDynamic } from "@angular/platform-browser-dynamic";
import { isDev } from "@/jslib/electron/src/utils";
import "../scss/styles.scss";
// tslint:disable-next-line
require("../scss/styles.scss");
import { AppModule } from "./app.module";


@@ -768,8 +768,5 @@
},
"launchWebVault": {
"message": "Launch Web Vault"
},
"authenticationFailed": {
"message": "Authentication failed"
}
}


@@ -9,7 +9,7 @@ import { MenuMain } from "./menu.main";
const SyncCheckInterval = 60 * 1000; // 1 minute
export class MessagingMain {
private syncTimeout: ReturnType<typeof setTimeout>;
private syncTimeout: NodeJS.Timeout;
constructor(
private windowMain: WindowMain,


@@ -2,8 +2,8 @@ import { GetUniqueString } from "@/jslib/common/spec/utils";
import { UserEntry } from "@/src/models/userEntry";
import { groupSimulator, userSimulator } from "../../utils/request-builder-helper";
import { RequestBuilderOptions } from "../abstractions/request-builder.service";
import { groupSimulator, userSimulator } from "../utils/request-builder-helper";
import { BatchRequestBuilder } from "./batch-request-builder";


@@ -5,11 +5,11 @@ import { DirectoryFactoryService } from "../abstractions/directory-factory.servi
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { EntraIdDirectoryService } from "./directory-services/entra-id-directory.service";
import { GSuiteDirectoryService } from "./directory-services/gsuite-directory.service";
import { LdapDirectoryService } from "./directory-services/ldap-directory.service";
import { OktaDirectoryService } from "./directory-services/okta-directory.service";
import { OneLoginDirectoryService } from "./directory-services/onelogin-directory.service";
import { EntraIdDirectoryService } from "./entra-id-directory.service";
import { GSuiteDirectoryService } from "./gsuite-directory.service";
import { LdapDirectoryService } from "./ldap-directory.service";
import { OktaDirectoryService } from "./okta-directory.service";
import { OneLoginDirectoryService } from "./onelogin-directory.service";
export class DefaultDirectoryFactoryService implements DirectoryFactoryService {
constructor(


@@ -1,270 +0,0 @@
import { config as dotenvConfig } from "dotenv";
import { mock, MockProxy } from "jest-mock-extended";
import { I18nService } from "../../../jslib/common/src/abstractions/i18n.service";
import { LogService } from "../../../jslib/common/src/abstractions/log.service";
import {
getGSuiteConfiguration,
getSyncConfiguration,
} from "../../../utils/google-workspace/config-fixtures";
import { groupFixtures } from "../../../utils/google-workspace/group-fixtures";
import { userFixtures } from "../../../utils/google-workspace/user-fixtures";
import { DirectoryType } from "../../enums/directoryType";
import { StateService } from "../state.service";
import { GSuiteDirectoryService } from "./gsuite-directory.service";
// These tests integrate with a test Google Workspace instance.
// Credentials are located in the shared Bitwarden collection for Directory Connector testing.
// Place the .env file attachment in the utils folder.
// Load .env variables
dotenvConfig({ path: "utils/.env" });
// These filters target integration test data.
// These should return data that matches the user and group fixtures exactly.
// There may be additional data present if not used.
const INTEGRATION_USER_FILTER = "|orgUnitPath='/Integration testing'";
const INTEGRATION_GROUP_FILTER = "|name:Integration*";
// These tests are slow!
// Increase the default timeout from 5s to 15s
jest.setTimeout(15000);
describe("gsuiteDirectoryService", () => {
let logService: MockProxy<LogService>;
let i18nService: MockProxy<I18nService>;
let stateService: MockProxy<StateService>;
let directoryService: GSuiteDirectoryService;
beforeEach(() => {
logService = mock();
i18nService = mock();
stateService = mock();
stateService.getDirectoryType.mockResolvedValue(DirectoryType.GSuite);
stateService.getLastUserSync.mockResolvedValue(null); // do not filter results by last modified date
i18nService.t.mockImplementation((id) => id); // passthrough implementation for any error messages
directoryService = new GSuiteDirectoryService(logService, i18nService, stateService);
});
describe("basic sync fetching users and groups", () => {
it("syncs without using filters (includes test data)", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: true,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result[0]).toEqual(expect.arrayContaining(groupFixtures));
expect(result[1]).toEqual(expect.arrayContaining(userFixtures));
});
it("syncs using user and group filters (exact match for test data)", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: true,
userFilter: INTEGRATION_USER_FILTER,
groupFilter: INTEGRATION_GROUP_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result).toEqual([groupFixtures, userFixtures]);
});
it("syncs only users when groups sync is disabled", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: false,
users: true,
userFilter: INTEGRATION_USER_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result[0]).toBeUndefined();
expect(result[1]).toEqual(expect.arrayContaining(userFixtures));
});
it("syncs only groups when users sync is disabled", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: false,
groupFilter: INTEGRATION_GROUP_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result[0]).toEqual(expect.arrayContaining(groupFixtures));
expect(result[1]).toEqual([]);
});
});
describe("users", () => {
it("includes disabled users in sync results", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
users: true,
userFilter: INTEGRATION_USER_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
const disabledUser = userFixtures.find((u) => u.email === "testuser5@bwrox.dev");
expect(result[1]).toContainEqual(disabledUser);
expect(disabledUser.disabled).toBe(true);
});
it("filters users by org unit path", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
users: true,
userFilter: INTEGRATION_USER_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result[1]).toEqual(userFixtures);
expect(result[1].length).toBe(5);
});
it("filters users by email pattern", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
users: true,
userFilter: "|email:testuser1*",
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
const testuser1 = userFixtures.find((u) => u.email === "testuser1@bwrox.dev");
expect(result[1]).toContainEqual(testuser1);
expect(result[1].length).toBeGreaterThanOrEqual(1);
});
});
describe("groups", () => {
it("filters groups by name pattern", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: true,
userFilter: INTEGRATION_USER_FILTER,
groupFilter: INTEGRATION_GROUP_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
expect(result[0]).toEqual(groupFixtures);
expect(result[0].length).toBe(2);
});
it("includes group members correctly", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: true,
userFilter: INTEGRATION_USER_FILTER,
groupFilter: INTEGRATION_GROUP_FILTER,
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
const groupA = result[0].find((g) => g.name === "Integration Test Group A");
expect(groupA).toBeDefined();
expect(groupA.userMemberExternalIds.size).toBe(2);
expect(groupA.userMemberExternalIds.has("111605910541641314041")).toBe(true);
expect(groupA.userMemberExternalIds.has("111147009830456099026")).toBe(true);
const groupB = result[0].find((g) => g.name === "Integration Test Group B");
expect(groupB).toBeDefined();
expect(groupB.userMemberExternalIds.size).toBe(2);
expect(groupB.userMemberExternalIds.has("111147009830456099026")).toBe(true);
expect(groupB.userMemberExternalIds.has("100150970267699397306")).toBe(true);
});
it("handles groups with no members", async () => {
const directoryConfig = getGSuiteConfiguration();
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
groups: true,
users: true,
userFilter: INTEGRATION_USER_FILTER,
groupFilter: "|name:Integration*",
});
stateService.getSync.mockResolvedValue(syncConfig);
const result = await directoryService.getEntries(true, true);
// All test groups should have members, but ensure the code handles empty groups
expect(result[0]).toBeDefined();
expect(Array.isArray(result[0])).toBe(true);
});
});
describe("error handling", () => {
it("throws error when directory configuration is incomplete", async () => {
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(
getGSuiteConfiguration({
clientEmail: "",
}),
);
const syncConfig = getSyncConfiguration({
users: true,
});
stateService.getSync.mockResolvedValue(syncConfig);
await expect(directoryService.getEntries(true, true)).rejects.toThrow();
});
it("throws error when authentication fails with invalid credentials", async () => {
const directoryConfig = getGSuiteConfiguration({
privateKey: "-----BEGIN PRIVATE KEY-----\nINVALID_KEY\n-----END PRIVATE KEY-----\n",
});
stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
const syncConfig = getSyncConfiguration({
users: true,
});
stateService.getSync.mockResolvedValue(syncConfig);
await expect(directoryService.getEntries(true, true)).rejects.toThrow();
});
});
});


@@ -1,5 +1,5 @@
import { GroupEntry } from "../../models/groupEntry";
import { UserEntry } from "../../models/userEntry";
import { GroupEntry } from "../models/groupEntry";
import { UserEntry } from "../models/userEntry";
export interface IDirectoryService {
getEntries(force: boolean, test: boolean): Promise<[GroupEntry[], UserEntry[]]>;


@@ -7,14 +7,14 @@ import * as graphType from "@microsoft/microsoft-graph-types";
import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { StateService } from "../../abstractions/state.service";
import { DirectoryType } from "../../enums/directoryType";
import { EntraIdConfiguration } from "../../models/entraIdConfiguration";
import { GroupEntry } from "../../models/groupEntry";
import { SyncConfiguration } from "../../models/syncConfiguration";
import { UserEntry } from "../../models/userEntry";
import { BaseDirectoryService } from "../baseDirectory.service";
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { EntraIdConfiguration } from "../models/entraIdConfiguration";
import { GroupEntry } from "../models/groupEntry";
import { SyncConfiguration } from "../models/syncConfiguration";
import { UserEntry } from "../models/userEntry";
import { BaseDirectoryService } from "./baseDirectory.service";
import { IDirectoryService } from "./directory.service";
const EntraIdPublicIdentityAuthority = "login.microsoftonline.com";
@@ -132,7 +132,7 @@ export class EntraIdDirectoryService extends BaseDirectoryService implements IDi
}
const setFilter = this.createCustomUserSet(this.syncConfig.userFilter);
// eslint-disable-next-line
while (true) {
const users: graphType.User[] = res.value;
if (users != null) {
@@ -211,7 +211,7 @@ export class EntraIdDirectoryService extends BaseDirectoryService implements IDi
let auMembers = await this.client
.api(`${this.getGraphApiEndpoint()}/v1.0/directory/administrativeUnits/${p}/members`)
.get();
// eslint-disable-next-line
while (true) {
for (const auMember of auMembers.value) {
const groupId = auMember.id;
@@ -328,7 +328,7 @@ export class EntraIdDirectoryService extends BaseDirectoryService implements IDi
const entries: GroupEntry[] = [];
const groupsReq = this.client.api("/groups");
let res = await groupsReq.get();
// eslint-disable-next-line
while (true) {
const groups: graphType.Group[] = res.value;
if (groups != null) {
@@ -421,7 +421,7 @@ export class EntraIdDirectoryService extends BaseDirectoryService implements IDi
const memReq = this.client.api("/groups/" + group.id + "/members");
let memRes = await memReq.get();
// eslint-disable-next-line
while (true) {
const members: any = memRes.value;
if (members != null) {


@@ -4,32 +4,16 @@ import { admin_directory_v1, google } from "googleapis";
import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { StateService } from "../../abstractions/state.service";
import { DirectoryType } from "../../enums/directoryType";
import { GroupEntry } from "../../models/groupEntry";
import { GSuiteConfiguration } from "../../models/gsuiteConfiguration";
import { SyncConfiguration } from "../../models/syncConfiguration";
import { UserEntry } from "../../models/userEntry";
import { BaseDirectoryService } from "../baseDirectory.service";
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { GroupEntry } from "../models/groupEntry";
import { GSuiteConfiguration } from "../models/gsuiteConfiguration";
import { SyncConfiguration } from "../models/syncConfiguration";
import { UserEntry } from "../models/userEntry";
import { BaseDirectoryService } from "./baseDirectory.service";
import { IDirectoryService } from "./directory.service";
/**
* Google Workspace (formerly G Suite) Directory Service
*
* This service integrates with Google Workspace to synchronize users and groups
* to Bitwarden organizations using the Google Admin SDK Directory API.
*
* @remarks
* Authentication is performed using a service account with domain-wide delegation.
* The service account must be granted the following OAuth 2.0 scopes:
* - https://www.googleapis.com/auth/admin.directory.user.readonly
* - https://www.googleapis.com/auth/admin.directory.group.readonly
* - https://www.googleapis.com/auth/admin.directory.group.member.readonly
*
* @see {@link https://developers.google.com/admin-sdk/directory/v1/guides | Google Admin SDK Directory API}
* @see {@link https://support.google.com/a/answer/162106 | Domain-wide delegation of authority}
*/
export class GSuiteDirectoryService extends BaseDirectoryService implements IDirectoryService {
private client: JWT;
private service: admin_directory_v1.Admin;
@@ -46,29 +30,6 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
this.service = google.admin("directory_v1");
}
/**
* Retrieves users and groups from Google Workspace directory
* @returns A tuple containing [groups, users] arrays
*
* @remarks
* This function:
* 1. Validates the directory type matches GSuite
* 2. Loads directory and sync configuration
* 3. Authenticates with Google Workspace using service account credentials
* 4. Retrieves users (if enabled in sync config)
* 5. Retrieves groups and their members (if enabled in sync config)
* 6. Applies any user/group filters specified in sync configuration
*
* User and group filters follow Google Workspace Directory API query syntax:
* - Use `|` prefix for custom filters (e.g., "|orgUnitPath='/Engineering'")
* - Multiple conditions can be combined with AND/OR operators
*
* @example
* ```typescript
* const [groups, users] = await service.getEntries(true, false);
* console.log(`Synced ${users.length} users and ${groups.length} groups`);
* ```
*/
async getEntries(force: boolean, test: boolean): Promise<[GroupEntry[], UserEntry[]]> {
const type = await this.stateService.getDirectoryType();
if (type !== DirectoryType.GSuite) {
@@ -104,33 +65,13 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
return [groups, users];
}
/**
* Retrieves all users from Google Workspace directory
*
* @returns Array of UserEntry objects representing users in the directory
*
* @remarks
* This method performs two separate queries:
* 1. Active users (including suspended and archived)
* 2. Deleted users (marked with deleted flag)
*
* The method handles pagination automatically, fetching all pages of results.
* Users are filtered based on the userFilter specified in sync configuration.
*
* User properties mapped:
* - referenceId: User's unique Google ID
* - externalId: User's unique Google ID (same as referenceId)
* - email: User's primary email address (lowercase)
* - disabled: True if user is suspended or archived
* - deleted: True if user is deleted from the directory
*/
private async getUsers(): Promise<UserEntry[]> {
const entries: UserEntry[] = [];
const query = this.createDirectoryQuery(this.syncConfig.userFilter);
let nextPageToken: string = null;
const filter = this.createCustomSet(this.syncConfig.userFilter);
// eslint-disable-next-line
while (true) {
this.logService.info("Querying users - nextPageToken:" + nextPageToken);
const p = Object.assign({ query: query, pageToken: nextPageToken }, this.authParams);
@@ -158,7 +99,7 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
}
nextPageToken = null;
// eslint-disable-next-line
while (true) {
this.logService.info("Querying deleted users - nextPageToken:" + nextPageToken);
const p = Object.assign(
@@ -191,13 +132,6 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
return entries;
}
/**
* Transforms a Google Workspace user object into a UserEntry
*
* @param user - Google Workspace user object from the API
* @param deleted - Whether this user is from the deleted users list
* @returns UserEntry object or null if user data is invalid
*/
private buildUser(user: admin_directory_v1.Schema$User, deleted: boolean) {
if ((user.emails == null || user.emails === "") && !deleted) {
return null;
@@ -212,17 +146,6 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
return entry;
}
/**
* Retrieves all groups from Google Workspace directory
*
* @param setFilter - Tuple of [isWhitelist, Set<string>] for filtering groups
* @param users - Array of UserEntry objects to reference when processing members
* @returns Array of GroupEntry objects representing groups in the directory
*
* @remarks
* For each group, the method also retrieves all group members by calling the
* members API. Groups are filtered based on the groupFilter in sync configuration.
*/
private async getGroups(
setFilter: [boolean, Set<string>],
users: UserEntry[],
@@ -231,6 +154,7 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
const query = this.createDirectoryQuery(this.syncConfig.groupFilter);
let nextPageToken: string = null;
// eslint-disable-next-line
while (true) {
this.logService.info("Querying groups - nextPageToken:" + nextPageToken);
let p = null;
@@ -262,19 +186,6 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
return entries;
}
/**
* Transforms a Google Workspace group object into a GroupEntry with members
*
* @param group - Google Workspace group object from the API
* @param users - Array of UserEntry objects for reference
* @returns GroupEntry object with all members populated
*
* @remarks
* This method retrieves all members of the group, handling three member types:
* - USER: Individual user members (only active status users are included)
* - GROUP: Nested group members
* - CUSTOMER: Special type that includes all users in the domain
*/
private async buildGroup(group: admin_directory_v1.Schema$Group, users: UserEntry[]) {
let nextPageToken: string = null;
@@ -283,6 +194,7 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
entry.externalId = group.id;
entry.name = group.name;
// eslint-disable-next-line
while (true) {
const p = Object.assign({ groupKey: group.id, pageToken: nextPageToken }, this.authParams);
const memRes = await this.service.members.list(p);
@@ -320,26 +232,6 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
return entry;
}
/**
* Authenticates with Google Workspace using service account credentials
*
* @throws Error if required configuration fields are missing or authentication fails
*
* @remarks
* Authentication uses a JWT with the following required fields:
* - clientEmail: Service account email address
* - privateKey: Service account private key (PEM format)
* - subject: Admin user email to impersonate (for domain-wide delegation)
*
* The service account must be configured with domain-wide delegation and granted
* the required OAuth scopes in the Google Workspace Admin Console.
*
* Optional configuration:
* - domain: Filters results to a specific domain
* - customer: Customer ID for multi-domain organizations
*
* @see {@link https://developers.google.com/identity/protocols/oauth2/service-account | Service account authentication}
*/
private async auth() {
if (
this.dirConfig.clientEmail == null ||
@@ -361,15 +253,7 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
],
});
try {
await this.client.authorize();
} catch (error) {
// Catch and rethrow this to sanitize any sensitive info (e.g. private key) in the error message
this.logService.error(
`Google Workspace authentication failed: ${error?.name || "Unknown error"}`,
);
throw new Error(this.i18nService.t("authenticationFailed"));
}
await this.client.authorize();
this.authParams = {
auth: this.client,


@@ -1,17 +1,14 @@
import { mock, MockProxy } from "jest-mock-extended";
import { I18nService } from "../../../jslib/common/src/abstractions/i18n.service";
import { LogService } from "../../../jslib/common/src/abstractions/log.service";
import {
getLdapConfiguration,
getSyncConfiguration,
} from "../../../utils/openldap/config-fixtures";
import { groupFixtures } from "../../../utils/openldap/group-fixtures";
import { userFixtures } from "../../../utils/openldap/user-fixtures";
import { DirectoryType } from "../../enums/directoryType";
import { StateService } from "../state.service";
import { I18nService } from "../../jslib/common/src/abstractions/i18n.service";
import { LogService } from "../../jslib/common/src/abstractions/log.service";
import { groupFixtures } from "../../openldap/group-fixtures";
import { userFixtures } from "../../openldap/user-fixtures";
import { DirectoryType } from "../enums/directoryType";
import { getLdapConfiguration, getSyncConfiguration } from "../utils/test-fixtures";
import { LdapDirectoryService } from "./ldap-directory.service";
import { StateService } from "./state.service";
// These tests integrate with the OpenLDAP docker image and seed data located in the openldap folder.
// To run theses tests:
@@ -55,7 +52,7 @@ describe("ldapDirectoryService", () => {
getLdapConfiguration({
ssl: true,
startTls: true,
tlsCaPath: "./utils/openldap/certs/rootCA.pem",
tlsCaPath: "./openldap/certs/rootCA.pem",
}),
);
stateService.getSync.mockResolvedValue(getSyncConfiguration({ groups: true, users: true }));
@@ -70,7 +67,7 @@ describe("ldapDirectoryService", () => {
getLdapConfiguration({
port: 1636,
ssl: true,
sslCaPath: "./utils/openldap/certs/rootCA.pem",
sslCaPath: "./openldap/certs/rootCA.pem",
}),
);
stateService.getSync.mockResolvedValue(getSyncConfiguration({ groups: true, users: true }));


@@ -7,12 +7,12 @@ import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { Utils } from "@/jslib/common/src/misc/utils";
import { StateService } from "../../abstractions/state.service";
import { DirectoryType } from "../../enums/directoryType";
import { GroupEntry } from "../../models/groupEntry";
import { LdapConfiguration } from "../../models/ldapConfiguration";
import { SyncConfiguration } from "../../models/syncConfiguration";
import { UserEntry } from "../../models/userEntry";
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { GroupEntry } from "../models/groupEntry";
import { LdapConfiguration } from "../models/ldapConfiguration";
import { SyncConfiguration } from "../models/syncConfiguration";
import { UserEntry } from "../models/userEntry";
import { IDirectoryService } from "./directory.service";
@@ -118,7 +118,7 @@ export class LdapDirectoryService implements IDirectoryService {
[delControl],
);
return regularUsers.concat(deletedUsers);
} catch {
} catch (e) {
this.logService.warning("Cannot query deleted users.");
return regularUsers;
}
@@ -192,21 +192,14 @@ export class LdapDirectoryService implements IDirectoryService {
this.syncConfig.userFilter,
);
const userPath = this.makeSearchPath(this.syncConfig.userPath);
const userDnMap = new Map<string, string>();
const userUidMap = new Map<string, string>();
const userIdMap = new Map<string, string>();
await this.search<string>(userPath, userFilter, (se: any) => {
const dn = this.getReferenceId(se);
const uid = this.getAttr<string>(se, "uid");
const externalId = this.getExternalId(se, dn);
userDnMap.set(dn, externalId);
if (uid != null) {
userUidMap.set(uid.toLowerCase(), externalId);
}
userIdMap.set(this.getReferenceId(se), this.getExternalId(se, this.getReferenceId(se)));
return se;
});
for (const se of groupSearchEntries) {
const group = this.buildGroup(se, userDnMap, userUidMap);
const group = this.buildGroup(se, userIdMap);
if (group != null) {
entries.push(group);
}
@@ -215,20 +208,7 @@ export class LdapDirectoryService implements IDirectoryService {
return entries;
}
/**
* Builds a GroupEntry from LDAP search results, including membership.
* Supports user membership by DN or UID and nested group membership by DN.
*
* @param searchEntry - The LDAP search entry containing group data
* @param userDnMap - Map of user DNs to their external IDs
* @param userUidMap - Map of user UIDs to their external IDs
* @returns A populated GroupEntry object, or null if the group lacks required properties
*/
private buildGroup(
searchEntry: any,
userDnMap: Map<string, string>,
userUidMap: Map<string, string>,
) {
private buildGroup(searchEntry: any, userMap: Map<string, string>) {
const group = new GroupEntry();
group.referenceId = this.getReferenceId(searchEntry);
if (group.referenceId == null) {
@@ -248,34 +228,11 @@ export class LdapDirectoryService implements IDirectoryService {
const members = this.getAttrVals<string>(searchEntry, this.syncConfig.memberAttribute);
if (members != null) {
// Parses a group member attribute and identifies it as a member DN, member Uid, or a group Dn
const getMemberAttributeType = (member: string): "memberDn" | "memberUid" | "groupDn" => {
const isDnLike = member.includes("=") && member.includes(",");
if (isDnLike) {
return userDnMap.has(member) ? "memberDn" : "groupDn";
}
return "memberUid";
};
for (const member of members) {
switch (getMemberAttributeType(member)) {
case "memberDn": {
const externalId = userDnMap.get(member);
if (externalId != null) {
group.userMemberExternalIds.add(externalId);
}
break;
}
case "memberUid": {
const externalId = userUidMap.get(member.toLowerCase());
if (externalId != null) {
group.userMemberExternalIds.add(externalId);
}
break;
}
case "groupDn":
group.groupMemberReferenceIds.add(member);
break;
for (const memDn of members) {
if (userMap.has(memDn) && !group.userMemberExternalIds.has(userMap.get(memDn))) {
group.userMemberExternalIds.add(userMap.get(memDn));
} else if (!group.groupMemberReferenceIds.has(memDn)) {
group.groupMemberReferenceIds.add(memDn);
}
}
}


@@ -3,14 +3,14 @@ import * as https from "https";
import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { StateService } from "../../abstractions/state.service";
import { DirectoryType } from "../../enums/directoryType";
import { GroupEntry } from "../../models/groupEntry";
import { OktaConfiguration } from "../../models/oktaConfiguration";
import { SyncConfiguration } from "../../models/syncConfiguration";
import { UserEntry } from "../../models/userEntry";
import { BaseDirectoryService } from "../baseDirectory.service";
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { GroupEntry } from "../models/groupEntry";
import { OktaConfiguration } from "../models/oktaConfiguration";
import { SyncConfiguration } from "../models/syncConfiguration";
import { UserEntry } from "../models/userEntry";
import { BaseDirectoryService } from "./baseDirectory.service";
import { IDirectoryService } from "./directory.service";
const DelayBetweenBuildGroupCallsInMilliseconds = 500;


@@ -1,14 +1,14 @@
import { I18nService } from "@/jslib/common/src/abstractions/i18n.service";
import { LogService } from "@/jslib/common/src/abstractions/log.service";
import { StateService } from "../../abstractions/state.service";
import { DirectoryType } from "../../enums/directoryType";
import { GroupEntry } from "../../models/groupEntry";
import { OneLoginConfiguration } from "../../models/oneLoginConfiguration";
import { SyncConfiguration } from "../../models/syncConfiguration";
import { UserEntry } from "../../models/userEntry";
import { BaseDirectoryService } from "../baseDirectory.service";
import { StateService } from "../abstractions/state.service";
import { DirectoryType } from "../enums/directoryType";
import { GroupEntry } from "../models/groupEntry";
import { OneLoginConfiguration } from "../models/oneLoginConfiguration";
import { SyncConfiguration } from "../models/syncConfiguration";
import { UserEntry } from "../models/userEntry";
import { BaseDirectoryService } from "./baseDirectory.service";
import { IDirectoryService } from "./directory.service";
// Basic email validation: something@something.something


@@ -2,8 +2,8 @@ import { GetUniqueString } from "@/jslib/common/spec/utils";
import { UserEntry } from "@/src/models/userEntry";
import { groupSimulator, userSimulator } from "../../utils/request-builder-helper";
import { RequestBuilderOptions } from "../abstractions/request-builder.service";
import { groupSimulator, userSimulator } from "../utils/request-builder-helper";
import { SingleRequestBuilder } from "./single-request-builder";


@@ -7,20 +7,19 @@ import { EnvironmentService } from "@/jslib/common/src/services/environment.serv
import { I18nService } from "../../jslib/common/src/abstractions/i18n.service";
import { LogService } from "../../jslib/common/src/abstractions/log.service";
import { getLdapConfiguration, getSyncConfiguration } from "../../utils/openldap/config-fixtures";
import { groupFixtures } from "../../openldap/group-fixtures";
import { userFixtures } from "../../openldap/user-fixtures";
import { DirectoryFactoryService } from "../abstractions/directory-factory.service";
import { DirectoryType } from "../enums/directoryType";
import { getLdapConfiguration, getSyncConfiguration } from "../utils/test-fixtures";
import { BatchRequestBuilder } from "./batch-request-builder";
import { LdapDirectoryService } from "./directory-services/ldap-directory.service";
import { LdapDirectoryService } from "./ldap-directory.service";
import { SingleRequestBuilder } from "./single-request-builder";
import { StateService } from "./state.service";
import { SyncService } from "./sync.service";
import * as constants from "./sync.service";
import { groupFixtures } from "@/utils/openldap/group-fixtures";
import { userFixtures } from "@/utils/openldap/user-fixtures";
describe("SyncService", () => {
let logService: MockProxy<LogService>;
let i18nService: MockProxy<I18nService>;
@@ -116,7 +115,6 @@ describe("SyncService", () => {
stateService.getLastSyncHash.mockResolvedValue("unique hash");
// @ts-expect-error This is a workaround to make the batch size smaller to trigger the batching logic, since it's a const.
// eslint-disable-next-line no-import-assign
constants.batchSize = 4;
const syncResult = await syncService.sync(false, false);
@@ -125,13 +123,9 @@ describe("SyncService", () => {
expect(apiService.postPublicImportDirectory).toHaveBeenCalledWith(
expect.objectContaining({ overwriteExisting: false }),
);
// The expected number of calls may change if more data is added to the ldif
// Make sure it equals (number of users / 4) + (number of groups / 4)
expect(apiService.postPublicImportDirectory).toHaveBeenCalledTimes(7);
expect(apiService.postPublicImportDirectory).toHaveBeenCalledTimes(6);
// @ts-expect-error Reset batch size to original state.
// eslint-disable-next-line no-import-assign
constants.batchSize = originalBatchSize;
});
});
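The comment in the hunk above pins the expected call count to (number of users / 4) + (number of groups / 4) once the batch size is forced to 4. A small worked sketch of that arithmetic; the user and group counts below are placeholders, not the actual ldif fixture sizes:

```
const batchSize = 4; // forced in the test via the @ts-expect-error workaround
const expectedCalls = (userCount: number, groupCount: number) =>
  Math.ceil(userCount / batchSize) + Math.ceil(groupCount / batchSize);

// e.g. 20 users and 8 groups -> 5 + 2 = 7 batched import requests
expectedCalls(20, 8); // 7
```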


@@ -6,20 +6,20 @@ import { MessagingService } from "@/jslib/common/src/abstractions/messaging.serv
import { OrganizationImportRequest } from "@/jslib/common/src/models/request/organizationImportRequest";
import { ApiService } from "@/jslib/common/src/services/api.service";
import { getSyncConfiguration } from "../../utils/openldap/config-fixtures";
import { DirectoryFactoryService } from "../abstractions/directory-factory.service";
import { DirectoryType } from "../enums/directoryType";
import { getSyncConfiguration } from "../utils/test-fixtures";
import { BatchRequestBuilder } from "./batch-request-builder";
import { LdapDirectoryService } from "./directory-services/ldap-directory.service";
import { I18nService } from "./i18n.service";
import { LdapDirectoryService } from "./ldap-directory.service";
import { SingleRequestBuilder } from "./single-request-builder";
import { StateService } from "./state.service";
import { SyncService } from "./sync.service";
import * as constants from "./sync.service";
import { groupFixtures } from "@/utils/openldap/group-fixtures";
import { userFixtures } from "@/utils/openldap/user-fixtures";
import { groupFixtures } from "@/openldap/group-fixtures";
import { userFixtures } from "@/openldap/user-fixtures";
describe("SyncService", () => {
let cryptoFunctionService: MockProxy<CryptoFunctionService>;
@@ -97,7 +97,6 @@ describe("SyncService", () => {
stateService.getLastSyncHash.mockResolvedValue("unique hash");
// @ts-expect-error This is a workaround to make the batch size smaller to trigger the batching logic, since it's a const.
// eslint-disable-next-line no-import-assign
constants.batchSize = 4;
const mockRequests = new Array(6).fill({
@@ -120,7 +119,6 @@ describe("SyncService", () => {
expect(apiService.postPublicImportDirectory).toHaveBeenCalledWith(mockRequests[5]);
// @ts-expect-error Reset batch size back to original value.
// eslint-disable-next-line no-import-assign
constants.batchSize = originalBatchSize;
});


@@ -1,7 +1,7 @@
import { GetUniqueString } from "@/jslib/common/spec/utils";
import { GroupEntry } from "../src/models/groupEntry";
import { UserEntry } from "../src/models/userEntry";
import { GroupEntry } from "../models/groupEntry";
import { UserEntry } from "../models/userEntry";
export function userSimulator(userCount: number): UserEntry[] {
const users: UserEntry[] = [];


@@ -1,5 +1,5 @@
import { LdapConfiguration } from "../../src/models/ldapConfiguration";
import { SyncConfiguration } from "../../src/models/syncConfiguration";
import { LdapConfiguration } from "../models/ldapConfiguration";
import { SyncConfiguration } from "../models/syncConfiguration";
/**
* @returns a basic ldap configuration without TLS/SSL enabled. Can be overridden by passing in a partial configuration.


@@ -1,4 +0,0 @@
GOOGLE_DOMAIN=
GOOGLE_ADMIN_USER=
GOOGLE_CLIENT_EMAIL=
GOOGLE_PRIVATE_KEY=


@@ -1,56 +0,0 @@
import { GSuiteConfiguration } from "../../src/models/gsuiteConfiguration";
import { SyncConfiguration } from "../../src/models/syncConfiguration";
/**
* @returns a basic GSuite configuration. Can be overridden by passing in a partial configuration.
*/
export const getGSuiteConfiguration = (
config?: Partial<GSuiteConfiguration>,
): GSuiteConfiguration => {
const adminUser = process.env.GOOGLE_ADMIN_USER;
const clientEmail = process.env.GOOGLE_CLIENT_EMAIL;
const privateKey = process.env.GOOGLE_PRIVATE_KEY;
const domain = process.env.GOOGLE_DOMAIN;
if (!adminUser || !clientEmail || !privateKey || !domain) {
throw new Error("Google Workspace integration test credentials not configured.");
}
return {
// TODO
adminUser,
clientEmail,
privateKey,
domain: domain,
customer: "",
...(config ?? {}),
};
};
/**
* @returns a basic Google Workspace sync configuration. Can be overridden by passing in a partial configuration.
*/
export const getSyncConfiguration = (config?: Partial<SyncConfiguration>): SyncConfiguration => ({
users: false,
groups: false,
interval: 5,
userFilter: "",
groupFilter: "",
removeDisabled: false,
overwriteExisting: false,
largeImport: false,
// Ldap properties - not optional for some reason
groupObjectClass: "",
userObjectClass: "",
groupPath: null,
userPath: null,
groupNameAttribute: "",
userEmailAttribute: "",
memberAttribute: "",
useEmailPrefixSuffix: false,
emailPrefixAttribute: "",
emailSuffix: null,
creationDateAttribute: "",
revisionDateAttribute: "",
...(config ?? {}),
});
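Both fixture helpers above accept an optional partial and spread it over the defaults, so a spec only overrides the fields it cares about. A hedged usage sketch; the import path and override values are illustrative, not taken from the spec files:

```
// Path is assumed for illustration; these are the helpers defined in the hunk above.
import { getGSuiteConfiguration, getSyncConfiguration } from "../../utils/gsuite/config-fixtures";

// Credentials still come from the environment; only sync behaviour is overridden here.
const gsuiteConfig = getGSuiteConfiguration({ customer: "my_customer" });
const syncConfig = getSyncConfiguration({ users: true, groups: true, removeDisabled: true });
```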


@@ -1,26 +0,0 @@
import { Jsonify } from "type-fest";
import { GroupEntry } from "../../src/models/groupEntry";
// These must match the Google Workspace seed data
const data: Jsonify<GroupEntry>[] = [
{
externalId: "0319y80a3anpxhj",
groupMemberReferenceIds: [],
name: "Integration Test Group A",
referenceId: "0319y80a3anpxhj",
userMemberExternalIds: ["111605910541641314041", "111147009830456099026"],
users: [],
},
{
externalId: "02afmg28317uyub",
groupMemberReferenceIds: [],
name: "Integration Test Group B",
referenceId: "02afmg28317uyub",
userMemberExternalIds: ["111147009830456099026", "100150970267699397306"],
users: [],
},
];
export const groupFixtures = data.map((g) => GroupEntry.fromJSON(g));


@@ -1,50 +0,0 @@
import { Jsonify } from "type-fest";
import { UserEntry } from "../../src/models/userEntry";
// These must match the Google Workspace seed data
const data: Jsonify<UserEntry>[] = [
// In Group A
{
deleted: false,
disabled: false,
email: "testuser1@bwrox.dev",
externalId: "111605910541641314041",
referenceId: "111605910541641314041",
},
// In Groups A + B
{
deleted: false,
disabled: false,
email: "testuser2@bwrox.dev",
externalId: "111147009830456099026",
referenceId: "111147009830456099026",
},
// In Group B
{
deleted: false,
disabled: false,
email: "testuser3@bwrox.dev",
externalId: "100150970267699397306",
referenceId: "100150970267699397306",
},
// Not in a group
{
deleted: false,
disabled: false,
email: "testuser4@bwrox.dev",
externalId: "113764752650306721470",
referenceId: "113764752650306721470",
},
// Disabled user
{
deleted: false,
disabled: true,
email: "testuser5@bwrox.dev",
externalId: "110381976819725658200",
referenceId: "110381976819725658200",
},
];
export const userFixtures = data.map((g) => UserEntry.fromJSON(g));
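The group fixtures earlier reference these users by `externalId` (testuser2 is seeded into both groups), and exactly one of the five users is disabled, which gives specs a cheap sanity check. An illustrative snippet, not taken from the spec files:

```
// 4 of the 5 seeded users remain once the disabled account is excluded.
const activeUsers = userFixtures.filter((u) => !u.disabled);
console.log(activeUsers.length); // 4
```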


@@ -1,10 +0,0 @@
if ! [ -x "$(command -v mkcert)" ]; then
echo 'Error: mkcert is not installed. Install mkcert first and then re-run this script.'
echo 'e.g. brew install mkcert'
exit 1
fi
mkcert -install
mkdir -p ./utils/openldap/certs
cp "$(mkcert -CAROOT)/rootCA.pem" ./utils/openldap/certs/rootCA.pem
mkcert -key-file ./utils/openldap/certs/openldap-key.pem -cert-file ./utils/openldap/certs/openldap.pem localhost openldap