mirror of
https://github.com/bitwarden/directory-connector
synced 2026-01-10 20:43:52 +00:00
Compare commits
3 Commits
jared/type
...
gsuite-doc
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
5761a391f7 | ||
|
|
8cd2850e8d | ||
|
|
21ce02f431 |
300
docs/google-workspace.md
Normal file
300
docs/google-workspace.md
Normal file
@@ -0,0 +1,300 @@
|
||||
# Google Workspace Directory Integration
|
||||
|
||||
This document provides technical documentation for the Google Workspace (formerly G Suite) directory integration in Bitwarden Directory Connector.
|
||||
|
||||
## Overview
|
||||
|
||||
The Google Workspace integration synchronizes users and groups from Google Workspace to Bitwarden organizations using the Google Admin SDK Directory API. The service uses a service account with domain-wide delegation to authenticate and access directory data.
|
||||
|
||||
## Architecture
|
||||
|
||||
### Service Location
|
||||
|
||||
- **Implementation**: `src/services/directory-services/gsuite-directory.service.ts`
|
||||
- **Configuration Model**: `src/models/gsuiteConfiguration.ts`
|
||||
- **Integration Tests**: `src/services/directory-services/gsuite-directory.service.integration.spec.ts`
|
||||
|
||||
### Authentication Flow
|
||||
|
||||
The Google Workspace integration uses **OAuth 2.0 with Service Accounts** and domain-wide delegation:
|
||||
|
||||
1. A service account is created in Google Cloud Console
|
||||
2. The service account is granted domain-wide delegation authority
|
||||
3. The service account is authorized for specific OAuth scopes in Google Workspace Admin Console
|
||||
4. The Directory Connector uses the service account's private key to generate JWT tokens
|
||||
5. JWT tokens are exchanged for access tokens to call the Admin SDK APIs
|
||||
|
||||
### Required OAuth Scopes
|
||||
|
||||
The service account must be granted the following OAuth 2.0 scopes:
|
||||
|
||||
```
|
||||
https://www.googleapis.com/auth/admin.directory.user.readonly
|
||||
https://www.googleapis.com/auth/admin.directory.group.readonly
|
||||
https://www.googleapis.com/auth/admin.directory.group.member.readonly
|
||||
```
|
||||
|
||||
## Configuration
|
||||
|
||||
### Required Fields
|
||||
|
||||
| Field | Description |
|
||||
| ------------- | --------------------------------------------------------------------------------------- |
|
||||
| `clientEmail` | Service account email address (e.g., `service-account@project.iam.gserviceaccount.com`) |
|
||||
| `privateKey` | Service account private key in PEM format |
|
||||
| `adminUser` | Admin user email to impersonate for domain-wide delegation |
|
||||
| `domain` | Primary domain of the Google Workspace organization |
|
||||
|
||||
### Optional Fields
|
||||
|
||||
| Field | Description |
|
||||
| ---------- | ---------------------------------------------------------- |
|
||||
| `customer` | Customer ID for multi-domain organizations (rarely needed) |
|
||||
|
||||
### Example Configuration
|
||||
|
||||
```typescript
|
||||
{
|
||||
clientEmail: "directory-connector@my-project.iam.gserviceaccount.com",
|
||||
privateKey: "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
|
||||
adminUser: "admin@example.com",
|
||||
domain: "example.com",
|
||||
customer: "" // Usually not required
|
||||
}
|
||||
```
|
||||
|
||||
## Setup Instructions
|
||||
|
||||
### 1. Create a Service Account
|
||||
|
||||
1. Go to [Google Cloud Console](https://console.cloud.google.com)
|
||||
2. Create or select a project
|
||||
3. Navigate to **IAM & Admin** > **Service Accounts**
|
||||
4. Click **Create Service Account**
|
||||
5. Enter a name and description
|
||||
6. Click **Create and Continue**
|
||||
7. Skip granting roles (not needed for this use case)
|
||||
8. Click **Done**
|
||||
|
||||
### 2. Generate Service Account Key
|
||||
|
||||
1. Click on the newly created service account
|
||||
2. Navigate to the **Keys** tab
|
||||
3. Click **Add Key** > **Create new key**
|
||||
4. Select **JSON** format
|
||||
5. Click **Create** and download the key file
|
||||
6. Extract `client_email` and `private_key` from the JSON file
|
||||
|
||||
### 3. Enable Domain-Wide Delegation
|
||||
|
||||
1. In the service account details, click **Show Advanced Settings**
|
||||
2. Under **Domain-wide delegation**, click **Enable Google Workspace Domain-wide Delegation**
|
||||
3. Note the **Client ID** (numeric ID)
|
||||
|
||||
### 4. Authorize the Service Account in Google Workspace
|
||||
|
||||
1. Go to [Google Workspace Admin Console](https://admin.google.com)
|
||||
2. Navigate to **Security** > **API Controls** > **Domain-wide Delegation**
|
||||
3. Click **Add new**
|
||||
4. Enter the **Client ID** from step 3
|
||||
5. Enter the following OAuth scopes (comma-separated):
|
||||
```
|
||||
https://www.googleapis.com/auth/admin.directory.user.readonly,
|
||||
https://www.googleapis.com/auth/admin.directory.group.readonly,
|
||||
https://www.googleapis.com/auth/admin.directory.group.member.readonly
|
||||
```
|
||||
6. Click **Authorize**
|
||||
|
||||
### 5. Configure Directory Connector
|
||||
|
||||
Use the extracted values to configure the Directory Connector:
|
||||
|
||||
- **Client Email**: From `client_email` in the JSON key file
|
||||
- **Private Key**: From `private_key` in the JSON key file (keep the `\n` line breaks)
|
||||
- **Admin User**: Email of a super admin user in your Google Workspace domain
|
||||
- **Domain**: Your primary Google Workspace domain
|
||||
|
||||
## Sync Behavior
|
||||
|
||||
### User Synchronization
|
||||
|
||||
The service synchronizes the following user attributes:
|
||||
|
||||
| Google Workspace Field | Bitwarden Field | Notes |
|
||||
| ------------------------- | --------------------------- | ----------------------------------------- |
|
||||
| `id` | `referenceId`, `externalId` | User's unique Google ID |
|
||||
| `primaryEmail` | `email` | Normalized to lowercase |
|
||||
| `suspended` OR `archived` | `disabled` | User is disabled if suspended or archived |
|
||||
| Deleted status | `deleted` | Set to true for deleted users |
|
||||
|
||||
**Special Behavior:**
|
||||
|
||||
- The service queries both **active users** and **deleted users** separately
|
||||
- Suspended and archived users are included but marked as disabled
|
||||
- Deleted users are included with the `deleted` flag set to true
|
||||
|
||||
### Group Synchronization
|
||||
|
||||
The service synchronizes the following group attributes:
|
||||
|
||||
| Google Workspace Field | Bitwarden Field | Notes |
|
||||
| ----------------------- | --------------------------- | ------------------------ |
|
||||
| `id` | `referenceId`, `externalId` | Group's unique Google ID |
|
||||
| `name` | `name` | Group display name |
|
||||
| Members (type=USER) | `userMemberExternalIds` | Individual user members |
|
||||
| Members (type=GROUP) | `groupMemberReferenceIds` | Nested group members |
|
||||
| Members (type=CUSTOMER) | `userMemberExternalIds` | All domain users |
|
||||
|
||||
**Member Types:**
|
||||
|
||||
- **USER**: Individual user accounts (only ACTIVE status users are synced)
|
||||
- **GROUP**: Nested groups (allows group hierarchy)
|
||||
- **CUSTOMER**: Special member type that includes all users in the domain
|
||||
|
||||
### Filtering
|
||||
|
||||
#### User Filter Examples
|
||||
|
||||
```
|
||||
exclude:testuser1@bwrox.dev | testuser1@bwrox.dev # Exclude multiple users
|
||||
|orgUnitPath='/Integration testing' # Users in Integration testing Organizational unit (OU)
|
||||
exclude:testuser1@bwrox.dev | orgUnitPath='/Integration testing' # Combined filter: get users in OU excluding provided user
|
||||
|email:testuser* # Users with email starting with "testuser"
|
||||
```
|
||||
|
||||
#### Group Filter Examples
|
||||
|
||||
An important note for group filters is that it implicitly only syncs users that are in groups. For example, in the case of
|
||||
the integration test data, `admin@bwrox.dev` is not a member of any group. Therefore, the first example filter below will
|
||||
also implicitly exclude `admin@bwrox.dev`, who is not in any group. This is important because when it is paired with an
|
||||
empty user filter, this query may semantically be understood as "sync everyone not in Integration Test Group A," while in
|
||||
practice it means "Only sync members of groups not in integration Test Groups A."
|
||||
|
||||
```
|
||||
exclude:Integration Test Group A # Get all users in groups excluding the provided group.
|
||||
```
|
||||
|
||||
### User AND Group Filter Examples
|
||||
|
||||
```
|
||||
|
||||
```
|
||||
|
||||
**Filter Syntax:**
|
||||
|
||||
- Prefix with `|` for custom filters
|
||||
- Use `:` for pattern matching (supports `*` wildcard)
|
||||
- Combine multiple conditions with spaces (AND logic)
|
||||
|
||||
### Pagination
|
||||
|
||||
The service automatically handles pagination for all API calls:
|
||||
|
||||
- Users API (active and deleted)
|
||||
- Groups API
|
||||
- Group Members API
|
||||
|
||||
Each API call processes all pages using the `nextPageToken` mechanism until no more results are available.
|
||||
|
||||
## Error Handling
|
||||
|
||||
### Common Errors
|
||||
|
||||
| Error | Cause | Resolution |
|
||||
| ---------------------- | ------------------------------------- | ---------------------------------------------------------- |
|
||||
| "dirConfigIncomplete" | Missing required configuration fields | Verify all required fields are provided |
|
||||
| "authenticationFailed" | Invalid credentials or unauthorized | Check service account key and domain-wide delegation setup |
|
||||
| API returns 401/403 | Missing OAuth scopes | Verify scopes are authorized in Admin Console |
|
||||
| API returns 404 | Invalid domain or customer ID | Check domain configuration |
|
||||
|
||||
### Security Considerations
|
||||
|
||||
The service implements the following security measures:
|
||||
|
||||
1. **Credential sanitization**: Error messages do not expose private keys or sensitive credentials
|
||||
2. **Secure authentication**: Uses OAuth 2.0 with JWT tokens, not API keys
|
||||
3. **Read-only access**: Only requires read-only scopes for directory data
|
||||
4. **No credential logging**: Service account credentials are not logged
|
||||
|
||||
## Testing
|
||||
|
||||
### Integration Tests
|
||||
|
||||
Integration tests are located in `src/services/directory-services/gsuite-directory.service.integration.spec.ts`.
|
||||
|
||||
**Test Coverage:**
|
||||
|
||||
- Basic sync (users and groups)
|
||||
- Sync with filters
|
||||
- Users-only sync
|
||||
- Groups-only sync
|
||||
- User filtering scenarios
|
||||
- Group filtering scenarios
|
||||
- Disabled users handling
|
||||
- Group membership scenarios
|
||||
- Error handling
|
||||
|
||||
**Running Integration Tests:**
|
||||
|
||||
Integration tests require live Google Workspace credentials:
|
||||
|
||||
1. Create a `.env` file in the `utils/` folder with:
|
||||
```
|
||||
GOOGLE_ADMIN_USER=admin@example.com
|
||||
GOOGLE_CLIENT_EMAIL=service-account@project.iam.gserviceaccount.com
|
||||
GOOGLE_PRIVATE_KEY="-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n"
|
||||
GOOGLE_DOMAIN=example.com
|
||||
```
|
||||
2. Run tests:
|
||||
|
||||
```bash
|
||||
# Run all integration tests (includes LDAP, Google Workspace, etc.)
|
||||
npm run test:integration
|
||||
|
||||
# Run only Google Workspace integration tests
|
||||
npx jest gsuite-directory.service.integration.spec.ts
|
||||
```
|
||||
|
||||
**Test Data:**
|
||||
|
||||
The integration tests expect specific test data in Google Workspace:
|
||||
|
||||
- **Users**: 5 test users in organizational unit `/Integration testing`
|
||||
- testuser1@bwrox.dev (in Group A)
|
||||
- testuser2@bwrox.dev (in Groups A & B)
|
||||
- testuser3@bwrox.dev (in Group B)
|
||||
- testuser4@bwrox.dev (no groups)
|
||||
- testuser5@bwrox.dev (disabled)
|
||||
|
||||
- **Groups**: 2 test groups with name pattern `Integration*`
|
||||
- Integration Test Group A
|
||||
- Integration Test Group B
|
||||
|
||||
## API Reference
|
||||
|
||||
### Google Admin SDK APIs Used
|
||||
|
||||
- **Users API**: `admin.users.list()`
|
||||
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/users/list)
|
||||
|
||||
- **Groups API**: `admin.groups.list()`
|
||||
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/groups/list)
|
||||
|
||||
- **Members API**: `admin.members.list()`
|
||||
- [Documentation](https://developers.google.com/admin-sdk/directory/reference/rest/v1/members/list)
|
||||
|
||||
### Rate Limits
|
||||
|
||||
Google Workspace Directory API rate limits:
|
||||
|
||||
- Default: 2,400 queries per minute per user, per Google Cloud Project
|
||||
|
||||
The service does not implement rate limiting logic; it relies on API error responses.
|
||||
|
||||
## Resources
|
||||
|
||||
- [Google Admin SDK Directory API Guide](https://developers.google.com/admin-sdk/directory/v1/guides)
|
||||
- [Service Account Authentication](https://developers.google.com/identity/protocols/oauth2/service-account)
|
||||
- [Domain-wide Delegation](https://support.google.com/a/answer/162106)
|
||||
- [Google Workspace Admin Console](https://admin.google.com)
|
||||
- [Bitwarden Directory Connector Documentation](https://bitwarden.com/help/directory-sync/)
|
||||
@@ -15,13 +15,13 @@ describe("SymmetricCryptoKey", () => {
|
||||
describe("guesses encKey from key length", () => {
|
||||
it("AesCbc256_B64", () => {
|
||||
const key = makeStaticByteArray(32);
|
||||
const cryptoKey = new SymmetricCryptoKey(key.buffer as ArrayBuffer);
|
||||
const cryptoKey = new SymmetricCryptoKey(key);
|
||||
|
||||
expect(cryptoKey).toEqual({
|
||||
encKey: key.buffer,
|
||||
encKey: key,
|
||||
encKeyB64: "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8=",
|
||||
encType: 0,
|
||||
key: key.buffer,
|
||||
key: key,
|
||||
keyB64: "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8=",
|
||||
macKey: null,
|
||||
});
|
||||
@@ -29,38 +29,38 @@ describe("SymmetricCryptoKey", () => {
|
||||
|
||||
it("AesCbc128_HmacSha256_B64", () => {
|
||||
const key = makeStaticByteArray(32);
|
||||
const cryptoKey = new SymmetricCryptoKey(key.buffer as ArrayBuffer, EncryptionType.AesCbc128_HmacSha256_B64);
|
||||
const cryptoKey = new SymmetricCryptoKey(key, EncryptionType.AesCbc128_HmacSha256_B64);
|
||||
|
||||
expect(cryptoKey).toEqual({
|
||||
encKey: key.buffer.slice(0, 16),
|
||||
encKey: key.slice(0, 16),
|
||||
encKeyB64: "AAECAwQFBgcICQoLDA0ODw==",
|
||||
encType: 1,
|
||||
key: key.buffer,
|
||||
key: key,
|
||||
keyB64: "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8=",
|
||||
macKey: key.buffer.slice(16, 32),
|
||||
macKey: key.slice(16, 32),
|
||||
macKeyB64: "EBESExQVFhcYGRobHB0eHw==",
|
||||
});
|
||||
});
|
||||
|
||||
it("AesCbc256_HmacSha256_B64", () => {
|
||||
const key = makeStaticByteArray(64);
|
||||
const cryptoKey = new SymmetricCryptoKey(key.buffer as ArrayBuffer);
|
||||
const cryptoKey = new SymmetricCryptoKey(key);
|
||||
|
||||
expect(cryptoKey).toEqual({
|
||||
encKey: key.buffer.slice(0, 32),
|
||||
encKey: key.slice(0, 32),
|
||||
encKeyB64: "AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8=",
|
||||
encType: 2,
|
||||
key: key.buffer,
|
||||
key: key,
|
||||
keyB64:
|
||||
"AAECAwQFBgcICQoLDA0ODxAREhMUFRYXGBkaGxwdHh8gISIjJCUmJygpKissLS4vMDEyMzQ1Njc4OTo7PD0+Pw==",
|
||||
macKey: key.buffer.slice(32, 64),
|
||||
macKey: key.slice(32, 64),
|
||||
macKeyB64: "ICEiIyQlJicoKSorLC0uLzAxMjM0NTY3ODk6Ozw9Pj8=",
|
||||
});
|
||||
});
|
||||
|
||||
it("unknown length", () => {
|
||||
const t = () => {
|
||||
new SymmetricCryptoKey(makeStaticByteArray(30).buffer as ArrayBuffer);
|
||||
new SymmetricCryptoKey(makeStaticByteArray(30));
|
||||
};
|
||||
|
||||
expect(t).toThrowError("Unable to determine encType.");
|
||||
|
||||
@@ -33,5 +33,5 @@ export function makeStaticByteArray(length: number, start = 0) {
|
||||
for (let i = 0; i < length; i++) {
|
||||
arr[i] = start + i;
|
||||
}
|
||||
return arr;
|
||||
return arr.buffer;
|
||||
}
|
||||
|
||||
@@ -26,9 +26,4 @@ export class NodeUtils {
|
||||
.on("error", (err) => reject(err));
|
||||
});
|
||||
}
|
||||
|
||||
// https://stackoverflow.com/a/31394257
|
||||
static bufferToArrayBuffer(buf: Buffer): ArrayBuffer {
|
||||
return buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength) as ArrayBuffer;
|
||||
}
|
||||
}
|
||||
|
||||
@@ -36,7 +36,7 @@ export class Utils {
|
||||
Utils.global = Utils.isNode && !Utils.isBrowser ? global : window;
|
||||
}
|
||||
|
||||
static fromB64ToArray(str: string): Uint8Array {
|
||||
static fromB64ToArray(str: string): Uint8Array<ArrayBuffer> {
|
||||
if (Utils.isNode) {
|
||||
return new Uint8Array(Buffer.from(str, "base64"));
|
||||
} else {
|
||||
@@ -49,11 +49,11 @@ export class Utils {
|
||||
}
|
||||
}
|
||||
|
||||
static fromUrlB64ToArray(str: string): Uint8Array {
|
||||
static fromUrlB64ToArray(str: string): Uint8Array<ArrayBuffer> {
|
||||
return Utils.fromB64ToArray(Utils.fromUrlB64ToB64(str));
|
||||
}
|
||||
|
||||
static fromHexToArray(str: string): Uint8Array {
|
||||
static fromHexToArray(str: string): Uint8Array<ArrayBuffer> {
|
||||
if (Utils.isNode) {
|
||||
return new Uint8Array(Buffer.from(str, "hex"));
|
||||
} else {
|
||||
@@ -65,7 +65,7 @@ export class Utils {
|
||||
}
|
||||
}
|
||||
|
||||
static fromUtf8ToArray(str: string): Uint8Array {
|
||||
static fromUtf8ToArray(str: string): Uint8Array<ArrayBuffer> {
|
||||
if (Utils.isNode) {
|
||||
return new Uint8Array(Buffer.from(str, "utf8"));
|
||||
} else {
|
||||
@@ -78,7 +78,7 @@ export class Utils {
|
||||
}
|
||||
}
|
||||
|
||||
static fromByteStringToArray(str: string): Uint8Array {
|
||||
static fromByteStringToArray(str: string): Uint8Array<ArrayBuffer> {
|
||||
const arr = new Uint8Array(str.length);
|
||||
for (let i = 0; i < str.length; i++) {
|
||||
arr[i] = str.charCodeAt(i);
|
||||
@@ -99,8 +99,8 @@ export class Utils {
|
||||
}
|
||||
}
|
||||
|
||||
static fromBufferToUrlB64(buffer: ArrayBuffer): string {
|
||||
return Utils.fromB64toUrlB64(Utils.fromBufferToB64(buffer));
|
||||
static fromBufferToUrlB64(buffer: Uint8Array<ArrayBuffer>): string {
|
||||
return Utils.fromB64toUrlB64(Utils.fromBufferToB64(buffer.buffer));
|
||||
}
|
||||
|
||||
static fromB64toUrlB64(b64Str: string) {
|
||||
@@ -164,7 +164,7 @@ export class Utils {
|
||||
}
|
||||
|
||||
static fromUtf8ToUrlB64(utfStr: string): string {
|
||||
return Utils.fromBufferToUrlB64(Utils.fromUtf8ToArray(utfStr).buffer as ArrayBuffer);
|
||||
return Utils.fromBufferToUrlB64(Utils.fromUtf8ToArray(utfStr));
|
||||
}
|
||||
|
||||
static fromB64ToUtf8(b64Str: string): string {
|
||||
|
||||
@@ -42,9 +42,9 @@ export class ChallengeResponse extends BaseResponse implements PublicKeyCredenti
|
||||
super(response);
|
||||
this.attestation = this.getResponseProperty("attestation");
|
||||
this.authenticatorSelection = this.getResponseProperty("authenticatorSelection");
|
||||
this.challenge = Utils.fromUrlB64ToArray(this.getResponseProperty("challenge")) as Uint8Array<ArrayBuffer>;
|
||||
this.challenge = Utils.fromUrlB64ToArray(this.getResponseProperty("challenge"));
|
||||
this.excludeCredentials = this.getResponseProperty("excludeCredentials").map((c: any) => {
|
||||
c.id = Utils.fromUrlB64ToArray(c.id).buffer as ArrayBuffer;
|
||||
c.id = Utils.fromUrlB64ToArray(c.id).buffer;
|
||||
return c;
|
||||
});
|
||||
this.extensions = this.getResponseProperty("extensions");
|
||||
|
||||
@@ -109,7 +109,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
): Promise<SymmetricCryptoKey> {
|
||||
const key = await this.retrieveKeyFromStorage(keySuffix, userId);
|
||||
if (key != null) {
|
||||
const symmetricKey = new SymmetricCryptoKey(Utils.fromB64ToArray(key).buffer as ArrayBuffer);
|
||||
const symmetricKey = new SymmetricCryptoKey(Utils.fromB64ToArray(key).buffer);
|
||||
|
||||
if (!(await this.validateKey(symmetricKey))) {
|
||||
this.logService.warning("Wrong key, throwing away stored key");
|
||||
@@ -512,7 +512,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
|
||||
let plainBuf: ArrayBuffer;
|
||||
if (typeof plainValue === "string") {
|
||||
plainBuf = Utils.fromUtf8ToArray(plainValue).buffer as ArrayBuffer;
|
||||
plainBuf = Utils.fromUtf8ToArray(plainValue).buffer;
|
||||
} else {
|
||||
plainBuf = plainValue;
|
||||
}
|
||||
@@ -539,7 +539,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
}
|
||||
|
||||
encBytes.set(new Uint8Array(encValue.data), 1 + encValue.iv.byteLength + macLen);
|
||||
return new EncArrayBuffer(encBytes.buffer as ArrayBuffer);
|
||||
return new EncArrayBuffer(encBytes.buffer);
|
||||
}
|
||||
|
||||
async rsaEncrypt(data: ArrayBuffer, publicKey?: ArrayBuffer): Promise<EncString> {
|
||||
@@ -585,7 +585,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
throw new Error("encPieces unavailable.");
|
||||
}
|
||||
|
||||
const data = Utils.fromB64ToArray(encPieces[0]).buffer as ArrayBuffer;
|
||||
const data = Utils.fromB64ToArray(encPieces[0]).buffer;
|
||||
const privateKey = privateKeyValue ?? (await this.getPrivateKey());
|
||||
if (privateKey == null) {
|
||||
throw new Error("No private key.");
|
||||
@@ -608,9 +608,9 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
}
|
||||
|
||||
async decryptToBytes(encString: EncString, key?: SymmetricCryptoKey): Promise<ArrayBuffer> {
|
||||
const iv = Utils.fromB64ToArray(encString.iv).buffer as ArrayBuffer;
|
||||
const data = Utils.fromB64ToArray(encString.data).buffer as ArrayBuffer;
|
||||
const mac = encString.mac ? Utils.fromB64ToArray(encString.mac).buffer as ArrayBuffer : null;
|
||||
const iv = Utils.fromB64ToArray(encString.iv).buffer;
|
||||
const data = Utils.fromB64ToArray(encString.data).buffer;
|
||||
const mac = encString.mac ? Utils.fromB64ToArray(encString.mac).buffer : null;
|
||||
const decipher = await this.aesDecryptToBytes(encString.encryptionType, data, iv, mac, key);
|
||||
if (decipher == null) {
|
||||
return null;
|
||||
@@ -636,9 +636,9 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
|
||||
const encBytes = new Uint8Array(encBuf);
|
||||
const encType = encBytes[0];
|
||||
let ctBytes: Uint8Array = null;
|
||||
let ivBytes: Uint8Array = null;
|
||||
let macBytes: Uint8Array = null;
|
||||
let ctBytes: Uint8Array<ArrayBuffer> = null;
|
||||
let ivBytes: Uint8Array<ArrayBuffer> = null;
|
||||
let macBytes: Uint8Array<ArrayBuffer> = null;
|
||||
|
||||
switch (encType) {
|
||||
case EncryptionType.AesCbc128_HmacSha256_B64:
|
||||
@@ -667,9 +667,9 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
|
||||
return await this.aesDecryptToBytes(
|
||||
encType,
|
||||
ctBytes.buffer as ArrayBuffer,
|
||||
ivBytes.buffer as ArrayBuffer,
|
||||
macBytes != null ? macBytes.buffer as ArrayBuffer : null,
|
||||
ctBytes.buffer,
|
||||
ivBytes.buffer,
|
||||
macBytes != null ? macBytes.buffer : null,
|
||||
key,
|
||||
);
|
||||
}
|
||||
@@ -766,7 +766,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
const macData = new Uint8Array(obj.iv.byteLength + obj.data.byteLength);
|
||||
macData.set(new Uint8Array(obj.iv), 0);
|
||||
macData.set(new Uint8Array(obj.data), obj.iv.byteLength);
|
||||
obj.mac = await this.cryptoFunctionService.hmac(macData.buffer as ArrayBuffer, obj.key.macKey, "sha256");
|
||||
obj.mac = await this.cryptoFunctionService.hmac(macData.buffer, obj.key.macKey, "sha256");
|
||||
}
|
||||
|
||||
return obj;
|
||||
@@ -832,7 +832,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
macData.set(new Uint8Array(iv), 0);
|
||||
macData.set(new Uint8Array(data), iv.byteLength);
|
||||
const computedMac = await this.cryptoFunctionService.hmac(
|
||||
macData.buffer as ArrayBuffer,
|
||||
macData.buffer,
|
||||
theKey.macKey,
|
||||
"sha256",
|
||||
);
|
||||
@@ -889,7 +889,7 @@ export class CryptoService implements CryptoServiceAbstraction {
|
||||
const macKey = await this.cryptoFunctionService.hkdfExpand(key.key, "mac", 32, "sha256");
|
||||
newKey.set(new Uint8Array(encKey));
|
||||
newKey.set(new Uint8Array(macKey), 32);
|
||||
return new SymmetricCryptoKey(newKey.buffer as ArrayBuffer);
|
||||
return new SymmetricCryptoKey(newKey.buffer);
|
||||
}
|
||||
|
||||
private async hashPhrase(hash: ArrayBuffer, minimumEntropy = 64) {
|
||||
|
||||
@@ -94,7 +94,7 @@ describe("NodeCrypto Function Service", () => {
|
||||
it("should fail with prk too small", async () => {
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const f = cryptoFunctionService.hkdfExpand(
|
||||
Utils.fromB64ToArray(prk16Byte).buffer as ArrayBuffer,
|
||||
Utils.fromB64ToArray(prk16Byte).buffer,
|
||||
"info",
|
||||
32,
|
||||
"sha256",
|
||||
@@ -105,7 +105,7 @@ describe("NodeCrypto Function Service", () => {
|
||||
it("should fail with outputByteSize is too large", async () => {
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const f = cryptoFunctionService.hkdfExpand(
|
||||
Utils.fromB64ToArray(prk32Byte).buffer as ArrayBuffer,
|
||||
Utils.fromB64ToArray(prk32Byte).buffer,
|
||||
"info",
|
||||
8161,
|
||||
"sha256",
|
||||
@@ -170,9 +170,9 @@ describe("NodeCrypto Function Service", () => {
|
||||
const key = makeStaticByteArray(32);
|
||||
const data = Utils.fromUtf8ToArray("EncryptMe!");
|
||||
const encValue = await nodeCryptoFunctionService.aesEncrypt(
|
||||
data.buffer as ArrayBuffer,
|
||||
iv.buffer as ArrayBuffer,
|
||||
key.buffer as ArrayBuffer,
|
||||
data.buffer,
|
||||
iv.buffer,
|
||||
key.buffer,
|
||||
);
|
||||
expect(Utils.fromBufferToB64(encValue)).toBe("ByUF8vhyX4ddU9gcooznwA==");
|
||||
});
|
||||
@@ -184,11 +184,11 @@ describe("NodeCrypto Function Service", () => {
|
||||
const value = "EncryptMe!";
|
||||
const data = Utils.fromUtf8ToArray(value);
|
||||
const encValue = await nodeCryptoFunctionService.aesEncrypt(
|
||||
data.buffer as ArrayBuffer,
|
||||
iv.buffer as ArrayBuffer,
|
||||
key.buffer as ArrayBuffer,
|
||||
data.buffer,
|
||||
iv.buffer,
|
||||
key.buffer,
|
||||
);
|
||||
const decValue = await nodeCryptoFunctionService.aesDecrypt(encValue, iv.buffer as ArrayBuffer, key.buffer as ArrayBuffer);
|
||||
const decValue = await nodeCryptoFunctionService.aesDecrypt(encValue, iv.buffer, key.buffer);
|
||||
expect(Utils.fromBufferToUtf8(decValue)).toBe(value);
|
||||
});
|
||||
});
|
||||
@@ -196,8 +196,8 @@ describe("NodeCrypto Function Service", () => {
|
||||
describe("aesDecryptFast", () => {
|
||||
it("should successfully decrypt data", async () => {
|
||||
const nodeCryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const iv = Utils.fromBufferToB64(makeStaticByteArray(16).buffer as ArrayBuffer);
|
||||
const symKey = new SymmetricCryptoKey(makeStaticByteArray(32).buffer as ArrayBuffer);
|
||||
const iv = Utils.fromBufferToB64(makeStaticByteArray(16).buffer);
|
||||
const symKey = new SymmetricCryptoKey(makeStaticByteArray(32).buffer);
|
||||
const data = "ByUF8vhyX4ddU9gcooznwA==";
|
||||
const params = nodeCryptoFunctionService.aesDecryptFastParameters(data, iv, null, symKey);
|
||||
const decValue = await nodeCryptoFunctionService.aesDecryptFast(params);
|
||||
@@ -212,9 +212,9 @@ describe("NodeCrypto Function Service", () => {
|
||||
const key = makeStaticByteArray(32);
|
||||
const data = Utils.fromB64ToArray("ByUF8vhyX4ddU9gcooznwA==");
|
||||
const decValue = await nodeCryptoFunctionService.aesDecrypt(
|
||||
data.buffer as ArrayBuffer,
|
||||
iv.buffer as ArrayBuffer,
|
||||
key.buffer as ArrayBuffer,
|
||||
data.buffer,
|
||||
iv.buffer,
|
||||
key.buffer,
|
||||
);
|
||||
expect(Utils.fromBufferToUtf8(decValue)).toBe("EncryptMe!");
|
||||
});
|
||||
@@ -228,11 +228,11 @@ describe("NodeCrypto Function Service", () => {
|
||||
const value = "EncryptMe!";
|
||||
const data = Utils.fromUtf8ToArray(value);
|
||||
const encValue = await nodeCryptoFunctionService.rsaEncrypt(
|
||||
data.buffer as ArrayBuffer,
|
||||
pubKey.buffer as ArrayBuffer,
|
||||
data.buffer,
|
||||
pubKey.buffer,
|
||||
"sha1",
|
||||
);
|
||||
const decValue = await nodeCryptoFunctionService.rsaDecrypt(encValue, privKey.buffer as ArrayBuffer, "sha1");
|
||||
const decValue = await nodeCryptoFunctionService.rsaDecrypt(encValue, privKey.buffer, "sha1");
|
||||
expect(Utils.fromBufferToUtf8(decValue)).toBe(value);
|
||||
});
|
||||
});
|
||||
@@ -248,8 +248,8 @@ describe("NodeCrypto Function Service", () => {
|
||||
"/5jcercUtK2o+XrzNrL4UQ7yLZcFz6Bfwb/j6ICYvqd/YJwXNE6dwlL57OfwJyCdw2rRYf0/qI00t9u8Iitw==",
|
||||
);
|
||||
const decValue = await nodeCryptoFunctionService.rsaDecrypt(
|
||||
data.buffer as ArrayBuffer,
|
||||
privKey.buffer as ArrayBuffer,
|
||||
data.buffer,
|
||||
privKey.buffer,
|
||||
"sha1",
|
||||
);
|
||||
expect(Utils.fromBufferToUtf8(decValue)).toBe("EncryptMe!");
|
||||
@@ -260,7 +260,7 @@ describe("NodeCrypto Function Service", () => {
|
||||
it("should successfully extract key", async () => {
|
||||
const nodeCryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const privKey = Utils.fromB64ToArray(RsaPrivateKey);
|
||||
const publicKey = await nodeCryptoFunctionService.rsaExtractPublicKey(privKey.buffer as ArrayBuffer);
|
||||
const publicKey = await nodeCryptoFunctionService.rsaExtractPublicKey(privKey.buffer);
|
||||
expect(Utils.fromBufferToB64(publicKey)).toBe(RsaPublicKey);
|
||||
});
|
||||
});
|
||||
@@ -326,8 +326,8 @@ function testPbkdf2(
|
||||
it("should create valid " + algorithm + " key from array buffer input", async () => {
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const key = await cryptoFunctionService.pbkdf2(
|
||||
Utils.fromUtf8ToArray(regularPassword).buffer as ArrayBuffer,
|
||||
Utils.fromUtf8ToArray(regularEmail).buffer as ArrayBuffer,
|
||||
Utils.fromUtf8ToArray(regularPassword).buffer,
|
||||
Utils.fromUtf8ToArray(regularEmail).buffer,
|
||||
algorithm,
|
||||
5000,
|
||||
);
|
||||
@@ -341,7 +341,7 @@ function testHkdf(
|
||||
utf8Key: string,
|
||||
unicodeKey: string,
|
||||
) {
|
||||
const ikm = Utils.fromB64ToArray("criAmKtfzxanbgea5/kelQ==").buffer as ArrayBuffer;
|
||||
const ikm = Utils.fromB64ToArray("criAmKtfzxanbgea5/kelQ==").buffer;
|
||||
|
||||
const regularSalt = "salt";
|
||||
const utf8Salt = "üser_salt";
|
||||
@@ -373,8 +373,8 @@ function testHkdf(
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const key = await cryptoFunctionService.hkdf(
|
||||
ikm,
|
||||
Utils.fromUtf8ToArray(regularSalt).buffer as ArrayBuffer,
|
||||
Utils.fromUtf8ToArray(regularInfo).buffer as ArrayBuffer,
|
||||
Utils.fromUtf8ToArray(regularSalt).buffer,
|
||||
Utils.fromUtf8ToArray(regularInfo).buffer,
|
||||
32,
|
||||
algorithm,
|
||||
);
|
||||
@@ -393,7 +393,7 @@ function testHkdfExpand(
|
||||
it("should create valid " + algorithm + " " + outputByteSize + " byte okm", async () => {
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const okm = await cryptoFunctionService.hkdfExpand(
|
||||
Utils.fromB64ToArray(b64prk).buffer as ArrayBuffer,
|
||||
Utils.fromB64ToArray(b64prk).buffer,
|
||||
info,
|
||||
outputByteSize,
|
||||
algorithm,
|
||||
@@ -433,7 +433,7 @@ function testHash(
|
||||
it("should create valid " + algorithm + " hash from array buffer input", async () => {
|
||||
const cryptoFunctionService = new NodeCryptoFunctionService();
|
||||
const hash = await cryptoFunctionService.hash(
|
||||
Utils.fromUtf8ToArray(regularValue).buffer as ArrayBuffer,
|
||||
Utils.fromUtf8ToArray(regularValue).buffer,
|
||||
algorithm,
|
||||
);
|
||||
expect(Utils.fromBufferToHex(hash)).toBe(regularHash);
|
||||
@@ -443,8 +443,8 @@ function testHash(
 function testHmac(algorithm: "sha1" | "sha256" | "sha512", mac: string, fast = false) {
   it("should create valid " + algorithm + " hmac", async () => {
     const cryptoFunctionService = new NodeCryptoFunctionService();
-    const value = Utils.fromUtf8ToArray("SignMe!!").buffer as ArrayBuffer;
-    const key = Utils.fromUtf8ToArray("secretkey").buffer as ArrayBuffer;
+    const value = Utils.fromUtf8ToArray("SignMe!!").buffer;
+    const key = Utils.fromUtf8ToArray("secretkey").buffer;
     let computedMac: ArrayBuffer = null;
     if (fast) {
       computedMac = await cryptoFunctionService.hmacFast(value, key, algorithm);
@@ -462,8 +462,8 @@ function testCompare(fast = false) {
     a[0] = 1;
     a[1] = 2;
     const equal = fast
-      ? await cryptoFunctionService.compareFast(a.buffer as ArrayBuffer, a.buffer as ArrayBuffer)
-      : await cryptoFunctionService.compare(a.buffer as ArrayBuffer, a.buffer as ArrayBuffer);
+      ? await cryptoFunctionService.compareFast(a.buffer, a.buffer)
+      : await cryptoFunctionService.compare(a.buffer, a.buffer);
     expect(equal).toBe(true);
   });

@@ -476,8 +476,8 @@ function testCompare(fast = false) {
     b[0] = 3;
     b[1] = 4;
     const equal = fast
-      ? await cryptoFunctionService.compareFast(a.buffer as ArrayBuffer, b.buffer as ArrayBuffer)
-      : await cryptoFunctionService.compare(a.buffer as ArrayBuffer, b.buffer as ArrayBuffer);
+      ? await cryptoFunctionService.compareFast(a.buffer, b.buffer)
+      : await cryptoFunctionService.compare(a.buffer, b.buffer);
     expect(equal).toBe(false);
   });

@@ -489,8 +489,8 @@ function testCompare(fast = false) {
     const b = new Uint8Array(2);
     b[0] = 3;
     const equal = fast
-      ? await cryptoFunctionService.compareFast(a.buffer as ArrayBuffer, b.buffer as ArrayBuffer)
-      : await cryptoFunctionService.compare(a.buffer as ArrayBuffer, b.buffer as ArrayBuffer);
+      ? await cryptoFunctionService.compareFast(a.buffer, b.buffer)
+      : await cryptoFunctionService.compare(a.buffer, b.buffer);
     expect(equal).toBe(false);
   });
 }

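The compare tests above exercise the service's constant-time MAC comparison. In Node, this is typically built on `crypto.timingSafeEqual`, which throws when the inputs differ in length, so a wrapper checks length first; a MAC's length is not secret, so the early return leaks nothing. A minimal sketch (the wrapper name is illustrative, not the service's API):

```typescript
import { timingSafeEqual } from "crypto";

// Constant-time byte comparison for MAC verification. timingSafeEqual
// throws on mismatched lengths, so unequal lengths are handled as a
// plain (non-secret) mismatch before the constant-time check.
function secureCompare(a: Uint8Array, b: Uint8Array): boolean {
  if (a.byteLength !== b.byteLength) {
    return false;
  }
  return timingSafeEqual(a, b);
}
```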
@@ -67,14 +67,14 @@ export class NodeCryptoFunctionService implements CryptoFunctionService {
       t.set(previousT);
       t.set(infoArr, previousT.length);
       t.set([i + 1], t.length - 1);
-      previousT = new Uint8Array(await this.hmac(t.buffer as ArrayBuffer, prk, algorithm));
+      previousT = new Uint8Array(await this.hmac(t.buffer, prk, algorithm));
       okm.set(previousT, runningOkmLength);
       runningOkmLength += previousT.length;
       if (runningOkmLength >= outputByteSize) {
         break;
       }
     }
-    return okm.slice(0, outputByteSize).buffer as ArrayBuffer;
+    return okm.slice(0, outputByteSize).buffer;
   }

   hash(
@@ -147,19 +147,19 @@ export class NodeCryptoFunctionService implements CryptoFunctionService {
   ): DecryptParameters<ArrayBuffer> {
     const p = new DecryptParameters<ArrayBuffer>();
     p.encKey = key.encKey;
-    p.data = Utils.fromB64ToArray(data).buffer as ArrayBuffer;
-    p.iv = Utils.fromB64ToArray(iv).buffer as ArrayBuffer;
+    p.data = Utils.fromB64ToArray(data).buffer;
+    p.iv = Utils.fromB64ToArray(iv).buffer;

     const macData = new Uint8Array(p.iv.byteLength + p.data.byteLength);
     macData.set(new Uint8Array(p.iv), 0);
     macData.set(new Uint8Array(p.data), p.iv.byteLength);
-    p.macData = macData.buffer as ArrayBuffer;
+    p.macData = macData.buffer;

     if (key.macKey != null) {
       p.macKey = key.macKey;
     }
     if (mac != null) {
-      p.mac = Utils.fromB64ToArray(mac).buffer as ArrayBuffer;
+      p.mac = Utils.fromB64ToArray(mac).buffer;
     }

     return p;
@@ -215,7 +215,7 @@ export class NodeCryptoFunctionService implements CryptoFunctionService {
     const publicKeyAsn1 = forge.pki.publicKeyToAsn1(forgePublicKey);
     const publicKeyByteString = forge.asn1.toDer(publicKeyAsn1).data;
     const publicKeyArray = Utils.fromByteStringToArray(publicKeyByteString);
-    return Promise.resolve(publicKeyArray.buffer as ArrayBuffer);
+    return Promise.resolve(publicKeyArray.buffer);
   }

   async rsaGenerateKeyPair(length: 1024 | 2048 | 4096): Promise<[ArrayBuffer, ArrayBuffer]> {
@@ -241,7 +241,7 @@ export class NodeCryptoFunctionService implements CryptoFunctionService {
           const privateKeyByteString = forge.asn1.toDer(privateKeyPkcs8).getBytes();
           const privateKey = Utils.fromByteStringToArray(privateKeyByteString);

-          resolve([publicKey.buffer as ArrayBuffer, privateKey.buffer as ArrayBuffer]);
+          resolve([publicKey.buffer, privateKey.buffer]);
         },
       );
     });
@@ -276,9 +276,9 @@ export class NodeCryptoFunctionService implements CryptoFunctionService {
   private toArrayBuffer(value: Buffer | string | ArrayBuffer): ArrayBuffer {
     let buf: ArrayBuffer;
     if (typeof value === "string") {
-      buf = Utils.fromUtf8ToArray(value).buffer as ArrayBuffer;
+      buf = Utils.fromUtf8ToArray(value).buffer;
    } else {
-      buf = new Uint8Array(value).buffer as ArrayBuffer;
+      buf = new Uint8Array(value).buffer;
     }
     return buf;
   }

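The casts removed throughout this diff stem from `TypedArray.prototype.buffer` being declared as `ArrayBufferLike` (which may include `SharedArrayBuffer`) in some TypeScript lib configurations; under the project's configuration the cast is apparently redundant. A cast-free variant of the same normalization allocates a fresh `ArrayBuffer` and copies into it, which also avoids accidentally exposing a Node `Buffer`'s shared memory pool (this is an illustrative sketch, not the service's code):

```typescript
// Normalize Buffer | string | ArrayBuffer into a standalone ArrayBuffer.
// Allocating a new ArrayBuffer gives a correctly typed result without a
// cast, and copies only the view's slice rather than the Buffer pool
// that Node.js Buffers are often views into.
function toArrayBuffer(value: Buffer | string | ArrayBuffer): ArrayBuffer {
  if (value instanceof ArrayBuffer) {
    return value;
  }
  const view = typeof value === "string" ? Buffer.from(value, "utf8") : value;
  const out = new ArrayBuffer(view.byteLength);
  new Uint8Array(out).set(view);
  return out;
}
```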
@@ -50,36 +50,221 @@ describe("gsuiteDirectoryService", () => {
     directoryService = new GSuiteDirectoryService(logService, i18nService, stateService);
   });

-  it("syncs without using filters (includes test data)", async () => {
-    const directoryConfig = getGSuiteConfiguration();
-    stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
-
-    const syncConfig = getSyncConfiguration({
-      groups: true,
-      users: true,
-    });
-    stateService.getSync.mockResolvedValue(syncConfig);
-
-    const result = await directoryService.getEntries(true, true);
-
-    expect(result[0]).toEqual(expect.arrayContaining(groupFixtures));
-    expect(result[1]).toEqual(expect.arrayContaining(userFixtures));
-  });
-
-  it("syncs using user and group filters (exact match for test data)", async () => {
-    const directoryConfig = getGSuiteConfiguration();
-    stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
-
-    const syncConfig = getSyncConfiguration({
-      groups: true,
-      users: true,
-      userFilter: INTEGRATION_USER_FILTER,
-      groupFilter: INTEGRATION_GROUP_FILTER,
-    });
-    stateService.getSync.mockResolvedValue(syncConfig);
-
-    const result = await directoryService.getEntries(true, true);
-
-    expect(result).toEqual([groupFixtures, userFixtures]);
-  });
+  describe("basic sync fetching users and groups", () => {
+    it("syncs without using filters (includes test data)", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: true,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result[0]).toEqual(expect.arrayContaining(groupFixtures));
+      expect(result[1]).toEqual(expect.arrayContaining(userFixtures));
+    });
+
+    it("syncs using user and group filters (exact match for test data)", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+        groupFilter: INTEGRATION_GROUP_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result).toEqual([groupFixtures, userFixtures]);
+    });
+
+    it("syncs only users when groups sync is disabled", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: false,
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result[0]).toBeUndefined();
+      expect(result[1]).toEqual(expect.arrayContaining(userFixtures));
+    });
+
+    it("syncs only groups when users sync is disabled", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: false,
+        groupFilter: INTEGRATION_GROUP_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result[0]).toEqual(expect.arrayContaining(groupFixtures));
+      expect(result[1]).toEqual([]);
+    });
+  });
+
+  describe("users", () => {
+    it("includes disabled users in sync results", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      const disabledUser = userFixtures.find((u) => u.email === "testuser5@bwrox.dev");
+      expect(result[1]).toContainEqual(disabledUser);
+      expect(disabledUser.disabled).toBe(true);
+    });
+
+    it("filters users by org unit path", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result[1]).toEqual(userFixtures);
+      expect(result[1].length).toBe(5);
+    });
+
+    it("filters users by email pattern", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        users: true,
+        userFilter: "|email:testuser1*",
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      const testuser1 = userFixtures.find((u) => u.email === "testuser1@bwrox.dev");
+      expect(result[1]).toContainEqual(testuser1);
+      expect(result[1].length).toBeGreaterThanOrEqual(1);
+    });
+  });
+
+  describe("groups", () => {
+    it("filters groups by name pattern", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+        groupFilter: INTEGRATION_GROUP_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      expect(result[0]).toEqual(groupFixtures);
+      expect(result[0].length).toBe(2);
+    });
+
+    it("includes group members correctly", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+        groupFilter: INTEGRATION_GROUP_FILTER,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      const groupA = result[0].find((g) => g.name === "Integration Test Group A");
+      expect(groupA).toBeDefined();
+      expect(groupA.userMemberExternalIds.size).toBe(2);
+      expect(groupA.userMemberExternalIds.has("111605910541641314041")).toBe(true);
+      expect(groupA.userMemberExternalIds.has("111147009830456099026")).toBe(true);
+
+      const groupB = result[0].find((g) => g.name === "Integration Test Group B");
+      expect(groupB).toBeDefined();
+      expect(groupB.userMemberExternalIds.size).toBe(2);
+      expect(groupB.userMemberExternalIds.has("111147009830456099026")).toBe(true);
+      expect(groupB.userMemberExternalIds.has("100150970267699397306")).toBe(true);
+    });
+
+    it("handles groups with no members", async () => {
+      const directoryConfig = getGSuiteConfiguration();
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        groups: true,
+        users: true,
+        userFilter: INTEGRATION_USER_FILTER,
+        groupFilter: "|name:Integration*",
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      const result = await directoryService.getEntries(true, true);
+
+      // All test groups should have members, but ensure the code handles empty groups
+      expect(result[0]).toBeDefined();
+      expect(Array.isArray(result[0])).toBe(true);
+    });
+  });
+
+  describe("error handling", () => {
+    it("throws error when directory configuration is incomplete", async () => {
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(
+        getGSuiteConfiguration({
+          clientEmail: "",
+        }),
+      );
+
+      const syncConfig = getSyncConfiguration({
+        users: true,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      await expect(directoryService.getEntries(true, true)).rejects.toThrow();
+    });
+
+    it("throws error when authentication fails with invalid credentials", async () => {
+      const directoryConfig = getGSuiteConfiguration({
+        privateKey: "-----BEGIN PRIVATE KEY-----\nINVALID_KEY\n-----END PRIVATE KEY-----\n",
+      });
+      stateService.getDirectory.calledWith(DirectoryType.GSuite).mockResolvedValue(directoryConfig);
+
+      const syncConfig = getSyncConfiguration({
+        users: true,
+      });
+      stateService.getSync.mockResolvedValue(syncConfig);
+
+      await expect(directoryService.getEntries(true, true)).rejects.toThrow();
+    });
+  });
 });

@@ -14,6 +14,22 @@ import { BaseDirectoryService } from "../baseDirectory.service";

 import { IDirectoryService } from "./directory.service";

+/**
+ * Google Workspace (formerly G Suite) Directory Service
+ *
+ * This service integrates with Google Workspace to synchronize users and groups
+ * to Bitwarden organizations using the Google Admin SDK Directory API.
+ *
+ * @remarks
+ * Authentication is performed using a service account with domain-wide delegation.
+ * The service account must be granted the following OAuth 2.0 scopes:
+ * - https://www.googleapis.com/auth/admin.directory.user.readonly
+ * - https://www.googleapis.com/auth/admin.directory.group.readonly
+ * - https://www.googleapis.com/auth/admin.directory.group.member.readonly
+ *
+ * @see {@link https://developers.google.com/admin-sdk/directory/v1/guides | Google Admin SDK Directory API}
+ * @see {@link https://support.google.com/a/answer/162106 | Domain-wide delegation of authority}
+ */
 export class GSuiteDirectoryService extends BaseDirectoryService implements IDirectoryService {
   private client: JWT;
   private service: admin_directory_v1.Admin;
@@ -30,6 +46,29 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     this.service = google.admin("directory_v1");
   }

+  /**
+   * Retrieves users and groups from Google Workspace directory
+   * @returns A tuple containing [groups, users] arrays
+   *
+   * @remarks
+   * This function:
+   * 1. Validates the directory type matches GSuite
+   * 2. Loads directory and sync configuration
+   * 3. Authenticates with Google Workspace using service account credentials
+   * 4. Retrieves users (if enabled in sync config)
+   * 5. Retrieves groups and their members (if enabled in sync config)
+   * 6. Applies any user/group filters specified in sync configuration
+   *
+   * User and group filters follow Google Workspace Directory API query syntax:
+   * - Use `|` prefix for custom filters (e.g., "|orgUnitPath='/Engineering'")
+   * - Multiple conditions can be combined with AND/OR operators
+   *
+   * @example
+   * ```typescript
+   * const [groups, users] = await service.getEntries(true, false);
+   * console.log(`Synced ${users.length} users and ${groups.length} groups`);
+   * ```
+   */
   async getEntries(force: boolean, test: boolean): Promise<[GroupEntry[], UserEntry[]]> {
     const type = await this.stateService.getDirectoryType();
     if (type !== DirectoryType.GSuite) {
@@ -65,6 +104,26 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     return [groups, users];
   }

+  /**
+   * Retrieves all users from Google Workspace directory
+   *
+   * @returns Array of UserEntry objects representing users in the directory
+   *
+   * @remarks
+   * This method performs two separate queries:
+   * 1. Active users (including suspended and archived)
+   * 2. Deleted users (marked with deleted flag)
+   *
+   * The method handles pagination automatically, fetching all pages of results.
+   * Users are filtered based on the userFilter specified in sync configuration.
+   *
+   * User properties mapped:
+   * - referenceId: User's unique Google ID
+   * - externalId: User's unique Google ID (same as referenceId)
+   * - email: User's primary email address (lowercase)
+   * - disabled: True if user is suspended or archived
+   * - deleted: True if user is deleted from the directory
+   */
   private async getUsers(): Promise<UserEntry[]> {
     const entries: UserEntry[] = [];
     const query = this.createDirectoryQuery(this.syncConfig.userFilter);
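The automatic pagination noted in the doc comment above follows the Admin SDK's page-token pattern: each response may carry a `nextPageToken` that is fed into the next request. A generic sketch of that loop (the `Page` shape and function names are illustrative, not the service's API):

```typescript
// Page-token pagination as used by the Admin SDK list endpoints: keep
// requesting pages until the response no longer carries a nextPageToken.
interface Page<T> {
  items: T[];
  nextPageToken?: string;
}

async function listAll<T>(
  fetchPage: (pageToken?: string) => Promise<Page<T>>,
): Promise<T[]> {
  const all: T[] = [];
  let pageToken: string | undefined;
  do {
    const page = await fetchPage(pageToken);
    all.push(...(page.items ?? []));
    pageToken = page.nextPageToken;
  } while (pageToken != null);
  return all;
}
```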
@@ -132,6 +191,13 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     return entries;
   }

+  /**
+   * Transforms a Google Workspace user object into a UserEntry
+   *
+   * @param user - Google Workspace user object from the API
+   * @param deleted - Whether this user is from the deleted users list
+   * @returns UserEntry object or null if user data is invalid
+   */
   private buildUser(user: admin_directory_v1.Schema$User, deleted: boolean) {
     if ((user.emails == null || user.emails === "") && !deleted) {
       return null;
@@ -146,6 +212,17 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     return entry;
   }

+  /**
+   * Retrieves all groups from Google Workspace directory
+   *
+   * @param setFilter - Tuple of [isWhitelist, Set<string>] for filtering groups
+   * @param users - Array of UserEntry objects to reference when processing members
+   * @returns Array of GroupEntry objects representing groups in the directory
+   *
+   * @remarks
+   * For each group, the method also retrieves all group members by calling the
+   * members API. Groups are filtered based on the groupFilter in sync configuration.
+   */
   private async getGroups(
     setFilter: [boolean, Set<string>],
     users: UserEntry[],
@@ -185,6 +262,19 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     return entries;
   }

+  /**
+   * Transforms a Google Workspace group object into a GroupEntry with members
+   *
+   * @param group - Google Workspace group object from the API
+   * @param users - Array of UserEntry objects for reference
+   * @returns GroupEntry object with all members populated
+   *
+   * @remarks
+   * This method retrieves all members of the group, handling three member types:
+   * - USER: Individual user members (only active status users are included)
+   * - GROUP: Nested group members
+   * - CUSTOMER: Special type that includes all users in the domain
+   */
   private async buildGroup(group: admin_directory_v1.Schema$Group, users: UserEntry[]) {
     let nextPageToken: string = null;

@@ -230,6 +320,26 @@ export class GSuiteDirectoryService extends BaseDirectoryService implements IDir
     return entry;
   }

+  /**
+   * Authenticates with Google Workspace using service account credentials
+   *
+   * @throws Error if required configuration fields are missing or authentication fails
+   *
+   * @remarks
+   * Authentication uses a JWT with the following required fields:
+   * - clientEmail: Service account email address
+   * - privateKey: Service account private key (PEM format)
+   * - subject: Admin user email to impersonate (for domain-wide delegation)
+   *
+   * The service account must be configured with domain-wide delegation and granted
+   * the required OAuth scopes in the Google Workspace Admin Console.
+   *
+   * Optional configuration:
+   * - domain: Filters results to a specific domain
+   * - customer: Customer ID for multi-domain organizations
+   *
+   * @see {@link https://developers.google.com/identity/protocols/oauth2/service-account | Service account authentication}
+   */
   private async auth() {
     if (
       this.dirConfig.clientEmail == null ||