diff --git a/website/docs/docs/build/sl-getting-started.md b/website/docs/docs/build/sl-getting-started.md index d5a59c33ec2..4274fccf509 100644 --- a/website/docs/docs/build/sl-getting-started.md +++ b/website/docs/docs/build/sl-getting-started.md @@ -74,21 +74,9 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; If you're encountering some issues when defining your metrics or setting up the dbt Semantic Layer, check out a list of answers to some of the questions or problems you may be experiencing. -
- How do I migrate from the legacy Semantic Layer to the new one? -
-
If you're using the legacy Semantic Layer, we highly recommend you upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated migration guide for more info.
-
-
-
-How are you storing my data? -User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours. -
- -
-Is the dbt Semantic Layer open source? -The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow.

dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud Team or Enterprise plan.

Refer to Billing for more information. -
+import SlFaqs from '/snippets/_sl-faqs.md'; + + ## Next steps diff --git a/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md b/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md index cc1c2531f56..7f32505d56e 100644 --- a/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md +++ b/website/docs/docs/cloud/about-cloud/regions-ip-addresses.md @@ -11,8 +11,8 @@ dbt Cloud is [hosted](/docs/cloud/about-cloud/architecture) in multiple regions | Region | Location | Access URL | IP addresses | Developer plan | Team plan | Enterprise plan | |--------|----------|------------|--------------|----------------|-----------|-----------------| -| North America multi-tenant [^1] | AWS us-east-1 (N. Virginia) | cloud.getdbt.com | 52.45.144.63
<br /> 54.81.134.249 <br /> 52.22.161.231 | ✅ | ✅ | ✅ |
-| North America Cell 1 [^1] | AWS us-east-1 (N.Virginia) | {account prefix}.us1.dbt.com | [Located in Account Settings](#locating-your-dbt-cloud-ip-addresses) | ❌ | ❌ | ✅ |
+| North America multi-tenant [^1] | AWS us-east-1 (N. Virginia) | cloud.getdbt.com | 52.45.144.63 <br /> 54.81.134.249 <br /> 52.22.161.231 <br /> 52.3.77.232 <br /> 3.214.191.130 <br /> 34.233.79.135 | ✅ | ✅ | ✅ |
+| North America Cell 1 [^1] | AWS us-east-1 (N. Virginia) | {account prefix}.us1.dbt.com | 52.45.144.63 <br /> 54.81.134.249 <br /> 52.22.161.231 <br /> 52.3.77.232 <br /> 3.214.191.130 <br /> 34.233.79.135 | ❌ | ❌ | ✅ |
 | EMEA [^1] | AWS eu-central-1 (Frankfurt) | emea.dbt.com | 3.123.45.39 <br /> 3.126.140.248 <br /> 3.72.153.148 | ❌ | ❌ | ✅ |
 | APAC [^1] | AWS ap-southeast-2 (Sydney)| au.dbt.com | 52.65.89.235 <br /> 3.106.40.33 <br /> 13.239.155.206 <br />
| ❌ | ❌ | ✅ | | Virtual Private dbt or Single tenant | Customized | Customized | Ask [Support](/community/resources/getting-help#dbt-cloud-support) for your IPs | ❌ | ❌ | ✅ | diff --git a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md index 36146246d3a..33a038baa9b 100644 --- a/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md +++ b/website/docs/docs/dbt-versions/core-upgrade/01-upgrading-to-v1.6.md @@ -17,7 +17,7 @@ dbt Core v1.6 has three significant areas of focus: ## Resources - [Changelog](https://github.com/dbt-labs/dbt-core/blob/1.6.latest/CHANGELOG.md) -- [CLI Installation guide](/docs/core/installation-overview +- [dbt Core installation guide](/docs/core/installation-overview) - [Cloud upgrade guide](/docs/dbt-versions/upgrade-core-in-cloud) - [Release schedule](https://github.com/dbt-labs/dbt-core/issues/7481) diff --git a/website/docs/docs/use-dbt-semantic-layer/gsheets.md b/website/docs/docs/use-dbt-semantic-layer/gsheets.md index 9d5a7c105ae..d7525fa7b26 100644 --- a/website/docs/docs/use-dbt-semantic-layer/gsheets.md +++ b/website/docs/docs/use-dbt-semantic-layer/gsheets.md @@ -17,6 +17,8 @@ The dbt Semantic Layer offers a seamless integration with Google Sheets through - You have a Google account with access to Google Sheets. - You can install Google add-ons. - You have a dbt Cloud Environment ID and a [service token](/docs/dbt-cloud-apis/service-tokens) to authenticate with from a dbt Cloud account. +- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing). Suitable for both Multi-tenant and Single-tenant deployment. + - Single-tenant accounts should contact their account representative for necessary setup and enablement. ## Installing the add-on diff --git a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md index 84e3227b4e7..62437f4ecd6 100644 --- a/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md +++ b/website/docs/docs/use-dbt-semantic-layer/quickstart-sl.md @@ -26,7 +26,7 @@ MetricFlow, a powerful component of the dbt Semantic Layer, simplifies the creat Use this guide to fully experience the power of the universal dbt Semantic Layer. Here are the following steps you'll take: - [Create a semantic model](#create-a-semantic-model) in dbt Cloud using MetricFlow -- [Define metrics](#define-metrics) in dbt Cloud using MetricFlow +- [Define metrics](#define-metrics) in dbt using MetricFlow - [Test and query metrics](#test-and-query-metrics) with MetricFlow - [Run a production job](#run-a-production-job) in dbt Cloud - [Set up dbt Semantic Layer](#setup) in dbt Cloud @@ -88,20 +88,9 @@ import SlSetUp from '/snippets/_new-sl-setup.md'; If you're encountering some issues when defining your metrics or setting up the dbt Semantic Layer, check out a list of answers to some of the questions or problems you may be experiencing. -
- How do I migrate from the legacy Semantic Layer to the new one? -
-
If you're using the legacy Semantic Layer, we highly recommend you upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated migration guide for more info.
-
-
-
-How are you storing my data? -User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours. -
-
- Is the dbt Semantic Layer open source? - The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow.

dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud Team or Enterprise plan.

Refer to Billing for more information. -
+import SlFaqs from '/snippets/_sl-faqs.md'; + + ## Next steps diff --git a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md index 94f8fee007f..9aea2ab42b0 100644 --- a/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md +++ b/website/docs/docs/use-dbt-semantic-layer/sl-architecture.md @@ -14,45 +14,38 @@ The dbt Semantic Layer allows you to define metrics and use various interfaces t - - -## dbt Semantic Layer components +## Components The dbt Semantic Layer includes the following components: | Components | Information | dbt Core users | Developer plans | Team plans | Enterprise plans | License | -| --- | --- | :---: | :---: | :---: | --- | +| --- | --- | :---: | :---: | :---: | :---: | | **[MetricFlow](/docs/build/about-metricflow)** | MetricFlow in dbt allows users to centrally define their semantic models and metrics with YAML specifications. | ✅ | ✅ | ✅ | ✅ | BSL package (code is source available) | -| **MetricFlow Server**| A proprietary server that takes metric requests and generates optimized SQL for the specific data platform. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| -| **Semantic Layer Gateway** | A service that passes queries to the MetricFlow server and executes the SQL generated by MetricFlow against the data platform|

❌ | ❌ |✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | -| **Semantic Layer APIs** | The interfaces allow users to submit metric queries using GraphQL and JDBC APIs. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| +| **dbt Semantic interfaces**| A configuration spec for defining metrics, dimensions, how they link to each other, and how to query them. The [dbt-semantic-interfaces](https://github.com/dbt-labs/dbt-semantic-interfaces) is available under Apache 2.0. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| +| **Service layer** | Coordinates query requests and dispatching the relevant metric query to the target query engine. This is provided through dbt Cloud and is available to all users on dbt version 1.6 or later. The service layer includes a Gateway service for executing SQL against the data platform. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise) | +| **[Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview)** | The interfaces allow users to submit metric queries using GraphQL and JDBC APIs. They also serve as the foundation for building first-class integrations with various tools. | ❌ | ❌ | ✅ | ✅ | Proprietary, Cloud (Team & Enterprise)| -## Related questions +## Feature comparison -
- How do I migrate from the legacy Semantic Layer to the new one? -
-
If you're using the legacy Semantic Layer, we highly recommend you upgrade your dbt version to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated migration guide for more info.
-
-
- -
-How are you storing my data? -User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours. -
-
- Is the dbt Semantic Layer open source? -The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow.

dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud Team or Enterprise plan.

Refer to Billing for more information. -
-
- Is there a dbt Semantic Layer discussion hub? -
-
Yes absolutely! Join the dbt Slack community and #dbt-cloud-semantic-layer slack channel for all things related to the dbt Semantic Layer. -
-
-
+The following table compares the features available in dbt Cloud and source available in dbt Core: + +| Feature | MetricFlow Source available | dbt Semantic Layer with dbt Cloud | +| ----- | :------: | :------: | +| Define metrics and semantic models in dbt using the MetricFlow spec | ✅ | ✅ | +| Generate SQL from a set of config files | ✅ | ✅ | +| Query metrics and dimensions through the command line interface (CLI) | ✅ | ✅ | +| Query dimension, entity, and metric metadata through the CLI | ✅ | ✅ | +| Query metrics and dimensions through semantic APIs (ADBC, GQL) | ❌ | ✅ | +| Connect to downstream integrations (Tableau, Hex, Mode, Google Sheets, and so on.) | ❌ | ✅ | +| Create and run Exports to save metrics queries as tables in your data platform. | ❌ | Coming soon | + +## FAQs + +import SlFaqs from '/snippets/_sl-faqs.md'; + + diff --git a/website/docs/docs/use-dbt-semantic-layer/tableau.md b/website/docs/docs/use-dbt-semantic-layer/tableau.md index a5c1b6edd04..0f12a75f468 100644 --- a/website/docs/docs/use-dbt-semantic-layer/tableau.md +++ b/website/docs/docs/use-dbt-semantic-layer/tableau.md @@ -21,7 +21,8 @@ This integration provides a live connection to the dbt Semantic Layer through Ta - Note that Tableau Online does not currently support custom connectors natively. If you use Tableau Online, you will only be able to access the connector in Tableau Desktop. - Log in to Tableau Desktop (with Online or Server credentials) or a license to Tableau Server - You need your dbt Cloud host, [Environment ID](/docs/use-dbt-semantic-layer/setup-sl#set-up-dbt-semantic-layer) and [service token](/docs/dbt-cloud-apis/service-tokens) to log in. This account should be set up with the dbt Semantic Layer. -- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing) and multi-tenant [deployment](/docs/cloud/about-cloud/regions-ip-addresses). (Single-Tenant coming soon) +- You must have a dbt Cloud Team or Enterprise [account](https://www.getdbt.com/pricing). Suitable for both Multi-tenant and Single-tenant deployment. + - Single-tenant accounts should contact their account representative for necessary setup and enablement. ## Installing the Connector diff --git a/website/docs/faqs/Tests/testing-sources.md b/website/docs/faqs/Tests/testing-sources.md index 8eb769026e5..5e68b88dcbf 100644 --- a/website/docs/faqs/Tests/testing-sources.md +++ b/website/docs/faqs/Tests/testing-sources.md @@ -9,7 +9,7 @@ id: testing-sources To run tests on all sources, use the following command: ```shell -$ dbt test --select source:* + dbt test --select "source:*" ``` (You can also use the `-s` shorthand here instead of `--select`) diff --git a/website/snippets/_new-sl-setup.md b/website/snippets/_new-sl-setup.md index 3cb6e09eb4c..18e75c3278d 100644 --- a/website/snippets/_new-sl-setup.md +++ b/website/snippets/_new-sl-setup.md @@ -1,6 +1,7 @@ You can set up the dbt Semantic Layer in dbt Cloud at the environment and project level. Before you begin: -- You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon. +- You must have a dbt Cloud Team or Enterprise account. Suitable for both Multi-tenant and Single-tenant deployment. + - Single-tenant accounts should contact their account representative for necessary setup and enablement. 
- You must be part of the Owner group, and have the correct [license](/docs/cloud/manage-access/seats-and-users) and [permissions](/docs/cloud/manage-access/self-service-permissions) to configure the Semantic Layer: * Enterprise plan — Developer license with Account Admin permissions. Or Owner with a Developer license, assigned Project Creator, Database Admin, or Admin permissions. * Team plan — Owner with a Developer license. diff --git a/website/snippets/_sl-connect-and-query-api.md b/website/snippets/_sl-connect-and-query-api.md index 429f41c3bf6..f7f1d2add24 100644 --- a/website/snippets/_sl-connect-and-query-api.md +++ b/website/snippets/_sl-connect-and-query-api.md @@ -1,10 +1,8 @@ You can query your metrics in a JDBC-enabled tool or use existing first-class integrations with the dbt Semantic Layer. -You must have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon. - +- You must have a dbt Cloud Team or Enterprise account. Suitable for both Multi-tenant and Single-tenant deployment. + - Single-tenant accounts should contact their account representative for necessary setup and enablement. - To learn how to use the JDBC or GraphQL API and what tools you can query it with, refer to [dbt Semantic Layer APIs](/docs/dbt-cloud-apis/sl-api-overview). - * To authenticate, you need to [generate a service token](/docs/dbt-cloud-apis/service-tokens) with Semantic Layer Only and Metadata Only permissions. * Refer to the [SQL query syntax](/docs/dbt-cloud-apis/sl-jdbc#querying-the-api-for-metric-metadata) to query metrics using the API. - - To learn more about the sophisticated integrations that connect to the dbt Semantic Layer, refer to [Available integrations](/docs/use-dbt-semantic-layer/avail-sl-integrations) for more info. diff --git a/website/snippets/_sl-faqs.md b/website/snippets/_sl-faqs.md new file mode 100644 index 00000000000..5bc556ae00a --- /dev/null +++ b/website/snippets/_sl-faqs.md @@ -0,0 +1,28 @@ +- **Is the dbt Semantic Layer open source?** + - The dbt Semantic Layer is proprietary; however, some components of the dbt Semantic Layer are open source, such as dbt-core and MetricFlow. + + dbt Cloud Developer or dbt Core users can define metrics in their project, including a local dbt Core project, using the dbt Cloud IDE, dbt Cloud CLI, or dbt Core CLI. However, to experience the universal dbt Semantic Layer and access those metrics using the API or downstream tools, users must be on a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) plan. + + Refer to [Billing](https://docs.getdbt.com/docs/cloud/billing) for more information. + +- **How can open-source users use the dbt Semantic Layer?** + - The dbt Semantic Layer requires the use of the dbt Cloud-provided service for coordinating query requests. Open source users who don’t use dbt Cloud can currently work around the lack of a service layer. They can do this by running `mf query --explain` in the command line. This command generates SQL code, which they can then use in their current systems for running and managing queries. + + As we refine MetricFlow’s API layers, some users may find it easier to set up their own custom service layers for managing query requests. 
This is not currently recommended, as the API boundaries around MetricFlow are not sufficiently well-defined for broad-based community use.
+
+- **Can I reference MetricFlow queries inside dbt models?**
+  - dbt relies on Jinja macros to compile SQL, while MetricFlow is Python-based and does direct SQL rendering targeted at a specific dialect. MetricFlow does not support pass-through rendering of Jinja macros, so we can’t easily reference MetricFlow queries inside of dbt models.
+
+    Beyond the technical challenges that could be overcome, we see Metrics as the leaf node of your DAG, and a place for users to consume metrics. If you need to do additional transformation on top of a metric, this is usually a sign that there is more modeling that needs to be done.
+
+- **Can I create tables in my data platform using MetricFlow?**
+  - You can use the upcoming feature, Exports, which will allow you to create a [pre-defined](/docs/build/saved-queries) MetricFlow query as a table in your data platform. This feature will be available to dbt Cloud customers only. This is because MetricFlow is primarily for query rendering, while dispatching the relevant query and performing any DDL is the domain of the service layer on top of MetricFlow.
+
+- **How do I migrate from the legacy Semantic Layer to the new one?**
+  - If you're using the legacy Semantic Layer, we highly recommend you [upgrade your dbt version](/docs/dbt-versions/upgrade-core-in-cloud) to dbt v1.6 or higher to use the new dbt Semantic Layer. Refer to the dedicated [migration guide](/guides/sl-migration) for more info.
+
+- **How are you storing my data?**
+  - User data passes through the Semantic Layer on its way back from the warehouse. dbt Labs ensures security by authenticating through the customer's data warehouse. Currently, we don't cache data for the long term, but it might temporarily stay in the system for up to 10 minutes, usually less. In the future, we'll introduce a caching feature that allows us to cache data on our infrastructure for up to 24 hours.
+
+- **Is there a dbt Semantic Layer discussion hub?**
+  - Yes, absolutely! Join the [dbt Slack community](https://getdbt.slack.com) and the [#dbt-cloud-semantic-layer Slack channel](https://getdbt.slack.com/archives/C046L0VTVR6) for all things related to the dbt Semantic Layer.
diff --git a/website/snippets/_sl-plan-info.md b/website/snippets/_sl-plan-info.md
index 083ab2209bc..fe4e6024226 100644
--- a/website/snippets/_sl-plan-info.md
+++ b/website/snippets/_sl-plan-info.md
@@ -1,2 +1,2 @@
-To define and query metrics with the {props.product}, you must be on a {props.plan} multi-tenant plan .


+To define and query metrics with the {props.product}, you must be on a {props.plan} account, available for both Multi-tenant and Single-tenant deployments. Note: Single-tenant accounts should contact their account representative for the necessary setup and enablement.

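The `_sl-faqs.md` snippet added above points open source users at `mf query --explain` as the current workaround for the missing service layer. A minimal sketch of that flow with the MetricFlow CLI, assuming the project defines an `order_total` metric with a `metric_time` dimension (both names are illustrative):

```shell
# List the metrics MetricFlow can see in the current project
mf list metrics

# Render the SQL for a metric query without running it against the warehouse;
# --explain prints the generated SQL so it can be reused in other systems
mf query --metrics order_total --group-by metric_time --limit 10 --explain
```

Dropping `--explain` runs the same query against the configured data platform instead of only printing the SQL.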
diff --git a/website/snippets/_v2-sl-prerequisites.md b/website/snippets/_v2-sl-prerequisites.md
index c80db4d1c8f..6a9babcf0e0 100644
--- a/website/snippets/_v2-sl-prerequisites.md
+++ b/website/snippets/_v2-sl-prerequisites.md
@@ -1,15 +1,16 @@
-- Have a dbt Cloud Team or Enterprise [multi-tenant](/docs/cloud/about-cloud/regions-ip-addresses) deployment. Single-tenant coming soon.
-- Have both your production and development environments running dbt version 1.6 or higher. Refer to [upgrade in dbt Cloud](/docs/dbt-versions/upgrade-core-in-cloud) for more info.
+- Have a dbt Cloud Team or Enterprise account, available for both Multi-tenant and Single-tenant deployments.
+  - Note: Single-tenant accounts should contact their account representative for the necessary setup and enablement.
+- Have both your production and development environments running [dbt version 1.6 or higher](/docs/dbt-versions/upgrade-core-in-cloud).
 - Use Snowflake, BigQuery, Databricks, or Redshift.
 - Create a successful run in the environment where you configure the Semantic Layer.
   - **Note:** Semantic Layer currently supports the Deployment environment for querying. (_development querying experience coming soon_)
 - Set up the [Semantic Layer API](/docs/dbt-cloud-apis/sl-api-overview) in the integrated tool to import metric definitions.
-  - To access the API and query metrics in downstream tools, you must have a dbt Cloud [Team or Enterprise](https://www.getdbt.com/pricing/) account. dbt Core or Developer accounts can define metrics but won't be able to dynamically query them.
+ - dbt Core or Developer accounts can define metrics but won't be able to dynamically query them.
- Understand [MetricFlow's](/docs/build/about-metricflow) key concepts, which powers the latest dbt Semantic Layer.
-- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections, [PrivateLink](/docs/cloud/secure/about-privatelink), and [Single sign-on (SSO)](/docs/cloud/manage-access/sso-overview) isn't supported yet.
+- Note that SSH tunneling for [Postgres and Redshift](/docs/cloud/connect-data-platform/connect-redshift-postgresql-alloydb) connections, [PrivateLink](/docs/cloud/secure/about-privatelink), and [Single sign-on (SSO)](/docs/cloud/manage-access/sso-overview) isn't supported in the dbt Semantic Layer yet.
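The prerequisites in `_v2-sl-prerequisites.md` above amount to a dbt version check plus semantic models that parse cleanly. One way to sanity-check both locally before configuring the Semantic Layer, sketched assuming dbt Core 1.6 or later with the MetricFlow CLI (the `dbt-metricflow` package) installed:

```shell
# Confirm both environments run dbt 1.6 or higher
dbt --version

# Parse the project so dbt writes the semantic manifest that MetricFlow reads
dbt parse

# Validate the semantic models and metrics defined in the project
mf validate-configs
```

In dbt Cloud, the equivalent checks happen through the environment settings and a successful run in the environment where you configure the Semantic Layer, as described in `_new-sl-setup.md`.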
diff --git a/website/src/components/communitySpotlightCard/index.js b/website/src/components/communitySpotlightCard/index.js index 08707a93dd4..122edee8f06 100644 --- a/website/src/components/communitySpotlightCard/index.js +++ b/website/src/components/communitySpotlightCard/index.js @@ -1,5 +1,6 @@ import React from 'react' import Link from '@docusaurus/Link'; +import Head from "@docusaurus/Head"; import styles from './styles.module.css'; import imageCacheWrapper from '../../../functions/image-cache-wrapper'; @@ -47,24 +48,45 @@ function CommunitySpotlightCard({ frontMatter, isSpotlightMember = false }) { jobTitle, companyName, organization, - socialLinks + socialLinks, + communityAward } = frontMatter - return ( - + // Get meta description text + const metaDescription = stripHtml(description) + + return ( + + {isSpotlightMember && metaDescription ? ( + + + + + ) : null} + {communityAward ? ( +
+ Community Award Recipient +
+ ) : null} {image && (
{id && isSpotlightMember ? ( - {title} + {title} ) : ( - - {title} + + {title} )}
@@ -72,19 +94,26 @@ function CommunitySpotlightCard({ frontMatter, isSpotlightMember = false }) {
{!isSpotlightMember && id ? (

- {title} + + {title} +

- ) : ( + ) : (

{title}

)} - {pronouns &&
{pronouns}
} - + {pronouns && ( +
{pronouns}
+ )} + {isSpotlightMember && (
{(jobTitle || companyName) && (
{jobTitle && jobTitle} - {jobTitle && companyName && ', '} + {jobTitle && companyName && ", "} {companyName && companyName}
)} @@ -101,7 +130,10 @@ function CommunitySpotlightCard({ frontMatter, isSpotlightMember = false }) {
)} {description && !isSpotlightMember && ( -

+

)} {socialLinks && isSpotlightMember && socialLinks?.length > 0 && (

@@ -109,8 +141,15 @@ function CommunitySpotlightCard({ frontMatter, isSpotlightMember = false }) { <> {item?.name && item?.link && ( <> - {i !== 0 && ' | '} - {item.name} + {i !== 0 && " | "} + + {item.name} + )} @@ -118,29 +157,33 @@ function CommunitySpotlightCard({ frontMatter, isSpotlightMember = false }) {
)} {id && !isSpotlightMember && ( - Read More + > + Read More + )}
{description && isSpotlightMember && (

About

-

- +

)}
- ) + ); } -// Truncate text +// Truncate description text for community member cards function truncateText(str) { // Max length of string let maxLength = 300 - // Check if anchor link starts within first 300 characters + // Check if anchor link starts within maxLength let hasLinks = false if(str.substring(0, maxLength - 3).match(/(?:]+)>)/gi, "") + + // Strip new lines and return 130 character substring for description + const updatedDesc = strippedHtml + ?.substring(0, maxLength) + ?.replace(/(\r\n|\r|\n)/g, ""); + + return desc?.length > maxLength ? `${updatedDesc}...` : updatedDesc +} + export default CommunitySpotlightCard diff --git a/website/src/components/communitySpotlightCard/styles.module.css b/website/src/components/communitySpotlightCard/styles.module.css index 253a561ebea..5df85c8a4cc 100644 --- a/website/src/components/communitySpotlightCard/styles.module.css +++ b/website/src/components/communitySpotlightCard/styles.module.css @@ -15,6 +15,19 @@ header.spotlightMemberCard { div.spotlightMemberCard { margin-bottom: 2.5rem; } +.spotlightMemberCard .awardBadge { + flex: 0 0 100%; + margin-bottom: .5rem; +} +.spotlightMemberCard .awardBadge span { + max-width: fit-content; + color: #fff; + background: var(--ifm-color-primary); + display: block; + border-radius: 1rem; + padding: 5px 10px; + font-size: .7rem; +} .spotlightMemberCard .spotlightMemberImgContainer { flex: 0 0 100%; } @@ -81,6 +94,9 @@ div.spotlightMemberCard { margin-bottom: 0; padding-left: 0; } + .spotlightMemberCard .awardBadge span { + font-size: .8rem; + } .spotlightMemberCard .spotlightMemberImgContainer { flex: 0 0 346px; margin-right: 2rem; @@ -100,7 +116,3 @@ div.spotlightMemberCard { line-height: 2rem; } } - - - - diff --git a/website/src/components/communitySpotlightList/index.js b/website/src/components/communitySpotlightList/index.js index 6885f5ff2ac..ed0dbf6d653 100644 --- a/website/src/components/communitySpotlightList/index.js +++ b/website/src/components/communitySpotlightList/index.js @@ -36,6 +36,7 @@ function CommunitySpotlightList({ spotlightData }) { {metaTitle} +