Commit 9fec70b

add _learn
1 parent b46444c commit 9fec70b

755 files changed

Lines changed: 15601 additions & 0 deletions

Lines changed: 4 additions & 0 deletions
@@ -0,0 +1,4 @@
---
layout: learn_subcategory
is_track_index: true
---
Lines changed: 25 additions & 0 deletions
@@ -0,0 +1,25 @@
---
layout: learn_article
date: 2025-02-28
image: /_learn/images/cheat-sheets/powerbi-to-metabase-cheat-sheet.png
categories: Cheat Sheets
author: The Metabase Team
---
<a href="/files/cheatsheets/metabase-for-powerbi-users.pdf" target="_blank">
  <img
    src="../../images/cheat-sheets/powerbi-to-metabase-cheat-sheet.png"
    alt="Power BI to Metabase cheat sheet"
    style="max-width: 100%; height: auto;"
  />
</a>

<div style="text-align: left; margin-top: 20px;">
  <a
    href="/files/cheatsheets/metabase-for-powerbi-users.pdf"
    class="Button btn btn-primary"
    style="padding: 10px 20px; font-size: 16px;"
    download
  >
    Download PDF
  </a>
</div>
Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
---
layout: learn_article
date: 2024-12-03
image: /_learn/images/cheat-sheets/tableau-to-metabase-cheat-sheet.png
categories: Cheat Sheets
author: The Metabase Team
---

<a href="/files/cheatsheets/metabase-for-tableau-users.pdf" target="_blank">
  <img
    src="../../images/cheat-sheets/tableau-to-metabase-cheat-sheet.png"
    alt="Tableau to Metabase cheat sheet"
    style="max-width: 100%; height: auto;"
  />
</a>

<div style="text-align: left; margin-top: 20px;">
  <a
    href="/files/cheatsheets/metabase-for-tableau-users.pdf"
    class="Button btn btn-primary"
    style="padding: 10px 20px; font-size: 16px;"
    download
  >
    Download PDF
  </a>
</div>
Lines changed: 6 additions & 0 deletions
@@ -0,0 +1,6 @@
---
layout: learn_subcategory
is_track_index: true
redirect_from:
- /learn/analytics
---
Lines changed: 58 additions & 0 deletions
@@ -0,0 +1,58 @@
---
layout: learn_article
date: 2015-12-22 16:10:58
categories: jekyll update
image: /images/twitter/default.png
author: Sameer Al-Sakran
redirect_from:
- /learn/data-diet/analytics/analytics-mistakes
- /learn/analytics/analytics-mistakes
---

This article covers ten common mistakes startups make when applying data to decision-making. These are situations that many well-intentioned people will find themselves in. The goal in going over them is not to get down on yourself, but to reinforce the need to be self-aware about your (and your organization's) decision-making processes and to keep improving them. We cover more on decision-making in [Managing your information budget](information-budget).

## 1. Mixing up correlation and causation

Yes, you already know about it, but it's still really easy to fall for. Mixing up correlation and causation is especially dangerous when [exploring historical data](/learn/metabase-basics/querying-and-dashboards/time-series/start), or whenever you don't have a clear hypothesis to falsify. It's best to treat any common patterns in past data as _suggestive_ of causation, rather than as a cause, until proven otherwise.

## 2. Expecting data to give you answers to questions you can't formulate

Too many companies think they can just collect data, use the latest trendy technologies, and hire expensive data scientists/analysts/MBAs to figure out their business. The reality is that the quality of your business intelligence is directly proportional to how well your organization can articulate the questions it needs to answer. More data and talented data analysts can supercharge an organization that has a clear decision-making and product process, but doubling down on big data won't be the miracle that saves an organization that lacks focus.

## 3. Looking for data to support a decision you already made

It's common to go through the motions of collecting data, analyzing it, and coming to a decision when you (or others on the team) have already made up your mind. Instead, you should formulate hypotheses, see if you can falsify them, and update your perspective when the data goes the other way.

## 4. Fishing for the positive

A subset of looking for data to support a decision: looking for data to support a rosy picture. There's always something that is trending upwards, even in terminally ill companies. If all the metrics you think are important are going south, avoid the temptation to seek out metrics that tell a sunnier story.

## 5. Expecting too much clarity in results

Even if you've seen plenty of action movies, when you watch an _actual_ boxing match, you might not be able to make much sense of it. If you're used to seeing choreographed fight sequences, shot from perfect angles with perfect lighting and editing, the chaos and speed of real-life fighting can be bewildering.

The same goes for people accustomed to MBA coursework or highly idealized blog posts when they encounter quantitative decision-making in real life. In the real world, effects can be small, messy, and multimodal (we'll talk about the pitfalls of averages below). You'll need to work with the data you have, not the perfect data you imagine.

## 6. Expecting to A/B test your way to success

While carefully planned and well-run A/B tests can be transformative for a company, they also often lead to chasing one's tail. Make sure you know what a significant result is _before_ you start an A/B test. Don't stop the test the instant one of the options seems to be performing better, and always include a control group. And the smaller the effect, the larger the number of users you'll need. If you only have 10k monthly active users, you're better off simply delaying any kind of A/B testing until you have more people you can test against.

Furthermore, A/B testing won't determine the best product features or advertising copy for you. The results are only as good as the options that you test, and the results are very sensitive to how good the initial design is. Don't let "we'll A/B test that" become a mantra that shuts down the process of deciding what your product actually is. A/B tests are best used to add that last bit of polish.
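
To make the "know your sample size up front" point concrete, here's a rough back-of-the-envelope sketch (not from this article, and not a Metabase feature) using the standard normal-approximation formula for comparing two conversion rates. The baseline rate and the hoped-for lift below are made-up numbers.

```python
import math

def sample_size_per_group(p_baseline, p_variant, z_alpha=1.96, z_power=0.8416):
    """Approximate users needed per group to detect a move from p_baseline to
    p_variant at a 5% significance level with 80% power (normal approximation)."""
    variance = p_baseline * (1 - p_baseline) + p_variant * (1 - p_variant)
    effect = abs(p_variant - p_baseline)
    return math.ceil((z_alpha + z_power) ** 2 * variance / effect ** 2)

# Hypothetical test: a 5% signup rate that you hope to lift to 6%.
print(sample_size_per_group(0.05, 0.06))  # a bit over 8,000 users per group, so ~16k total
```

With around 10k monthly active users, a single test like this would take well over a month to reach significance, which is why waiting is often the better call.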

## 7. Using the wrong time period

If your customers purchase on a multi-month time frame, and your product cycle moves in two-week sprints, you don't need real-time analytics. Likewise, if you're trying to diagnose errors in network operations where the cost of being down is measured in tens of millions a minute, you'd better not be looking at hourly charts. It's important to tie the reporting time period to the natural time period of your decision-making. If you're looking at your data at too fine a granularity, you'll end up being twitchy and thrashing between decisions. If you're using too large a time period, you'll forever be three moves behind.
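
As a small, hypothetical illustration of matching the reporting period to the decision-making period, the pandas sketch below rolls the same invented signup events up to different granularities; nothing here is specific to Metabase.

```python
import numpy as np
import pandas as pd

# Invented daily signup counts over one quarter.
rng = np.random.default_rng(0)
days = pd.date_range("2025-01-01", periods=90, freq="D")
signups = pd.Series(rng.poisson(40, size=len(days)), index=days)

daily = signups                           # noisy; invites twitchy reactions
weekly = signups.resample("W").sum()      # a reasonable match for a two-week sprint cadence
monthly = signups.resample("MS").sum()    # probably too coarse if you adjust course every sprint
```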

## 8. Only looking at averages

Averages are a great place to hide uncomfortable truths. If you only use blended averages across organic and paid channels, you might end up ignoring the fact that your paid acquisition channels are becoming unsustainably expensive. If you look at average latency across all your web pages, you might not notice that your most important pages are getting slower over time. As a rule, when averages tell you something is getting worse, it's time to worry. When averages tell you things are looking good, it's time to dig deeper.

Most average-inspired delusions go away when you break the data out into a [histogram](/learn/visualization/histograms). For example, rosy projections regarding the average cost of customer acquisition can disappear when you break the costs out by channel.
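
As a minimal, invented example of that per-channel breakout, here's what it looks like in pandas (the figures and column names are illustrative only):

```python
import pandas as pd

# Invented per-customer acquisition costs.
costs = pd.DataFrame({
    "channel": ["organic", "organic", "paid_search", "paid_search", "paid_social"],
    "cost_per_customer": [2.0, 3.0, 80.0, 95.0, 120.0],
})

blended = costs["cost_per_customer"].mean()                        # 60.0 -- looks tolerable
by_channel = costs.groupby("channel")["cost_per_customer"].mean()  # organic: 2.5, paid_search: 87.5, paid_social: 120.0
# The blended number hides the fact that paid channels cost 35-50x what organic does.
```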

## 9. Focusing on totals instead of the rate of change

Everyone loves charts that go up and to the right. "Total number of signups", "Cumulative revenue", and "Total value of goods sold" can make for good press, but for most situations, you should be looking at the [rate of change](/learn/metabase-basics/querying-and-dashboards/time-series/time-series-comparisons), and possibly even the growth in that rate. If 95% of the information a metric carries relates to events that happened months or years ago, does that help you evaluate how you're doing today? Or how things will look tomorrow?
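
As a quick, hypothetical illustration of why the rate of change is more informative than a running total, consider this pandas sketch (the numbers are invented):

```python
import pandas as pd

# Invented monthly signup counts.
signups = pd.Series(
    [900, 950, 1000, 1010, 1015],
    index=pd.period_range("2025-01", periods=5, freq="M"),
)

cumulative = signups.cumsum()   # 900, 1850, 2850, 3860, 4875 -- always up and to the right
growth = signups.pct_change()   # ~5.6%, ~5.3%, 1.0%, ~0.5% -- growth is clearly stalling
acceleration = growth.diff()    # the change in the growth rate itself
```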

## 10. Not evaluating the results of a decision

It's common to want to collect lots of information before making a big decision. But once you make the decision and the results start to trickle in, it's just as common to assume things are going well. Bad decisions are inevitable; it's the bad decisions you don't acknowledge and correct that end up hurting you the most. It's better to learn right away that you made the wrong call than to find out you've been wrong for months.
Lines changed: 39 additions & 0 deletions
@@ -0,0 +1,39 @@
---
layout: learn_article
date: 2022-01-17
categories: guide
image: /images/twitter/default.png
author: The Metabase Team
redirect_from:
- /learn/analytics/avoiding-data-jargon
---

It's easy to fall into the trap of using the word "data" and other analytics jargon as ambiguous placeholders for what you really mean, like a dumping ground for miscellaneous information you haven't quite figured out how to articulate.

"Data" means different things to different audiences. Your engineering, legal, and marketing colleagues all have different ideas of what data is, and they're all correct. To cut through this confusion, it's important to be specific when talking about the fields and rows that make up your databases and the information you need to validate decisions. Sticking with straightforward and precise language will help your team form a clearer idea of what information you already have, what questions you're looking to answer, and how answering those questions will help people make progress toward your goals. And the better people understand your new analytics strategy, the more interested they will be, and the more they'll contribute to it.

There's a certain irony to the guidance we're providing: yes, this is all _broad_ advice about how you should be _specific_ when figuring out how analytics should work at your organization, but stay with us here. And no, you probably won't stop using the word "data" entirely, but you can try to avoid ambiguous analytics corporate-speak whenever possible.

## Create specific language and shared definitions

Most people at your organization have a strong mental model for how the "data" all fits together, even if they wouldn't think so themselves. They draw on that mental model whenever they do their job, whether that's in sales, marketing, or another department. Mapping these mental models of the business to information that you can analyze is a fundamental step in getting analytics up and running: figuring out how the organization defines "customers," and what qualifies as an active customer, a returning customer, a good customer, or a churned customer. Going from thinking about how your business works to building instrumentation that validates that story is a challenge for any growing organization, so try to think about how people can translate these mental models into data points that can be captured and scrutinized; literally, what is it that you want to count, average, or sum, and how?

Working across departments to develop these shared definitions gives people a stake in the process, and ensures that analytics at your organization [stay organized](/learn/metabase-basics/administration/administration-and-operation/same-page). And in the future, when you talk about active customers, the people around you will know what that term means, and they can use your analysis of what those active customers are doing to inform their decisions.

Simply put, people will get behind your BI strategy if they understand what you're talking about.

## Don't lean on "data" or other analytics jargon: articulate what you really mean

If you find yourself talking about unspecific "data," take that as a signal that you may be confused and need to clarify your strategy. Your organization already collects some kind of information, and ambiguous terms like "insights," "analysis," and "data" all correspond to some real-life action or meaning. Identifying the meaning behind the information you already have (what you already collect) is a good starting point. For example, "data" could mean page views, deals, contracts, likes, installs, or people. When you talk about "analysis," be specific about the operations involved, like averaging, grouping by date, filtering, or comparing different time periods. If you know these meanings, use them! These unclear buzzwords always refer to _something_, so talk about that something.

Once you've identified what you already have, keep up that specificity when thinking about what else you'd like to capture, if anything. Here's an example:

**Vague**: "Let's collect more data about the success of our customer trials."

**Better**: "In the last 30 days, what percentage of customers who started with a free trial converted to paying for our service? For those that didn't, did they complete their profiles? Did they contact us, and did we respond within the timeframe listed in our service agreement? What issues did customers contact us about the most?"

This second example asks more questions, and more precise ones, and pinpoints exactly what new information you're looking for and what problems having it would solve. This specificity is part of effective storytelling, whether you're proposing or developing a new system, building useful dashboards, or presenting findings. You'll need to think critically about the current market and the problems your business is trying to solve, but sharpening these skills is foundational to crafting a strong business intelligence strategy.
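
To show how the "better" questions translate into something you can actually compute, here's a minimal pandas sketch. The table and column names (`trial_started_at`, `converted_to_paid`, `profile_completed`) are hypothetical stand-ins, not a reference to any real schema.

```python
import pandas as pd

# Hypothetical trials table with invented values.
trials = pd.DataFrame({
    "customer_id": [1, 2, 3, 4],
    "trial_started_at": pd.to_datetime(["2025-02-01", "2025-02-03", "2025-02-10", "2025-02-20"]),
    "converted_to_paid": [True, False, True, False],
    "profile_completed": [True, True, False, False],
})

as_of = pd.Timestamp("2025-03-01")
last_30_days = trials[trials["trial_started_at"] >= as_of - pd.Timedelta(days=30)]

conversion_rate = last_30_days["converted_to_paid"].mean()             # 0.5
non_converters = last_30_days[~last_30_days["converted_to_paid"]]
profile_completion_rate = non_converters["profile_completed"].mean()   # 0.5
```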

## Prefer technical vocabulary to buzzwords

You don't need to discard every BI-related word in your vocabulary. Phrases and acronyms like [data warehouse](/glossary/data_warehouse), [ETL](/glossary/etl), and [OLAP](/glossary/olap) can be confusing to a newcomer, but they do have specific, industry-wide definitions. That doesn't mean you should sprinkle confusing vocabulary into a company-wide presentation without explanation, but do use these terms when you need them. And if you find that some knowledge of this vocabulary is fundamental to getting everyone on board with the analytics strategy, offering low-stakes training on concepts like [segments](/glossary/segment), [measures](/glossary/measure), [dimensions](/glossary/dimension), or [metadata](/glossary/metadata) could be helpful for your team.
