I am trying to provide an analysis of the number of lapsed and retained customers per month on a bar graph.
There are two tables in play, 'Master File' and 'Calendar Table'. The two tables have an established relationship based on their respective date columns. This is what the graph and slicer look like -
The Lapsed customer count comes from the formula below:
Retention Analysis - Num Lapsed =
// Customers active in the month three months before the current month
VAR Before6 =
    CALCULATETABLE (
        VALUES ( 'Master File'[UniqueIDFinal] ),
        PARALLELPERIOD ( 'Calendar Table'[Date], -3, MONTH )
    )
// Customers active in the three months ending at the end of the month
// of the earliest sale in the current filter context
VAR During6 =
    CALCULATETABLE (
        VALUES ( 'Master File'[UniqueIDFinal] ),
        DATESINPERIOD (
            'Calendar Table'[Date],
            EOMONTH ( MIN ( 'Master File'[Sale Date II] ), 0 ),
            -3,
            MONTH
        )
    )
RETURN
    // Count customers who appear in Before6 but not in During6 (i.e. lapsed customers),
    // then negate the result so the lapsed count shows as a negative value
    CALCULATE (
        DISTINCTCOUNT ( 'Master File'[UniqueIDFinal] ),
        FILTER (
            ALL ( 'Master File' ),
            'Master File'[UniqueIDFinal] IN Before6
                && NOT 'Master File'[UniqueIDFinal] IN During6
        )
    ) * -1
The Retained customer count is calculated by subtracting the Lapsed customer count from the number of customers served in a given month.
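Roughly, the Retained measure is along these lines (sketch only; the DISTINCTCOUNT of 'Master File'[UniqueIDFinal] is an assumption for the customers-served figure, and the Lapsed measure is already negated, so adding it back performs the subtraction):

Retention Analysis - Num Retained =
VAR CustomersServed =
    DISTINCTCOUNT ( 'Master File'[UniqueIDFinal] )    // assumed: distinct customers served in the month
RETURN
    // the Lapsed measure returns a negative value, so adding it subtracts the lapsed count
    CustomersServed + [Retention Analysis - Num Lapsed]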
Lapsed customers only begin to appear 3 months after the start of the date range in the slicer, which is how the formula is supposed to work.
Is there a way I can make the Retained customer count start 3 months after the start of the date range as well?
I realise that both formulae are working as expected. However, I would like the Retained customer count to also start 3 months after the beginning of the date range, since the Retained count should only come into play once customers start lapsing (3 months after the start of the date range, as per the Lapsed formula).
Thanks in advance.
Just as a best practice, your slicer should be filtering on the calendar dimension table.
As for your actual question, you could wrap your Retained measure in an IF() statement:
IF (
    ISBLANK ( [Retention Analysis - Num Lapsed] ),
    BLANK (),
    /* current measure */
)
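For instance, assuming your existing retained logic lives in a measure called [Retention Analysis - Num Retained] (name assumed here for illustration), the wrapped version would look something like this:

Retention Analysis - Num Retained (Adjusted) =
IF (
    ISBLANK ( [Retention Analysis - Num Lapsed] ),    // no lapsed value in the first 3 months of the range
    BLANK (),
    [Retention Analysis - Num Retained]               // your existing Retained expression or measure
)

Since the Lapsed measure only returns a value from the fourth month of the slicer range onwards, this keeps the Retained bars blank for the first three months as well.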