... apart from the performance impact?
In our situation, the data volumes will be manageable, but the complexity may not be.
Background:
I worked on a project where the data marts were fully loaded each day.
This simplified the ETL because no delta processing was needed, and for the same reason performance was acceptable. However, I am not sure whether this approach is always suitable, or whether there are downsides: for example, if an end user tells us the data has been 'wrong' for four days, it would be difficult to trace that back.
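For context, the daily load was essentially a truncate-and-reload. The sketch below (table and column names are invented, not our real schema) captures the idea, and why a report of "wrong since four days" is hard to trace once the previous state has been overwritten:

```python
# Minimal sketch of a nightly truncate-and-reload (names are illustrative only).
import sqlite3

def full_reload(conn: sqlite3.Connection, source_rows: list[tuple]) -> None:
    """Replace the entire data mart table with today's extract."""
    with conn:  # one transaction: either the whole reload lands or nothing does
        conn.execute("DELETE FROM dm_customer")          # wipe yesterday's state
        conn.executemany(
            "INSERT INTO dm_customer (customer_id, segment) VALUES (?, ?)",
            source_rows,
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dm_customer (customer_id INTEGER, segment TEXT)")
full_reload(conn, [(1, "retail"), (2, "wholesale")])
# After the next reload, yesterday's rows are gone, so if a user reports the
# data has been wrong "for four days", there is no prior state left to inspect.
```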
Thanks for your input
Depending on the business requirements, it is a perfectly acceptable strategy.
One thing you will lose is the ability to show the history of slowly changing dimensions. If this is not important to your business, don't worry about it.
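To make that concrete, here is a rough sketch (plain Python with invented names, not production ETL code) of what a Type 2 slowly changing dimension keeps and a truncate-and-reload discards:

```python
# A Type 2 dimension keeps one row per version of a member; a full reload keeps
# only the latest version. Names and dates are invented for illustration.
from datetime import date

def scd2_apply(dim_rows: list[dict], key: int, new_segment: str, today: date) -> None:
    """Close the current row for `key` and append a new version (Type 2)."""
    for row in dim_rows:
        if row["customer_id"] == key and row["valid_to"] is None:
            row["valid_to"] = today                      # expire the old version
    dim_rows.append(
        {"customer_id": key, "segment": new_segment,
         "valid_from": today, "valid_to": None}          # new current version
    )

dim = [{"customer_id": 1, "segment": "retail",
        "valid_from": date(2023, 1, 1), "valid_to": None}]
scd2_apply(dim, key=1, new_segment="wholesale", today=date(2023, 6, 1))
print(len(dim))  # 2 rows: the change is preserved; a full reload would leave 1
```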
At Ajilius we have a customer in the high fashion industry who reloads their data warehouse on an hourly basis. They need near real-time visualisation of their product planning cycle, which is seasonal; any data can change at any time, and there is no long-term history requirement.
A more common case is where the source DBMS has no change data capture capability. You'll often see full reloads of facts and dimensions in this circumstance. Full fact reloads are less common - you usually have a date or timestamp to govern an extract - but full dimension reloads happen quite often.
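For example, a fact extract is usually driven by a watermark timestamp, while a dimension with no CDC and no reliable modified-date column just gets pulled in full. A rough sketch, with hypothetical table and column names:

```python
# Contrast: the fact source has a transaction timestamp to drive an incremental
# extract; the dimension source does not, so it is reloaded in full.
from datetime import datetime

def build_fact_extract(last_load: datetime) -> tuple[str, tuple]:
    """Incremental fact extract: only rows newer than the last successful load."""
    sql = ("SELECT order_id, customer_id, amount, created_at "
           "FROM src_orders WHERE created_at > ?")
    return sql, (last_load.isoformat(),)

def build_dim_extract() -> tuple[str, tuple]:
    """No change data capture and no reliable timestamp: take everything."""
    return "SELECT customer_id, name, segment FROM src_customer", ()

fact_sql, params = build_fact_extract(datetime(2024, 5, 1, 2, 0))
dim_sql, _ = build_dim_extract()
print(fact_sql)   # incremental, governed by the watermark
print(dim_sql)    # full reload of the dimension
```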