I have been looking at all the questions/answers about how to drop consecutive duplicates selectively in a pandas DataFrame, but I still cannot figure out the following scenario:
import pandas as pd
import numpy as np

def random_dates(start, end, n, freq, seed=None):
    if seed is not None:
        np.random.seed(seed)
    dr = pd.date_range(start, end, freq=freq)
    return pd.to_datetime(np.sort(np.random.choice(dr, n, replace=False)))

date = random_dates('2018-01-01', '2018-01-12', 20, 'H', seed=[3, 1415])
data = {'Timestamp': date,
        'Message': ['Message received.', 'Sending...', 'Sending...', 'Sending...', 'Work in progress...', 'Work in progress...',
                    'Message received.', 'Sending...', 'Sending...', 'Work in progress...',
                    'Message received.', 'Sending...', 'Sending...', 'Sending...', 'Work in progress...', 'Work in progress...', 'Work in progress...',
                    'Message received.', 'Sending...', 'Sending...']}
df = pd.DataFrame(data, columns=['Timestamp', 'Message'])
I have the following dataframe:
Timestamp Message
0 2018-01-02 03:00:00 Message received.
1 2018-01-02 11:00:00 Sending...
2 2018-01-03 04:00:00 Sending...
3 2018-01-04 11:00:00 Sending...
4 2018-01-04 16:00:00 Work in progress...
5 2018-01-04 17:00:00 Work in progress...
6 2018-01-05 05:00:00 Message received.
7 2018-01-05 11:00:00 Sending...
8 2018-01-05 17:00:00 Sending...
9 2018-01-06 02:00:00 Work in progress...
10 2018-01-06 14:00:00 Message received.
11 2018-01-07 07:00:00 Sending...
12 2018-01-07 20:00:00 Sending...
13 2018-01-08 01:00:00 Sending...
14 2018-01-08 02:00:00 Work in progress...
15 2018-01-08 15:00:00 Work in progress...
16 2018-01-09 00:00:00 Work in progress...
17 2018-01-10 03:00:00 Message received.
18 2018-01-10 09:00:00 Sending...
19 2018-01-10 14:00:00 Sending...
I want to drop the consecutive duplicates in the df['Message'] column ONLY when 'Message' is 'Work in progress...', keeping the first instance of each run (here, e.g., indices 5, 15 and 16 need to be dropped). Ideally I would like to get:
Timestamp Message
0 2018-01-02 03:00:00 Message received.
1 2018-01-02 11:00:00 Sending...
2 2018-01-03 04:00:00 Sending...
3 2018-01-04 11:00:00 Sending...
4 2018-01-04 16:00:00 Work in progress...
6 2018-01-05 05:00:00 Message received.
7 2018-01-05 11:00:00 Sending...
8 2018-01-05 17:00:00 Sending...
9 2018-01-06 02:00:00 Work in progress...
10 2018-01-06 14:00:00 Message received.
11 2018-01-07 07:00:00 Sending...
12 2018-01-07 20:00:00 Sending...
13 2018-01-08 01:00:00 Sending...
14 2018-01-08 02:00:00 Work in progress...
17 2018-01-10 03:00:00 Message received.
18 2018-01-10 09:00:00 Sending...
19 2018-01-10 14:00:00 Sending...
I have tried solutions offered in similar posts, such as:
df['Message'].loc[df['Message'].shift(-1) != df['Message']]
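This mask is not selective, though: it keeps only the last row of every consecutive run, whatever the message is. A quick way to inspect it (a sketch, assuming the df built above):

mask = df['Message'].shift(-1) != df['Message']
print(df.assign(keep=mask))  # 'keep' is False for all but the last row of each run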
I also calculated the length of the Messages:
df['length'] = df['Message'].apply(lambda x: len(x))
and wrote a conditional drop as:
df.loc[(df['length'] ==17) | (df['length'] ==10) | ~df['Message'].duplicated(keep='first')]
It looks better, but now indices 9 and 14 are dropped along with 15 and 16, so it still does not behave as intended; see:
Timestamp Message length
0 2018-01-02 03:00:00 Message received. 17
1 2018-01-02 11:00:00 Sending... 10
2 2018-01-03 04:00:00 Sending... 10
3 2018-01-04 11:00:00 Sending... 10
4 2018-01-04 16:00:00 Work in progress... 19
6 2018-01-05 05:00:00 Message received. 17
7 2018-01-05 11:00:00 Sending... 10
8 2018-01-05 17:00:00 Sending... 10
10 2018-01-06 14:00:00 Message received. 17
11 2018-01-07 07:00:00 Sending... 10
12 2018-01-07 20:00:00 Sending... 10
13 2018-01-08 01:00:00 Sending... 10
17 2018-01-10 03:00:00 Message received. 17
18 2018-01-10 09:00:00 Sending... 10
19 2018-01-10 14:00:00 Sending... 10
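The reason is that duplicated(keep='first') flags every later occurrence of a message, not just consecutive repeats. A quick check (a sketch on the df above):

wip = df['Message'].eq('Work in progress...')
print(df.loc[wip, 'Message'].duplicated(keep='first'))
# index 4 is False (first occurrence); 5, 9, 14, 15 and 16 are all True,
# which is why rows 9 and 14 disappear even though they start new runs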
Your time and help are appreciated!
First build a mask that keeps the first row of each consecutive run by comparing the column with Series.shift, then chain it with | to a second mask that keeps every row that is not 'Work in progress...':
df = df[(df['Message'].shift() != df['Message']) | (df['Message'] != 'Work in progress...')]
print (df)
Timestamp Message
0 2018-01-02 03:00:00 Message received.
1 2018-01-02 11:00:00 Sending...
2 2018-01-03 04:00:00 Sending...
3 2018-01-04 11:00:00 Sending...
4 2018-01-04 16:00:00 Work in progress...
6 2018-01-05 05:00:00 Message received.
7 2018-01-05 11:00:00 Sending...
8 2018-01-05 17:00:00 Sending...
9 2018-01-06 02:00:00 Work in progress...
10 2018-01-06 14:00:00 Message received.
11 2018-01-07 07:00:00 Sending...
12 2018-01-07 20:00:00 Sending...
13 2018-01-08 01:00:00 Sending...
14 2018-01-08 02:00:00 Work in progress...
17 2018-01-10 03:00:00 Message received.
18 2018-01-10 09:00:00 Sending...
19 2018-01-10 14:00:00 Sending...
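For readability, the same condition can also be built from two named masks (a sketch, starting again from the original df; the result is identical to the one-liner above):

starts_new_run = df['Message'].ne(df['Message'].shift())   # True where the value changes
not_wip = df['Message'].ne('Work in progress...')          # True for every other message
out = df[starts_new_run | not_wip]

Only consecutive repeats of 'Work in progress...' fail both conditions, so exactly rows 5, 15 and 16 are removed.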