mattermost/server/enterprise/message_export/shared/export_data.go
Christopher Poile aba4434dab
MM-59966 - Compliance Export overhaul - feature branch (#29789)
* [MM-59089] Add a compliance export constant (#27919)

* add a useful constant

* i18n

* another constant

* another i18n

* [MM-60422] Add GetChannelsWithActivityDuring (#28301)

* modify GetUsersInChannelDuring to accept a slice of channelIds

* add GetChannelsWithActivityDuring

* add compliance export progress message; remove unused custom status

* linting

* tests running too fast

* add batch size config settings

* add store tests

* linting

* empty commit

* i18n changes

* fix i18n ordering

* MM-60570 - Server-side changes consolidating the export CLI with server/ent code (#28640)

* add an i18n field; add the CLI's export directory

* int64 -> int

* Add UntilUpdateAt for MessageExport and AnalyticsPostCount

to merge

* remove now-unused i18n strings

* add TranslationsPreInitFromBuffer to allow CLI to use i18n

* use GetBuilder to simplify; rename TranslationsPreInitFromFileBytes

* [MM-59089] Improve compliance export timings (#1733 - Enterprise repo)

* MM-60422 - Performance and logic fixes for Compliance Exports (#1757 - Enterprise repo)

* MM-60570 - Enterprise-side changes consolidating the export CLI with server/ent code (#1769 - Enterprise repo)

* merge conflicts; missed file from ent branch

* MM-61038 - Add an option to sqlstore.New (#28702)

* remove useless comment

* add test

* add an option to sqlstore.New

* MM-60976: Remove RunExport command from Mattermost binary (#28805)

* remove RunExport command from mattermost binary

* remove the code it was calling

* fix i18n

* remove test (was only testing license, not functionality)

* empty commit

* fix flaky GetChannelsWithActivityDuring test

* MM-60063: Dedicated Export Filestore fix, redo of #1772 (enterprise) (#28803)

* redo filestore fix #1772 (enterprise repo) on top of MM-59966 feature

* add new e2e tests for export filestore

* golint

* ok, note to self: shadowing bad, actually (when there's a defer)

* empty commit

* MM-61137 - Message export: Support 7.8.11 era dbs (#28824)

* support 7.8.11 era dbs by wrapping the store using only what we need

* fix flaky GetChannelsWithActivityDuring test

* add a comment

* only need to define the MEFileInfoStore (the one that'll be overridden)

* blank commit

* MM-60974 - Message Export: Add performance metrics (#28836)

* performance metrics

* cleanup unneeded named returns

* blank commit

* MM-60975 - Message export: Add startTime and endTime to export folder name (#28840)

* output startTime and endTime in export folder

* empty commit

* merge conflict

* MM-60978 - Message export: Improve xml fields; fix delete semantics (#28873)

* add xml fields, omit when empty, tests

* fix delete semantics; test (and test for update semantics)

* clarify comments

* simplify edited post detection, now there's no edge case.

* add some spacing to help fast running tests

* merge conflicts/updates needed for new deleted post semantics

* linting; fixing tests from upstream merge

* use SafeDereference

* linting

* stronger typing; better wrapped errors; better formatting

* blank commit

* goimports formatting

* fix merge mistake

* minor fixes due to changes in master

* MM-61755 - Simplifying and Support reporting to the db from the CLI (#29281)

* finally clean up JobData struct and stringMap; prep for CLI using db

* and now simplify using StringMapToJobDataWithZeroValues

* remove unused fn

* create JobDataExported; clean up errors

* MM-60176 - Message Export: Global relay cleanup (#29168)

* move global relay logic into global_relay_export

* blank commit

* blank commit

* improve errors

* MM-60693 - Refactor CSV to use same codepath as Actiance (#29191)

* refactor (and simplify) ExportParams into shared

* remove unused fn

* csv now uses pre-calculated joins/leaves like actiance

* remove nil post check; remove ignoredPosts metric

* remove unneeded copy

* MM-61696 - Refactor GlobalRelay to use same codepath as Actiance (#29225)

* remove newly unneeded function and its test. goodbye.

* refactor GetPostAttachments for csv + global relay to share

* refactor global_relay_export and fix tests (no changes to output)

* remove unneeded nil check

* PR comments

* MM-61715 - Generalize e2e to all export types 🤖  (#29369)

* refactor isDeletedMsg for all export types

* fix start and endtime, nasty csv createAt bug; bring closer to Actiance

* align unit tests with new logic (e.g. starttime / endtime)

* refactor a TimestampConvert fn for code + tests

* bug: pass templates to global relay (hurray for e2e tests, otherwise...)

* add global relay zip to allowed list (only for tests)

* test helpers

* new templates for e2e tests

* e2e tests... phew.

* linting

* merge conflicts

* unexport PostToRow; add test helper marker

* cleanup, shortening, thanks to PR comments

* MM-61972 - Generalize export data path - Actiance (#29399)

* extract and generalize the export data generation functions

* finish moving test (bc of previous extraction)

* lift a function from common -> shared (to break an import cycle)

* actiance now takes general export data, processes it into actiance data

* bring tests in line with correct sorting rules (upadateAt, messageId)

* fixups, PR comments

* turn strings.Repeat into a more descriptive const

amended: one letter fix; bad rebase

* MM-62009 - e2e clock heisenbug (#29434)

* consolidate assertions; output debuggable diffs (keeping for future)

* refactor test output generator to generators file

* waitUntilZeroPosts + pass through until to job = fix all clock issues

* simplify messages to model.NewId(); remove unneeded waitUntilZeroPosts

* model.NewId() -> storetest.NewTestID()

* MM-61980 - Generalize export data path - CSV (#29482)

* simple refactoring

* increase sleep times for (very) rare test failures

* add extra information to the generic export for CSV

* adj Actiance to handle new generic export (no difference in its output)

* no longer need mergePosts (yay), move getJoinLeavePosts for everyone

* adjust tests for new csv semantics (detailed in summary)

* and need to add the new exported data to the export_data_tests

* rearrange csv writing to happen after data export (more logical)

* linting

* remove debug statements

* figured out what was wrong with global relay e2e test 3; solid now

* PR comments

* MM-61718 - Generalize export data path - Global Relay (#29508)

* move global relay over to using the generalized export data

* performance pass -- not much can be done

* Update server/enterprise/message_export/global_relay_export/global_relay_export.go

Co-authored-by: Claudio Costa <cstcld91@gmail.com>

---------

Co-authored-by: Claudio Costa <cstcld91@gmail.com>

* MM-62058 - Align CSV with Actiance (#29551)

* refactoring actiance files and var names for clarity

* bug found in exported attachments (we used to miss some start/ends)

* changes needed for actiance due to new generic exports

* bringing CSV up to actiance standards

* fixing global relay b/c of new semantics (adding a note on an edge case)

* aligning e2e tests, adding comments to clarify what is expected/tested

* necessary changes; 1 more test for added functionality (ignoreDeleted)

* comment style

* MM-62059 - Align Global Relay with Actiance/CSV; many fixes (#29665)

* core logic changes to general export_data and the specific export paths

* unit tests and e2e tests, covering all new edge cases and all logic

* linting

* better var naming, const value, and cleaning up functions calls

* MM-62436 - Temporarily skip cypress tests that require download link (#29772)

---------

Co-authored-by: Claudio Costa <cstcld91@gmail.com>
2025-01-10 16:56:02 -05:00


// Copyright (c) 2015-present Mattermost, Inc. All Rights Reserved.
// See LICENSE.enterprise for license information.

package shared

import (
	"sort"

	"github.com/mattermost/mattermost/server/public/model"
)
type UserType string

const (
	User UserType = "user"
	Bot  UserType = "bot"
)
type PostExport struct {
	model.MessageExport          // the MessageExport that this PostExport is providing more information for
	UserType            UserType // the type of the person that sent the post: "user" or "bot"

	// UpdatedType allows us to differentiate between:
	//  - "EditedOriginalMsg": the newly created message (new Id), which holds the pre-edited message contents. The
	//    "EditedNewMsgId" field will point to the message (original Id) which has the post-edited message content.
	//  - "EditedNewMsg": the post-edited message content. This is confusing, so be careful: in the db, this EditedNewMsg
	//    is actually the original messageId because we wanted an edited message to have the same messageId as the
	//    pre-edited message. But for the purposes of exporting and to keep the mental model clear for end-users, we are
	//    calling this the EditedNewMsg and EditedNewMsgId, because this will hold the NEW post-edited message contents,
	//    and that's what's important to the end-user viewing the export.
	//  - "UpdatedNoMsgChange": the message content hasn't changed, but the post was updated for some reason (reaction,
	//    replied-to, a reply was edited, a reply was deleted (as of 10.2), perhaps other reasons)
	//  - "Deleted": the message was deleted.
	//  - "FileDeleted": this message is recording that a file was deleted.
	UpdatedType PostUpdatedType
	UpdateAt    int64 // if this is an updated post, this is the updated time (same as deleted time for deleted posts).

	// EditedNewMsgId: when a message is edited, the EditedOriginalMsg points to the message Id that now has the newly
	// edited message.
	EditedNewMsgId    string
	Message           string                   // the text body of the post
	PreviewsPost      string                   // the post id of the post that is previewed by the permalink preview feature
	AttachmentCreates []*FileUploadStartExport // the post's attachments that were uploaded this export period
	AttachmentDeletes []PostExport             // the post's attachments that were deleted
	FileInfo          *model.FileInfo          // if this was a file PostExport, FileInfo will contain that info. Otherwise, nil.
}
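// For illustration only (not part of the original file): a sketch of how a single edit surfaces as two PostExports,
// assuming a post "abc" whose text is edited from "helo" to "hello". The pre-edit copy gets a new Id and points at
// the Id that now carries the edited contents:
//
//	PostExport{UpdatedType: EditedOriginalMsg, Message: "helo", EditedNewMsgId: "abc"} // new Id, pre-edit text
//	PostExport{UpdatedType: EditedNewMsg, Message: "hello"}                            // original Id "abc", new text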
type FileUploadStartExport struct {
	model.MessageExport        // the post that this upload was attached to
	UserEmail           string // the email of the person that sent the file
	UploadStartTime     int64  // utc timestamp (seconds), time at which the user started the upload. Example: 1366611728
	FileInfo            *model.FileInfo
}

type FileUploadStopExport struct {
	model.MessageExport        // the post that this upload was attached to
	UserEmail           string // the email of the person that sent the file
	UploadStopTime      int64  // utc timestamp (seconds), time at which the user finished the upload. Example: 1366611728
	Status              string // set to either "Completed" or "Failed" depending on the outcome of the upload operation
	FileInfo            *model.FileInfo
}
type ChannelExport struct {
	ChannelId    string
	ChannelType  model.ChannelType
	ChannelName  string
	DisplayName  string
	StartTime    int64 // utc timestamp (milliseconds), start of export period or create time of channel, whichever is greater.
	EndTime      int64 // utc timestamp (milliseconds), end of export period or delete time of channel, whichever is lesser.
	Posts        []PostExport
	Files        []*model.FileInfo
	DeletedFiles []PostExport
	UploadStarts []*FileUploadStartExport
	UploadStops  []*FileUploadStopExport
	JoinEvents   []JoinExport  // start with a list of all users who were present in the channel during the export period
	LeaveEvents  []LeaveExport // finish with a list of all users who were present in the channel during the export period

	// Used by csv, ignored by others
	TeamId          string
	TeamName        string
	TeamDisplayName string
}
type JoinExport struct {
	UserId    string
	Username  string
	UserEmail string   // the email of the person that joined the channel
	UserType  UserType // the type of the user that joined the channel
	JoinTime  int64    // utc timestamp (seconds), time at which the user joined. Example: 1366611728

	// LeaveTime is when the user left (or batch endTime if they didn't leave). Only used by GlobalRelay.
	LeaveTime int64
}

type LeaveExport struct {
	UserId    string
	Username  string
	UserEmail string   // the email of the person that left the channel
	UserType  UserType // the type of the user that left the channel
	LeaveTime int64    // utc timestamp (seconds), time at which the user left. Example: 1366611728

	// ClosedOut indicates this is a "leave" event created by closing out the channel at the end of an export period.
	// Actiance requires all users to be closed out at the end of an export period (each join has a matching leave).
	ClosedOut bool
}
type GenericExportData struct {
	Exports  []ChannelExport
	Metadata Metadata
	Results  RunExportResults
}
// GetGenericExportData assembles all the data in an exportType-agnostic way. Each exportType will process this data
// into the specific format it needs to export.
func GetGenericExportData(p ExportParams) (GenericExportData, error) {
	// postAuthorsByChannel is a map so that we don't store duplicate authors
	postAuthorsByChannel := make(map[string]map[string]ChannelMember)
	metadata := Metadata{
		Channels:         p.ChannelMetadata,
		MessagesCount:    0,
		AttachmentsCount: 0,
		StartTime:        p.BatchStartTime,
		EndTime:          p.BatchEndTime,
	}
	var results RunExportResults

	channelsInThisBatch := make(map[string]bool)
	postsByChannel := make(map[string][]PostExport)
	filesByChannel := make(map[string][]*model.FileInfo)
	uploadStartsByChannel := make(map[string][]*FileUploadStartExport)
	uploadStopsByChannel := make(map[string][]*FileUploadStopExport)
	deletedFilesByChannel := make(map[string][]PostExport)

	processPostAttachments := func(post *model.MessageExport, postExport PostExport, originalPostThatWillBeDeletedLater bool) error {
		// originalPostThatWillBeDeletedLater means we are recording this message's original file starts and stops,
		// before it was deleted (we'll record the deletion in the next call to this function).
		//
		// NOTE: there is an edge case here: the original post is not deleted but the attachment has been deleted.
		// See the note in the postToAttachmentsEntries func.
		channelId := *post.ChannelId
		uploadedFiles, startUploads, stopUploads, deleteFileMessages, err :=
			postToAttachmentsEntries(post, p.Db, originalPostThatWillBeDeletedLater)
		if err != nil {
			return err
		}
		uploadStartsByChannel[channelId] = append(uploadStartsByChannel[channelId], startUploads...)
		uploadStopsByChannel[channelId] = append(uploadStopsByChannel[channelId], stopUploads...)
		deletedFilesByChannel[channelId] = append(deletedFilesByChannel[channelId], deleteFileMessages...)
		filesByChannel[channelId] = append(filesByChannel[channelId], uploadedFiles...)

		postExport.AttachmentCreates = startUploads
		postExport.AttachmentDeletes = deleteFileMessages
		postsByChannel[channelId] = append(postsByChannel[channelId], postExport)

		results.UploadedFiles += len(startUploads)
		results.DeletedFiles += len(deleteFileMessages)

		// only count uploaded files (not deleted files)
		if err := metadata.UpdateCounts(channelId, 1, len(startUploads)); err != nil {
			return err
		}
		return nil
	}

	for _, post := range p.Posts {
		channelId := *post.ChannelId
		channelsInThisBatch[channelId] = true

		// Was the post deleted (not an edited post), and originally posted during the current job window?
		// If so, we need to record it. It may actually belong in an earlier batch, but there's no way to know that
		// before now because of the way we export posts (by updateAt).
		if IsDeletedMsg(post) && !isEditedOriginalMsg(post) && *post.PostCreateAt >= p.JobStartTime {
			results.CreatedPosts++
			postExport := createdPostToExportEntry(post)
			if err := processPostAttachments(post, postExport, true); err != nil {
				return GenericExportData{}, err
			}
		}

		var postExport PostExport
		postExport, results = getPostExport(post, results)
		if err := processPostAttachments(post, postExport, false); err != nil {
			return GenericExportData{}, err
		}

		if _, ok := postAuthorsByChannel[channelId]; !ok {
			postAuthorsByChannel[channelId] = make(map[string]ChannelMember)
		}
		postAuthorsByChannel[channelId][*post.UserId] = ChannelMember{
			UserId:   *post.UserId,
			Email:    *post.UserEmail,
			Username: *post.Username,
			IsBot:    post.IsBot,
		}
	}

	// If the channel is not in channelsInThisBatch (i.e. it didn't have a post), we need to check whether it had
	// user activity between this batch's startTime and endTime. If so, add it to channelsInThisBatch.
	for id := range p.ChannelMetadata {
		if !channelsInThisBatch[id] {
			if ChannelHasActivity(p.ChannelMemberHistories[id], p.BatchStartTime, p.BatchEndTime) {
				channelsInThisBatch[id] = true
			}
		}
	}

	// Build the channel exports for the channels that had post or user join/leave activity this batch.
	channelExports := make([]ChannelExport, 0, len(channelsInThisBatch))
	for id := range channelsInThisBatch {
		c := metadata.Channels[id]
		joinEvents, leaveEvents := getJoinsAndLeaves(p.BatchStartTime, p.BatchEndTime,
			p.ChannelMemberHistories[id], postAuthorsByChannel[id])

		// We don't have teamName and teamDisplayName from the channelMetadata, but we have them from MessageExport.
		// However, if we don't have posts for this channel (only joins and leaves), then we don't have them at all.
		var teamName, teamDisplayName string
		if posts, ok := postsByChannel[id]; ok {
			if len(posts) > 0 {
				teamName = model.SafeDereference(posts[0].TeamName)
				teamDisplayName = model.SafeDereference(posts[0].TeamDisplayName)
			}
		}

		channelExports = append(channelExports, ChannelExport{
			ChannelId:       c.ChannelId,
			ChannelType:     c.ChannelType,
			ChannelName:     c.ChannelName,
			DisplayName:     c.ChannelDisplayName,
			StartTime:       p.BatchStartTime,
			EndTime:         p.BatchEndTime,
			Posts:           postsByChannel[id],
			Files:           filesByChannel[id],
			DeletedFiles:    deletedFilesByChannel[id],
			UploadStarts:    uploadStartsByChannel[id],
			UploadStops:     uploadStopsByChannel[id],
			JoinEvents:      joinEvents,
			LeaveEvents:     leaveEvents,
			TeamId:          model.SafeDereference(c.TeamId),
			TeamName:        teamName,
			TeamDisplayName: teamDisplayName,
		})
		results.Joins += len(joinEvents)
		results.Leaves += len(leaveEvents)
	}

	return GenericExportData{channelExports, metadata, results}, nil
}
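// For illustration only (not part of the original file): a minimal sketch of how an export type might consume
// GenericExportData; writeChannel is a hypothetical helper standing in for the real Actiance/CSV/GlobalRelay
// writers, which live in their own packages.
//
//	data, err := GetGenericExportData(params)
//	if err != nil {
//		return err
//	}
//	for _, channel := range data.Exports {
//		// each export type renders Posts, Join/LeaveEvents, and upload events into its own format
//		if err := writeChannel(channel); err != nil { // writeChannel is hypothetical
//			return err
//		}
//	}
//	// data.Results carries aggregate counts (created/edited/deleted posts, joins, leaves, files) for job reporting.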
// postToAttachmentsEntries returns every fileInfo as uploadedFiles. It also adds each file into the lists:
// startUploads, stopUploads, and deleteFileMessages (for ActianceExport).
// If ignoreDeleted = true, do not export the deleted entries (only the upload starts and stops); the deletions will
// be recorded when the deleted version of the post is processed.
func postToAttachmentsEntries(post *model.MessageExport, db MessageExportStore, ignoreDeleted bool) (
	uploadedFiles []*model.FileInfo, startUploads []*FileUploadStartExport, stopUploads []*FileUploadStopExport,
	deleteFileMessages []PostExport, err error) {
	// if the post included any files, we need to add special elements to the export.
	//
	// NOTE: there is an edge case here: the original post is not deleted but the attachment has been deleted.
	//  1. If the attachment is deleted sometime later in the future but the post has not been updated
	//     (updateAt == createAt), then we won't ever get here; there's nothing we can do.
	//  2. If the attachment is deleted within this export period, or the post is updated this export period and we
	//     now see that the attachment is deleted, then we have to output 2 files here:
	//     a) the original file attachment
	//     b) the deleted file attachment
	//     These have to be added to the start/stop lists and to deleteFileMessages.
	if len(post.PostFileIds) == 0 {
		return
	}
	uploadedFiles, err = db.FileInfo().GetForPost(*post.PostId, true, true, false)
	if err != nil {
		return
	}

	for _, fileInfo := range uploadedFiles {
		if fileInfo.DeleteAt > 0 && !ignoreDeleted {
			deleteFileMessages = append(deleteFileMessages, deleteFileToExportEntry(post, fileInfo))

			// this was a deleted file, so do not record its start and stop. If the original message was sent in this
			// batch, the file transfer will have been exported earlier, when the original message was exported.
			//
			// However, because of the edge case above, we still need to record start and stop if the post itself is
			// not deleted.
			if IsDeletedMsg(post) {
				continue
			}
			// not deleted, so we need to add start and stop below.
		}

		// insert a record of the file upload into the export file.
		// The path to the exported file is relative to the fileAttachmentFilestore root,
		// which could be different from the exportFilestore root.
		startUploads = append(startUploads, &FileUploadStartExport{
			MessageExport:   *post,
			UserEmail:       *post.UserEmail,
			UploadStartTime: *post.PostCreateAt,
			FileInfo:        fileInfo,
		})
		stopUploads = append(stopUploads, &FileUploadStopExport{
			MessageExport:  *post,
			UserEmail:      *post.UserEmail,
			UploadStopTime: *post.PostCreateAt,
			Status:         "Completed",
			FileInfo:       fileInfo,
		})
	}
	return
}
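// For illustration only (not part of the original file): the edge case above in concrete terms. Assuming a post that
// is itself not deleted, carrying one attachment whose DeleteAt > 0, a call with ignoreDeleted = false returns the
// file in all three places:
//
//	uploadedFiles            -> [fileInfo]
//	deleteFileMessages       -> [FileDeleted entry]   // records the deletion
//	startUploads/stopUploads -> [start], [stop]       // the post is not deleted, so the transfer is still recorded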
func getPostExport(post *model.MessageExport, results RunExportResults) (PostExport, RunExportResults) {
	// We have three "kinds" of posts (using "1" and "2" for simplicity):
	//  - created: Id = new, CreateAt = 1, UpdateAt = 1, DeleteAt = 0
	//  - deleted: Id = orig, CreateAt = orig, UpdateAt = 2, DeleteAt = 2, props: deleteBy
	//  - edited:  old post gets "created": Id = new, CreateAt = 1, UpdateAt = 2, DeleteAt = 2, originalId: orig
	//             existing post modified:  Id = orig, CreateAt = 1, UpdateAt = 2, DeleteAt = 0
	//
	// We also have other ways for a post to be updated:
	//  - a root post in a thread is replied to, a reply is edited, or (as of 10.2) a reply is deleted
	if isEditedOriginalMsg(post) {
		// Post has been edited. This is the original message.
		results.EditedOrigMsgPosts++
		return editedOriginalMsgToExportEntry(post), results
	} else if IsDeletedMsg(post) {
		// Post is deleted
		results.DeletedPosts++
		return deletedPostToExportEntry(post, "delete "+*post.PostMessage), results
	} else if *post.PostUpdateAt > *post.PostCreateAt {
		// Post has been updated. But what kind?
		if model.SafeDereference(post.PostEditAt) > 0 {
			// This is an edited post.
			results.EditedNewMsgPosts++
			return editedNewMsgToExportEntry(post), results
		}
		// This is just an updated post (e.g. reaction)
		results.UpdatedPosts++
		return updatedPostToExportEntry(post), results
	}

	// Post is newly created:
	//   *post.PostCreateAt == *post.PostUpdateAt && (post.PostDeleteAt == nil || *post.PostDeleteAt == 0)
	// but we also fall back to this in case there is missing data, which is better than not exporting anything.
	results.CreatedPosts++
	return createdPostToExportEntry(post), results
}
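// For illustration only (not part of the original file): how the kinds above map onto a concrete timeline, assuming
// post "abc" is created at t=1 and edited at t=2, and both resulting rows arrive in this batch:
//
//	t=1  Id="abc", CreateAt=1, UpdateAt=1, DeleteAt=0                    -> createdPostToExportEntry
//	t=2  Id="xyz", CreateAt=1, UpdateAt=2, DeleteAt=2, OriginalId="abc"  -> editedOriginalMsgToExportEntry (pre-edit copy)
//	t=2  Id="abc", CreateAt=1, UpdateAt=2, DeleteAt=0, EditAt=2          -> editedNewMsgToExportEntry (new contents)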
// getJoinsAndLeaves converts the channel member history and post authors into sorted JoinExport and LeaveExport
// slices for the batch window, synthesizing a close-out leave for every user still in the channel at endTime.
func getJoinsAndLeaves(startTime int64, endTime int64, channelMembersHistory []*model.ChannelMemberHistoryResult,
	postAuthors map[string]ChannelMember) ([]JoinExport, []LeaveExport) {
	var leaveEvents []LeaveExport
	joins, leaves := GetJoinsAndLeavesForChannel(startTime, endTime, channelMembersHistory, postAuthors)
	joinsById := make(map[string]JoinExport, len(joins))

	type StillMemberInfo struct {
		time     int64
		userType UserType
		userId   string
		username string
	}
	stillMember := map[string]StillMemberInfo{}

	for _, join := range joins {
		userType := User
		if join.IsBot {
			userType = Bot
		}
		joinsById[join.UserId] = JoinExport{
			UserId:    join.UserId,
			Username:  join.Username,
			UserEmail: join.Email,
			JoinTime:  join.Datetime,
			UserType:  userType,
			LeaveTime: endTime,
		}
		// keep the latest join per email
		if value, ok := stillMember[join.Email]; !ok || join.Datetime > value.time {
			stillMember[join.Email] = StillMemberInfo{time: join.Datetime, userType: userType, userId: join.UserId, username: join.Username}
		}
	}

	for _, leave := range leaves {
		userType := User
		if leave.IsBot {
			userType = Bot
		}
		leaveEvents = append(leaveEvents, LeaveExport{
			UserId:    leave.UserId,
			Username:  leave.Username,
			UserEmail: leave.Email,
			LeaveTime: leave.Datetime,
			UserType:  userType,
		})
		if leave.Datetime > stillMember[leave.Email].time {
			delete(stillMember, leave.Email)
		}

		// record their leave in their initial join
		if join, ok := joinsById[leave.UserId]; ok {
			join.LeaveTime = leave.Datetime
			joinsById[leave.UserId] = join
		}
	}

	// Close out the channel for Actiance (each join must have a matching leave).
	for email := range stillMember {
		leaveEvents = append(leaveEvents, LeaveExport{
			UserId:    stillMember[email].userId,
			Username:  stillMember[email].username,
			LeaveTime: endTime,
			UserEmail: email,
			UserType:  stillMember[email].userType,
			ClosedOut: true,
		})
	}

	joinEvents := make([]JoinExport, 0, len(joinsById))
	for _, v := range joinsById {
		joinEvents = append(joinEvents, v)
	}
	sort.Slice(joinEvents, func(i, j int) bool {
		if joinEvents[i].JoinTime == joinEvents[j].JoinTime {
			return joinEvents[i].UserEmail < joinEvents[j].UserEmail
		}
		return joinEvents[i].JoinTime < joinEvents[j].JoinTime
	})
	sort.Slice(leaveEvents, func(i, j int) bool {
		if leaveEvents[i].LeaveTime == leaveEvents[j].LeaveTime {
			return leaveEvents[i].UserEmail < leaveEvents[j].UserEmail
		}
		return leaveEvents[i].LeaveTime < leaveEvents[j].LeaveTime
	})
	return joinEvents, leaveEvents
}
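// For illustration only (not part of the original file): given a batch window [1000, 2000] and a user who joins at
// 1200 and never leaves, getJoinsAndLeaves yields:
//
//	JoinExport{JoinTime: 1200, LeaveTime: 2000}      // LeaveTime defaults to the batch endTime
//	LeaveExport{LeaveTime: 2000, ClosedOut: true}    // synthetic close-out so every Actiance join has a leave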
// isEditedOriginalMsg reports whether post is the archived pre-edit copy of an edited message: it has a DeleteAt and
// an OriginalId pointing to the post that now carries the edited contents.
func isEditedOriginalMsg(post *model.MessageExport) bool {
	return model.SafeDereference(post.PostDeleteAt) > 0 && model.SafeDereference(post.PostOriginalId) != ""
}

func createdPostToExportEntry(post *model.MessageExport) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: *post,
		Message:       *post.PostMessage,
		UserType:      userType,
		PreviewsPost:  post.PreviewID(),
	}
}

func deletedPostToExportEntry(post *model.MessageExport, newMsg string) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: *post,
		UpdateAt:      *post.PostDeleteAt,
		UpdatedType:   Deleted,
		Message:       newMsg,
		UserType:      userType,
		PreviewsPost:  post.PreviewID(),
	}
}

func editedOriginalMsgToExportEntry(post *model.MessageExport) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport:  *post,
		UpdateAt:       *post.PostUpdateAt,
		UpdatedType:    EditedOriginalMsg,
		Message:        *post.PostMessage,
		UserType:       userType,
		PreviewsPost:   post.PreviewID(),
		EditedNewMsgId: *post.PostOriginalId,
	}
}

func editedNewMsgToExportEntry(post *model.MessageExport) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: *post,
		UpdateAt:      *post.PostUpdateAt,
		UpdatedType:   EditedNewMsg,
		Message:       *post.PostMessage,
		UserType:      userType,
		PreviewsPost:  post.PreviewID(),
	}
}

func updatedPostToExportEntry(post *model.MessageExport) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: *post,
		UpdateAt:      *post.PostUpdateAt,
		UpdatedType:   UpdatedNoMsgChange,
		Message:       *post.PostMessage,
		UserType:      userType,
		PreviewsPost:  post.PreviewID(),
	}
}

func deleteFileToExportEntry(post *model.MessageExport, fileInfo *model.FileInfo) PostExport {
	userType := User
	if post.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: *post,
		UpdateAt:      fileInfo.DeleteAt,
		UpdatedType:   FileDeleted,
		Message:       "delete " + fileInfo.Path,
		UserType:      userType,
		PreviewsPost:  post.PreviewID(),
		FileInfo:      fileInfo,
	}
}

// UploadStartToExportEntry converts a FileUploadStartExport into a PostExport representing the file upload.
func UploadStartToExportEntry(u *FileUploadStartExport) PostExport {
	userType := User
	if u.IsBot {
		userType = Bot
	}
	return PostExport{
		MessageExport: u.MessageExport,
		UpdateAt:      u.FileInfo.UpdateAt,
		UserType:      userType,
		FileInfo:      u.FileInfo,
	}
}