It's been about half a year since I last wrote about Motion Matching in Unreal Engine. With the recent release of UE5.4, there have been significant updates to Motion Matching. In this post we will delve into the Motion Matching source code in UE5.4; once the official demo project is released, we will analyze its contents further. Because this material is complex, I'll focus on explaining the fundamental algorithms and skip over the code that exists purely for performance optimization. Please forgive me for this.

This post doesn't focus on how to use Motion Matching. If you know nothing about the Pose Search plugin yet, check this tutorial first:

Introduction To Motion Matching (UE 5.4) | Community tutorial
A short 13 minute primer to getting very basic motion matching working in Unreal Engine 5.4 while we all wait for that awesome sample project with the 5...
dev.epicgames.com

0. Preliminary knowledge

It's recommended to read the previous articles in this series to understand the mathematical principles of Motion Matching. At least you should read:

Motion Matching in Unreal Engine 5(2) : Motion Matching in For Honor – the Walled Garden
jiahaoli.org

The mathematical principle in short:

Motion matching requires a feature vector, which contains the elements we care about. For example, if we care about the rotation and the 2D velocity (ignoring the Z-axis component) of the character's feet, we first create a feature vector with these elements:

V_{feature} =
\begin{bmatrix}
r_{foot_{lx}} \\
r_{foot_{ly}} \\
r_{foot_{lz}} \\
r_{foot_{rx}} \\
r_{foot_{ry}} \\
r_{foot_{rz}} \\
v_{foot_{lx}} \\
v_{foot_{ly}} \\
v_{foot_{rx}} \\
v_{foot_{ry}} \\
\end{bmatrix}

A classical way to implement motion matching is to build a matrix from every frame in the animation database: each column of the matrix is the feature vector of one frame. In the following expression, f_1 means frame 1.

M_{animation-database} =
\begin{bmatrix}
V_{f_1} & V_{f_2} & \dots & V_{f_n}
\end{bmatrix} \\ 
=
\begin{bmatrix}
r_{foot_{lx-f_1}} & r_{foot_{lx-f_2}} & \dots & r_{foot_{lx-f_n}}\\
r_{foot_{ly-f_1}} & r_{foot_{ly-f_2}} & \dots & r_{foot_{ly-f_n}}\\
r_{foot_{lz-f_1}} & r_{foot_{lz-f_2}} & \dots & r_{foot_{lz-f_n}}\\
r_{foot_{rx-f_1}} & r_{foot_{rx-f_2}} & \dots & r_{foot_{rx-f_n}}\\
r_{foot_{ry-f_1}} & r_{foot_{ry-f_2}} & \dots & r_{foot_{ry-f_n}}\\
r_{foot_{rz-f_1}} & r_{foot_{rz-f_2}} & \dots & r_{foot_{rz-f_n}}\\
v_{foot_{lx-f_1}} & v_{foot_{lx-f_2}} & \dots & v_{foot_{lx-f_n}}\\
v_{foot_{ly-f_1}} & v_{foot_{ly-f_2}} & \dots & v_{foot_{ly-f_n}}\\
v_{foot_{rx-f_1}} & v_{foot_{rx-f_2}} & \dots & v_{foot_{rx-f_n}}\\
v_{foot_{ry-f_1}} & v_{foot_{ry-f_2}} & \dots & v_{foot_{ry-f_n}}\\
\end{bmatrix}

For each gameplay frame, we build a query vector V_{query} from the character's current state and find the column of the matrix that is closest to it:

i = \arg\min_{j \in \{1, 2, \ldots, n\}} \left\| V_{f_j} - V_{query} \right\|

The animation frame corresponding to column i of the matrix is then chosen to be played on this gameplay frame.
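To make the search concrete, here is a minimal brute-force sketch in plain C++. The names (FeatureMatrix, Query, FindBestFrame) are illustrative and not part of the Pose Search plugin: the matrix is stored flat, one column per frame, and we simply pick the column with the smallest squared distance to the query.

#include <cstddef>
#include <limits>
#include <vector>

// Brute-force nearest-neighbor search over a column-per-frame feature matrix.
// Values are stored flat: frame j occupies [j * Cardinality, (j + 1) * Cardinality).
int FindBestFrame(const std::vector<float>& FeatureMatrix, const std::vector<float>& Query, std::size_t Cardinality)
{
	const std::size_t NumFrames = FeatureMatrix.size() / Cardinality;
	int BestFrame = -1;
	float BestCost = std::numeric_limits<float>::max();

	for (std::size_t Frame = 0; Frame < NumFrames; ++Frame)
	{
		// Squared Euclidean distance between the query and this frame's feature vector
		float Cost = 0.f;
		for (std::size_t i = 0; i < Cardinality; ++i)
		{
			const float Diff = FeatureMatrix[Frame * Cardinality + i] - Query[i];
			Cost += Diff * Diff;
		}

		if (Cost < BestCost)
		{
			BestCost = Cost;
			BestFrame = static_cast<int>(Frame);
		}
	}
	return BestFrame; // index i of the winning column / animation frame
}
C++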

1. Overview

We start with a top-down approach. UE5.4 provides many ways to use Motion Matching, but the most familiar one is the animation node in the Animation Blueprint.

PoseSearchLibrary.h provides three different methods for executing the core algorithm of Motion Matching. For now, the latter two see little use. New in UE5.4, Pose Search also allows us to update multiple characters in one call: that is MotionMatchMulti().

PoseSearchLibrary.h
/**
	* Implementation of the core motion matching algorithm
	*
	* @param Context						Input animation update context providing access to the proxy and delta time
	* @param Databases						Input array of databases to search
	* @param BlendTime						Input time in seconds to blend out to the new pose. Uses either inertial blending, requiring an Inertialization node after this node, or the internal blend stack, if MaxActiveBlends is greater than zero.
	* @param MaxActiveBlends				Input number of max active animation segments being blended together in the blend stack. If MaxActiveBlends is zero then the blend stack is disabled.
	* @param PoseJumpThresholdTime			Input don't jump to poses of the same segment that are within the interval this many seconds away from the continuing pose.
	* @param PoseReselectHistory			Input prevent re-selection of poses that have been selected previously within this much time (in seconds) in the past. This is across all animation segments that have been selected within this time range.
	* @param SearchThrottleTime				Input minimum amount of time to wait between searching for a new pose segment. It allows users to define how often the system searches, default for locomotion is searching every update, but you may only want to search once for other situations, like jump.
	* @param PlayRate						Input effective range of play rate that can be applied to the animations to account for discrepancies in estimated velocity between the movement modeland the animation.
	* @param InOutMotionMatchingState		Input/Output encapsulated motion matching algorithm and state
	* @param InterruptMode					Input continuing pose search interrupt mode
	* @param bShouldSearch					Input if false search will happen only if there's no valid continuing pose
	* @param bDebugDrawQuery				Input draw the composed query if valid
	* @param bDebugDrawCurResult			Input draw the current result if valid
	*/
	static void UpdateMotionMatchingState(
		const FAnimationUpdateContext& Context,
		const TArray<TObjectPtr<const UPoseSearchDatabase>>& Databases,
		float BlendTime,
		int32 MaxActiveBlends,
		const FFloatInterval& PoseJumpThresholdTime,
		float PoseReselectHistory,
		float SearchThrottleTime,
		const FFloatInterval& PlayRate,
		FMotionMatchingState& InOutMotionMatchingState,
		EPoseSearchInterruptMode InterruptMode = EPoseSearchInterruptMode::DoNotInterrupt,
		bool bShouldSearch = true,
		bool bShouldUseCachedChannelData = true,
		bool bDebugDrawQuery = false,
		bool bDebugDrawCurResult = false);

	/**
	* Implementation of the core motion matching algorithm
	*
	* @param AnimInstance					Input animation instance
	* @param AssetsToSearch					Input assets to search (UPoseSearchDatabase or any animation asset containing UAnimNotifyState_PoseSearchBranchIn)
	* @param PoseHistoryName				Input tag of the associated PoseSearchHistoryCollector node in the anim graph
	* @param Future							Input future properties to match (animation / start time / time offset)
	* @param SelectedAnimation				Output selected animation from the Database asset
	* @param Result							Output FPoseSearchBlueprintResult with the search result
	* @param DebugSessionUniqueIdentifier	Input unique identifier used to identify TraceMotionMatchingState (rewind debugger / pose search debugger) session. Similarly the MM node uses Context.GetCurrentNodeId()
	*/
	UFUNCTION(BlueprintPure, Category = "Animation|Pose Search|Experimental", meta = (BlueprintThreadSafe, Keywords = "PoseMatch"))
	static void MotionMatch(
		UAnimInstance* AnimInstance,
		TArray<UObject*> AssetsToSearch,
		const FName PoseHistoryName,
		FPoseSearchFutureProperties Future,
		FPoseSearchBlueprintResult& Result,
		const int32 DebugSessionUniqueIdentifier = 6174);

	/**
	* Implementation of the core motion matching algorithm for multiple characters
	*
	* @param AnimInstances					Input animation instances
	* @param Roles							Input Roles associated to the animation instances
	* @param AssetsToSearch					Input assets to search (UPoseSearchDatabase or any animation asset containing UAnimNotifyState_PoseSearchBranchIn)
	* @param PoseHistoryName				Input tag of the associated PoseSearchHistoryCollector node in the anim graphs of the AnimInstances
	* @param Result							Output FPoseSearchBlueprintResult with the search result
	* @param DebugSessionUniqueIdentifier	Input unique identifier used to identify TraceMotionMatchingState (rewind debugger / pose search debugger) session. Similarly the MM node uses Context.GetCurrentNodeId()
	*/
	UFUNCTION(BlueprintPure, Category = "Animation|Pose Search|Experimental", meta = (BlueprintThreadSafe, Keywords = "PoseMatch"))
	static void MotionMatchMulti(
		TArray<ACharacter*> AnimInstances,
		TArray<FName> Roles,
		TArray<UObject*> AssetsToSearch,
		const FName PoseHistoryName,
		FPoseSearchBlueprintResult& Result,
		const int32 DebugSessionUniqueIdentifier = 6174);
C++

So in this post we focus on the first one. The simplest way to use it is through the animation node, the same method we used in Unreal Engine 5.3:

Using Motion Matching in Animation Blueprint

Well, behind this seemingly simple usage there are many classes and methods involved. So I've created a simple diagram here to give you a basic impression of this process, and then we'll delve into a detailed discussion of each component in this diagram:

image-2

A simple diagram of the motion matching process. Please open it in a new page if it is too small to read.

2. UPoseSearchSchema

At the application layer, the first thing we need to do to use motion matching is create a UPoseSearchSchema. A PoseSearchSchema tells the system what the feature vector is, which skeleton we are using, and many other important things.

image-4

Create a UPoseSearchSchema asset

image-6

Details of the UPoseSearchSchema asset

The most important property here is Channels, which references an array of UPoseSearchFeatureChannel. Each channel defines what it writes into the feature vector. For example, UPoseSearchFeatureChannel_Velocity writes the linear velocity of the bones we care about into the feature vector.

Fortunately, the purpose of each attribute is extensively explained in the corresponding C++ code, so you can find the explanations in the code I quote below. For things like FSearchIndex, however, we'll need to read the code together to understand how they work.

PoseSearchSchema.h
UCLASS(BlueprintType, Category = "Animation|Pose Search", meta = (DisplayName = "Pose Search Schema"), CollapseCategories)
class POSESEARCH_API UPoseSearchSchema : public UDataAsset
{
	GENERATED_BODY()

public:
	UPROPERTY()
	TObjectPtr<USkeleton> Skeleton_DEPRECATED;
	
	// The update rate at which we sample the animation data in the database. The higher the SampleRate the more refined your searches will be, but the more memory will be required
	UPROPERTY(EditAnywhere, Category = "Schema", meta = (DisplayPriority = 3, ClampMin = "1", ClampMax = "240"))
	int32 SampleRate = 30;

private:
	UPROPERTY(EditAnywhere, Category = "Schema", meta = (DisplayPriority = 0))
	TArray<FPoseSearchRoledSkeleton> Skeletons;

	// Channels itemize the cost breakdown of the Schema in simpler parts such as position or velocity of a bones, or phase of limbs. The total cost of a query against an indexed database pose will be the sum of the combined channel costs
	UPROPERTY(EditAnywhere, Instanced, Category = "Schema")
	TArray<TObjectPtr<UPoseSearchFeatureChannel>> Channels;

	// FinalizedChannels gets populated with UPoseSearchFeatureChannel(s) from Channels and additional injected ones during the Finalize.
	UPROPERTY(Transient)
	TArray<TObjectPtr<UPoseSearchFeatureChannel>> FinalizedChannels;

public:
	UPROPERTY()
	TObjectPtr<UMirrorDataTable> MirrorDataTable_DEPRECATED;

#if WITH_EDITORONLY_DATA
	// Type of operation performed to the full pose features dataset
	UPROPERTY(EditAnywhere, Category = "Schema", meta = (DisplayPriority = 2))
	EPoseSearchDataPreprocessor DataPreprocessor = EPoseSearchDataPreprocessor::Normalize;
#endif //WITH_EDITORONLY_DATA

	UPROPERTY(Transient)
	int32 SchemaCardinality = 0;

#if WITH_EDITORONLY_DATA
	// How many times the animation assets of the database using this schema will be indexed.
	UPROPERTY(EditAnywhere, Category = "Permutations", meta = (ClampMin = "1"))
	int32 NumberOfPermutations = 1;

	// Delta time between every permutation indexing.
	UPROPERTY(EditAnywhere, Category = "Permutations", meta = (ClampMin = "1", ClampMax = "240", EditCondition = "NumberOfPermutations > 1", EditConditionHides))
	int32 PermutationsSampleRate = 30;

	// Starting offset of the "PermutationTime" from the "SamplingTime" of the first permutation.
	// subsequent permutations will have PermutationTime = SamplingTime + PermutationsTimeOffset + PermutationIndex / PermutationsSampleRate.
	UPROPERTY(EditAnywhere, Category = "Permutations")
	float PermutationsTimeOffset = 0.f;
#endif // WITH_EDITORONLY_DATA

	// if true a padding channel will be added to make sure the data is 16 bytes (aligned) and padded, to facilitate performance improvements at cost of eventual additional memory
	UPROPERTY(EditAnywhere, Category = "Performance")
	bool bAddDataPadding = false;

	// If bInjectAdditionalDebugChannels is true, channels will be asked to inject additional channels into this schema.
	// the original intent is to add UPoseSearchFeatureChannel_Position(s) to help with the complexity of the debug drawing
	// (the database will have all the necessary positions to draw lines at the right location and time).
	UPROPERTY(EditAnywhere, Category = "Debug")
	bool bInjectAdditionalDebugChannels;

...
...
...
C++

If we are implementing our own feature channel, the most important things to focus on are the ChannelCardinality and how the channel builds the query. The channel cardinality is the number of components the channel contributes to the final feature vector. For example, UPoseSearchFeatureChannel_Velocity tracks the velocity of a certain bone: with X, Y and Z the cardinality is 3, but if we choose to strip Z the cardinality becomes 2. The cardinality of the whole schema is updated in UPoseSearchFeatureChannel::Finalize(UPoseSearchSchema* Schema). We take UPoseSearchFeatureChannel_Velocity as an example to show how this is written.

PoseSearchFeatureChannel.h
// Feature channels interface
UCLASS(Abstract, BlueprintType, EditInlineNew)
class POSESEARCH_API UPoseSearchFeatureChannel : public UObject, public IBoneReferenceSkeletonProvider, public IPoseSearchFilter
{
	GENERATED_BODY()

public:
	int32 GetChannelCardinality() const { checkSlow(ChannelCardinality >= 0); return ChannelCardinality; }
	int32 GetChannelDataOffset() const { checkSlow(ChannelDataOffset >= 0); return ChannelDataOffset; }

	// Called during UPoseSearchSchema::Finalize to prepare the schema for this channel
	virtual bool Finalize(UPoseSearchSchema* Schema) PURE_VIRTUAL(UPoseSearchFeatureChannel::Finalize, return false;);
	
	// Called at runtime to add this channel's data to the query pose vector
	virtual void BuildQuery(UE::PoseSearch::FSearchContext& SearchContext) const PURE_VIRTUAL(UPoseSearchFeatureChannel::BuildQuery, );

	// UPoseSearchFeatureChannels can hold sub channels
	virtual TArrayView<TObjectPtr<UPoseSearchFeatureChannel>> GetSubChannels() { return TArrayView<TObjectPtr<UPoseSearchFeatureChannel>>(); }
	virtual TConstArrayView<TObjectPtr<UPoseSearchFeatureChannel>> GetSubChannels() const { return TConstArrayView<TObjectPtr<UPoseSearchFeatureChannel>>(); }

	virtual void AddDependentChannels(UPoseSearchSchema* Schema) const {}

	virtual EPermutationTimeType GetPermutationTimeType() const { return EPermutationTimeType::UseSampleTime; }
	static void GetPermutationTimeOffsets(EPermutationTimeType PermutationTimeType, float DesiredPermutationTimeOffset, float& OutPermutationSampleTimeOffset, float& OutPermutationOriginTimeOffset);
	...
	...
	...
C++
PoseSearchFeatureChannel_Velocity.cpp
bool UPoseSearchFeatureChannel_Velocity::Finalize(UPoseSearchSchema* Schema)
{
	ChannelDataOffset = Schema->SchemaCardinality;
	ChannelCardinality = UE::PoseSearch::FFeatureVectorHelper::GetVectorCardinality(ComponentStripping);
	Schema->SchemaCardinality += ChannelCardinality;

	SchemaBoneIdx = Schema->AddBoneReference(Bone, SampleRole);
	SchemaOriginBoneIdx = Schema->AddBoneReference(OriginBone, OriginRole);

	return SchemaBoneIdx >= 0 && SchemaOriginBoneIdx >= 0;
}

void UPoseSearchFeatureChannel_Velocity::BuildQuery(UE::PoseSearch::FSearchContext& SearchContext) const
{
	using namespace UE::PoseSearch;

	const bool bIsRootBone = SchemaBoneIdx == RootSchemaBoneIdx;
	if (bUseBlueprintQueryOverride)
	{
		const FVector LinearVelocityWorld = BP_GetWorldVelocity(SearchContext.GetAnimInstance(SampleRole));
		FVector LinearVelocity = SearchContext.GetSampleVelocity(SampleTimeOffset, OriginTimeOffset, SchemaBoneIdx, SchemaOriginBoneIdx, SampleRole, OriginRole, bUseCharacterSpaceVelocities, EPermutationTimeType::UseSampleTime, &LinearVelocityWorld);
		if (bNormalize)
		{
			LinearVelocity = LinearVelocity.GetClampedToMaxSize(1.f);
		}
		FFeatureVectorHelper::EncodeVector(SearchContext.EditFeatureVector(), ChannelDataOffset, LinearVelocity, ComponentStripping, false);
		return;
	}
	
	// The following are for optimization, we do not delve into them today.
C++

3. UPoseSearchDatabase

Next, let's take a look at UPoseSearchDatabase. This class addresses several issues: what animations we should input into the system, how these animations will be transformed into data, and how we can search through them. Why do we need to transform the input animations? First, in traditional animation systems, we are accustomed to viewing animations in terms of animation sequences. However, in Motion Matching, we consider the fundamental unit of the animation system to be animation frames. Therefore, we need to convert a large number of input animations into frames. Secondly, when we have defined a UPoseSearchSchema, we already know the position, velocity, rotation, and other information of the skeleton in each frame of these animations. We can pre-calculate feature vectors based on this information to reduce system overhead.
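To illustrate what "pre-calculating feature vectors" means in practice, here is a purely conceptual sketch in plain C++ (the type FPrecomputedFeatures is made up for this post and is not the real FSearchIndex): all per-frame feature values live in one flat float array, and fetching a frame's vector is just slicing that array.

#include <cstddef>
#include <cstdint>
#include <span>
#include <vector>

// Conceptual sketch only: the feature values of every indexed frame are computed once and
// stored contiguously, so at runtime a pose's feature vector is a contiguous slice of
// SchemaCardinality floats rather than something recomputed from the animation data.
struct FPrecomputedFeatures
{
	int32_t Cardinality = 0;    // floats per frame, i.e. the schema cardinality
	std::vector<float> Values;  // NumFrames * Cardinality floats

	std::span<const float> GetFrameValues(int32_t FrameIdx) const
	{
		return { Values.data() + FrameIdx * Cardinality, static_cast<std::size_t>(Cardinality) };
	}
};
C++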

Let's start with how we input our animations.

(1). FPoseSearchDatabaseAnimationAssetBase

An FPoseSearchDatabaseAnimationAssetBase wraps a regular animation asset in an intermediate format, making it easier to process further. The "regular animation" here can be one of several asset types; in a regular animation system we use UAnimSequence, UBlendSpace, UAnimComposite and so on. This explains why FPoseSearchDatabaseAnimationAssetBase is an abstract type: each of its subclasses corresponds to one type of animation asset.

We can treat FPoseSearchDatabaseAnimationAssetBase as a kind of metadata for an animation asset: it describes basic information such as the duration of the animation and its sampling range.

But remember, this is not an editor-only type, so calling it an "intermediate format" is not entirely accurate. These entries have to be kept at runtime because in motion matching we sometimes want the character to stay in a specific animation, for example a walking or running loop, so we reduce the cost of the continuing animation. To know which animation is the continuing one, we have to keep a reference to the original animation asset. We will talk about this later when explaining the core algorithm of motion matching.

You may notice an odd attribute called "Role". What is it? An FRole is actually an FName: in Unreal Engine 5.4 you can update multiple characters at the same time, and each character uses a unique Role (FName) to get its output animation. I find the design a bit strange and I'm not convinced it's a good idea to rely on it yet. We usually spawn characters at runtime, so does that mean we have to assign each one a role? And yet the Role is already defined in both FPoseSearchDatabaseAnimationAssetBase and UPoseSearchSchema. It's confusing, so we will ignore the Role in this post.

PoseSearchDatabase.h
USTRUCT()
struct POSESEARCH_API FPoseSearchDatabaseAnimationAssetBase
{
	GENERATED_BODY()
	virtual ~FPoseSearchDatabaseAnimationAssetBase() = default;
	virtual UObject* GetAnimationAsset() const { return nullptr; }
	virtual float GetPlayLength() const;
	virtual int32 GetNumRoles() const { return 1; }
	virtual UE::PoseSearch::FRole GetRole(int32 RoleIndex) const { return UE::PoseSearch::DefaultRole; }
	virtual UAnimationAsset* GetAnimationAssetForRole(const UE::PoseSearch::FRole& Role) const;
	virtual const FTransform& GetRootTransformOriginForRole(const UE::PoseSearch::FRole& Role) const;

#if WITH_EDITOR
	virtual int32 GetFrameAtTime(float Time) const;
#endif // WITH_EDITOR

#if WITH_EDITORONLY_DATA
	virtual bool IsDisableReselection() const { return bDisableReselection; }
	virtual void SetDisableReselection(bool bValue) { bDisableReselection = bValue; }
	virtual UClass* GetAnimationAssetStaticClass() const { return nullptr; }
	virtual bool IsLooping() const { return false; }
	virtual const FString GetName() const { return FString(); }
	virtual bool IsEnabled() const { return bEnabled; }
	virtual void SetIsEnabled(bool bValue) { bEnabled = bValue; }
	virtual bool IsRootMotionEnabled() const { return false; }
	virtual EPoseSearchMirrorOption GetMirrorOption() const { return MirrorOption; }

	// [0, 0] represents the entire frame range of the original animation.
	virtual FFloatInterval GetSamplingRange() const { return FFloatInterval(0.f, 0.f); }
	static FFloatInterval GetEffectiveSamplingRange(const UAnimSequenceBase* Sequence, const FFloatInterval& RequestedSamplingRange);

	virtual int64 GetEditorMemSize() const;
	virtual int64 GetApproxCookedSize() const { return GetEditorMemSize(); }

	// This allows users to enable or exclude animations from this database. Useful for debugging.
	UPROPERTY(EditAnywhere, Category = "Settings", meta = (DisplayPriority = 1))
	bool bEnabled = true;

	// if bDisableReselection is true, poses from the same asset cannot be reselected. Useful to avoid jumping on frames on the same looping animations
	UPROPERTY(EditAnywhere, Category = "Settings", meta = (ExcludeFromHash, DisplayPriority = 2))
	bool bDisableReselection = false;

	// This allows users to set if this animation is original only (no mirrored data), original and mirrored, or only the mirrored version of this animation.
	// It requires the mirror table to be set up in the database Schema.
	UPROPERTY(EditAnywhere, Category = "Settings", meta = (DisplayPriority = 3))
	EPoseSearchMirrorOption MirrorOption = EPoseSearchMirrorOption::UnmirroredOnly;

	// SynchronizeWithExternalDependency is true when this asset has been added via SynchronizeWithExternalDependencies.
	// To delete it, remove the PoseSearchBranchIn notify state
	UPROPERTY(VisibleAnywhere, Category = "Settings", meta = (DisplayPriority = 20))
	bool bSynchronizeWithExternalDependency = false;
#endif // WITH_EDITORONLY_DATA
};
C++

After creating a UPoseSearchDatabase, we can add FPoseSearchDatabaseAnimationAssetBase entries to it; the UPoseSearchDatabase instance holds an array of them.

image-7

Details of a UPoseSearchDatabase: a list of FPoseSearchDatabaseAnimationAssetBase entries, a details panel, and a preview scene. The preview scene can preview multiple animation assets at the same time.

image-8

After selecting a certain animation asset, you can edit its attributes. Note that what you are editing is already an FPoseSearchDatabaseAnimationAssetBase, not a UAnimSequence.

The simplest animation asset is FPoseSearchDatabaseSequence, which corresponds to UAnimSequence. Its implementation is easy to learn from:

PoseSearchDatabase.h
/** A sequence entry in a UPoseSearchDatabase. */
USTRUCT(BlueprintType, Category = "Animation|Pose Search")
struct POSESEARCH_API FPoseSearchDatabaseSequence : public FPoseSearchDatabaseAnimationAssetBase
{
	GENERATED_BODY()
	virtual ~FPoseSearchDatabaseSequence() = default;

	UPROPERTY(EditAnywhere, Category="Settings", meta = (DisplayPriority = 0))
	TObjectPtr<UAnimSequence> Sequence;

#if WITH_EDITORONLY_DATA
	// It allows users to set a time range to an individual animation sequence in the database. 
	// This is effectively trimming the beginning and end of the animation in the database (not in the original sequence).
	// If set to [0, 0] it will be the entire frame range of the original sequence.
	UPROPERTY(EditAnywhere, Category="Settings", meta = (DisplayPriority = 2))
	FFloatInterval SamplingRange = FFloatInterval(0.f, 0.f);

	virtual UClass* GetAnimationAssetStaticClass() const override;
	virtual bool IsLooping() const override;
	virtual const FString GetName() const override;
	virtual bool IsRootMotionEnabled() const override;
	virtual FFloatInterval GetSamplingRange() const override { return SamplingRange; }
#endif // WITH_EDITORONLY_DATA
	
	virtual UObject* GetAnimationAsset() const override;
};
C++
PoseSearchDatabase.cpp
//////////////////////////////////////////////////////////////////////////
// FPoseSearchDatabaseSequence
UObject* FPoseSearchDatabaseSequence::GetAnimationAsset() const
{
	return Sequence.Get();
}

#if WITH_EDITORONLY_DATA
UClass* FPoseSearchDatabaseSequence::GetAnimationAssetStaticClass() const
{
	return UAnimSequence::StaticClass();
}

bool FPoseSearchDatabaseSequence::IsLooping() const
{
	return Sequence &&
		Sequence->bLoop &&
		SamplingRange.Min == 0.f &&
		SamplingRange.Max == 0.f;
}

const FString FPoseSearchDatabaseSequence::GetName() const
{
	return Sequence ? Sequence->GetName() : FString();
}

bool FPoseSearchDatabaseSequence::IsRootMotionEnabled() const
{
	return Sequence ? Sequence->HasRootMotion() : false;
}
#endif // WITH_EDITORONLY_DATA

//////////////////////////////////////////////////////////////////////////
C++

(2). FSearchIndex

A UPoseSearchDatabase is used to index the animations and to search for the best result under a given schema. The database also defines some attributes that control the search; their usage is described in the comments.

PoseSearchDatabase.h
/** A data asset for indexing a collection of animation sequences. */
UCLASS(BlueprintType, Category = "Animation|Pose Search", meta = (DisplayName = "Pose Search Database"))
class POSESEARCH_API UPoseSearchDatabase : public UDataAsset
{
	GENERATED_BODY()
public:

	// The Schema sets what channels this database will use to match against (bones, trajectory and what properties of those you’re interested in, such as position and velocity).
	UPROPERTY(EditAnywhere, BlueprintReadOnly, Category="Database")
	TObjectPtr<const UPoseSearchSchema> Schema;

	// Cost added to the continuing pose from this database. This allows users to apply a cost bias (positive or negative) to the continuing pose.
	// This is useful to help the system stay in one animation segment longer, or shorter depending on how you set this bias.
	// Negative values make it more likely to be picked, or stayed in, positive values make it less likely to be picked or stay in.
	// Note: excluded from DDC hash, since used only at runtime in SearchContinuingPose
	UPROPERTY(EditAnywhere, Category = "Database", meta = (ExcludeFromHash)) 
	float ContinuingPoseCostBias = -0.01f;

	// Base Cost added or removed to all poses from this database. It can be overridden by Anim Notify: Pose Search Modify Cost at the frame level of animation data.
	// Negative values make it more likely to be picked, or stayed in, Positive values make it less likely to be picked or stay in.
	UPROPERTY(EditAnywhere, Category = "Database")
	float BaseCostBias = 0.f;

	// Cost added to all looping animation assets in this database. This allows users to make it more or less likely to pick the looping animation segments.
	// Negative values make it more likely to be picked, or stayed in, Positive values make it less likely to be picked or stay in.
	UPROPERTY(EditAnywhere, Category = "Database")
	float LoopingCostBias = -0.005f;

#if WITH_EDITORONLY_DATA
	// These settings allow users to trim the start and end of animations in the database to preserve start/end frames for blending, and prevent the system from selecting the very last frames before it blends out.
	// valid animation frames will be AnimationAssetTimeStart + ExcludeFromDatabaseParameters.Min, AnimationAssetTimeEnd + ExcludeFromDatabaseParameters.Max
	UPROPERTY(EditAnywhere, Category = "Database", meta = (AllowInvertedInterval))
	FFloatInterval ExcludeFromDatabaseParameters = FFloatInterval(0.f, -0.3f);
	
....
....
....
C++

It holds an FSearchIndex, which stores the pre-calculated runtime data for the animation frames. As I said, for a given schema we can pre-calculate the feature vector of each frame in the animation database. Since the database holds a schema, the system pre-calculates these results and stores them in an FSearchIndex.

PoseSearchDatabase.h
...

private:
	// Do not use it directly. Use GetSearchIndex / SetSearchIndex interact with it and validate that is ok to do so.
	UE::PoseSearch::FSearchIndex SearchIndexPrivate;

#if WITH_EDITOR
	DECLARE_MULTICAST_DELEGATE(FOnDerivedDataRebuildMulticaster);
	FOnDerivedDataRebuildMulticaster OnDerivedDataRebuild;

	DECLARE_MULTICAST_DELEGATE(FOnSynchronizeWithExternalDependenciesMulticaster);
	FOnSynchronizeWithExternalDependenciesMulticaster OnSynchronizeWithExternalDependencies;
#endif // WITH_EDITOR

...
C++

When we save a UPoseSearchDatabase asset in Unreal Engine, the engine eventually calls PostSaveRoot, an interface inherited from UObject. Here the database calls FAsyncPoseSearchDatabasesManagement::RequestAsyncBuildIndex, which starts the process of indexing the animations.

PoseSearchDatabase.cpp
void UPoseSearchDatabase::PostSaveRoot(FObjectPostSaveRootContext ObjectSaveContext)
{
#if WITH_EDITOR
	using namespace UE::PoseSearch;
	if (!IsTemplate() && !ObjectSaveContext.IsProceduralSave())
	{
		FAsyncPoseSearchDatabasesManagement::RequestAsyncBuildIndex(this, ERequestAsyncBuildFlag::NewRequest | ERequestAsyncBuildFlag::WaitForCompletion);
	}
#endif

	Super::PostSaveRoot(ObjectSaveContext);
}
C++

This kicks off a long process spanning thousands of lines of code. To keep this post simple and focused on the core algorithm of Motion Matching, I made a simple diagram to give you a brief understanding of it.

image-9

Process of indexing animations

So now you know: the process eventually calls UPoseSearchFeatureChannel::IndexAsset on each channel. Let's take UPoseSearchFeatureChannel_Velocity as an example again:

PoseSearchFeatureChannel_Velocity.cpp
bool UPoseSearchFeatureChannel_Velocity::IndexAsset(UE::PoseSearch::FAssetIndexer& Indexer) const
{
	using namespace UE::PoseSearch;

	FVector LinearVelocity;
	for (int32 SampleIdx = Indexer.GetBeginSampleIdx(); SampleIdx != Indexer.GetEndSampleIdx(); ++SampleIdx)
	{
		if (Indexer.GetSampleVelocity(LinearVelocity, SampleTimeOffset, OriginTimeOffset, SampleIdx, SchemaBoneIdx, SchemaOriginBoneIdx, SampleRole, OriginRole, bUseCharacterSpaceVelocities, PermutationTimeType, SamplingAttributeId))
		{
			if (bNormalize)
			{
				LinearVelocity = LinearVelocity.GetClampedToMaxSize(1.f);
			}
			FFeatureVectorHelper::EncodeVector(Indexer.GetPoseVector(SampleIdx), ChannelDataOffset, LinearVelocity, ComponentStripping, false);
		}
		else
		{
			return false;
		}
	}
	return true;
}
C++

After this is done, we can quickly get the feature vector of an animation frame with:

SearchIndex.cpp
TConstArrayView<float> GetPoseValuesBase(int32 PoseIdx, int32 DataCardinality) const
	{
		check(!IsValuesEmpty() && PoseIdx >= 0 && PoseIdx < GetNumPoses());
		check(Values.Num() % DataCardinality == 0);
		const int32 ValueOffset = PoseMetadata[PoseIdx].GetValueOffset();
		return MakeArrayView(&Values[ValueOffset], DataCardinality);
	}

	TConstArrayView<float> GetValuesVector(int32 ValuesVectorIdx, int32 DataCardinality) const
	{
		check(!IsValuesEmpty() && ValuesVectorIdx >= 0 && ValuesVectorIdx < GetNumValuesVectors(DataCardinality));
		const int32 ValueOffset = ValuesVectorIdx * DataCardinality;
		return MakeArrayView(&Values[ValueOffset], DataCardinality);
	}
C++

4. Core Algorithm

Finally, the core algorithm. Let's focus on the animation node FAnimNode_MotionMatching; the core algorithm of motion matching is invoked from FAnimNode_MotionMatching::UpdateAssetPlayer(const FAnimationUpdateContext& Context). A motion matching animation node holds a database and a state struct; the latter stores the essential input and output data of motion matching.

AnimNode_MotionMatching.h
...
// Encapsulated motion matching algorithm and internal state
FMotionMatchingState MotionMatchingState;

// Update Counter for detecting being relevant
FGraphTraversalCounter UpdateCounter;

// List of databases this node is searching.
UPROPERTY()
TArray<TObjectPtr<const UPoseSearchDatabase>> DatabasesToSearch;
...
C++
PoseSearchLibrary.h
struct FMotionMatchingState
{
	// Reset the state to a default state using the current Database
	void Reset(const FTransform& ComponentTransform);

	// Attempts to set the internal state to match the provided asset time including updating the internal DbPoseIdx. 
	// If the provided asset time is out of bounds for the currently playing asset then this function will reset the 
	// state back to the default state.
	void AdjustAssetTime(float AssetTime);

	// Internally stores the 'jump' to a new pose/sequence index and asset time for evaluation
	void JumpToPose(const FAnimationUpdateContext& Context, const UE::PoseSearch::FSearchResult& Result, int32 MaxActiveBlends, float BlendTime);

	void UpdateWantedPlayRate(const UE::PoseSearch::FSearchContext& SearchContext, const FFloatInterval& PlayRate, float TrajectorySpeedMultiplier);

	FVector GetEstimatedFutureRootMotionVelocity() const;

	UE::PoseSearch::FSearchResult CurrentSearchResult;

	// Time since the last pose jump
	float ElapsedPoseSearchTime = 0.f;

	// wanted PlayRate to have the selected animation playing at the estimated requested speed from the query.
	float WantedPlayRate = 1.f;

	// true if a new animation has been selected
	bool bJumpedToPose = false;

	UE::PoseSearch::FPoseIndicesHistory PoseIndicesHistory;

	// Component delta yaw (also considered as root bone delta yaw)
	float ComponentDeltaYaw = 0.f;

	// Internal component yaw in world space. Initialized as FRotator(AnimInstanceProxy->GetComponentTransform().GetRotation()).Yaw, but then integrated by ComponentDeltaYaw
	float ComponentWorldYaw = 0.f;
	
	// RootMotionTransformDelta yaw at the end of FAnimNode_MotionMatching::Evaluate_AnyThread (it represents the previous frame animation delta yaw)
	float AnimationDeltaYaw = 0.f;

#if UE_POSE_SEARCH_TRACE_ENABLED
	// Root motion delta for currently playing animation (or animation tree if from the blend stack)
	FTransform RootMotionTransformDelta = FTransform::Identity;
#endif //UE_POSE_SEARCH_TRACE_ENABLED
};
C++

Let's take a look at the core function of this node. Note that the debug and editor-only code is not essential here, so feel free to skim over it:

AnimNode_MotionMatching.cpp
void FAnimNode_MotionMatching::UpdateAssetPlayer(const FAnimationUpdateContext& Context)
{
	DECLARE_SCOPE_HIERARCHICAL_COUNTER_ANIMNODE(UpdateAssetPlayer);

	QUICK_SCOPE_CYCLE_COUNTER(STAT_PoseSearch_UpdateAssetPlayer);

	using namespace UE::PoseSearch;

	GetEvaluateGraphExposedInputs().Execute(Context);

	bool bNeedsReset =
		bResetOnBecomingRelevant &&
		UpdateCounter.HasEverBeenUpdated() &&
		!UpdateCounter.WasSynchronizedCounter(Context.AnimInstanceProxy->GetUpdateCounter());

#if WITH_EDITOR
	if (EAsyncBuildIndexResult::Success != FAsyncPoseSearchDatabasesManagement::RequestAsyncBuildIndex(MotionMatchingState.CurrentSearchResult.Database.Get(), ERequestAsyncBuildFlag::ContinueRequest))
	{
		bNeedsReset = true;
	}
	// in case this node is not updated, and MotionMatchingState.CurrentSearchResult.Database gets modified, we could end up with CurrentSearchResult being out of synch with the updated database, so we need to reset the state
	else if (MotionMatchingState.CurrentSearchResult.IsValid() && MotionMatchingState.CurrentSearchResult.PoseIdx >= MotionMatchingState.CurrentSearchResult.Database->GetSearchIndex().GetNumPoses())
	{
		bNeedsReset = true;
	}
#endif // WITH_EDITOR

	// If we just became relevant and haven't been initialized yet, then reset motion matching state, otherwise update the asset time using the player node.
	if (bNeedsReset)
	{
		MotionMatchingState.Reset(Context.AnimInstanceProxy->GetComponentTransform());
		FAnimNode_BlendStack_Standalone::Reset();
	}
	else
	{
		// We adjust the motion matching state asset time to the current player node's asset time. This is done 
		// because the player node may have ticked more or less time than we expected due to variable dt or the 
		// dynamic playback rate adjustment and as such the motion matching state does not update by itself
		MotionMatchingState.AdjustAssetTime(GetAccumulatedTime());
	}
	UpdateCounter.SynchronizeWith(Context.AnimInstanceProxy->GetUpdateCounter());

	// If the Database property hasn't been overridden, set it as the only database to search.
	if (!bOverrideDatabaseInput && Database)
	{
		DatabasesToSearch.Reset(1);
		DatabasesToSearch.Add(Database);
	}

#if ENABLE_ANIM_DEBUG
	if (CVarAnimNodeMotionMatchingDrawInfo.GetValueOnAnyThread())
	{
		const UPoseSearchDatabase* CurrentDatabase = MotionMatchingState.CurrentSearchResult.Database.Get();
		const UAnimationAsset* CurrentAnimationAsset = AnimPlayers.IsEmpty() ? nullptr : AnimPlayers[0].GetAnimationAsset();

		FString DebugInfo = FString::Printf(TEXT("NextUpdateInterruptMode(%s)\n"), *UEnum::GetValueAsString(NextUpdateInterruptMode));
		DebugInfo += FString::Printf(TEXT("Current Database(%s)\n"), *GetNameSafe(CurrentDatabase));
		DebugInfo += FString::Printf(TEXT("Current Asset(%s)\n"), *GetNameSafe(CurrentAnimationAsset));
		if (CVarAnimNodeMotionMatchingDrawInfoVerbose.GetValueOnAnyThread())
		{
			DebugInfo += FString::Printf(TEXT("Databases to search:\n"));
			for (const UPoseSearchDatabase* DatabaseToSearch : DatabasesToSearch)
			{
				DebugInfo += FString::Printf(TEXT("  %s\n"), *GetNameSafe(DatabaseToSearch));
			}
			DebugInfo += FString::Printf(TEXT("Blend Stack:\n"));
			for (const FBlendStackAnimPlayer& AnimPlayer : AnimPlayers)
			{
				DebugInfo += FString::Printf(TEXT("  %s [time:%.2f|playrate:%.2f]\n"), *GetNameSafe(AnimPlayer.GetAnimationAsset()), AnimPlayer.GetAccumulatedTime(), AnimPlayer.GetPlayRate());
			}
		}
		Context.AnimInstanceProxy->AnimDrawDebugInWorldMessage(DebugInfo, FVector::UpVector * CVarAnimNodeMotionMatchingDrawInfoHeight.GetValueOnAnyThread(), FColor::Yellow, 1.f /*TextScale*/);
	}
#endif // ENABLE_ANIM_DEBUG

	// Execute core motion matching algorithm
	UPoseSearchLibrary::UpdateMotionMatchingState(
		Context,
		DatabasesToSearch,
		BlendTime,
		MaxActiveBlends,
		PoseJumpThresholdTime,
		PoseReselectHistory,
		SearchThrottleTime,
		PlayRate,
		MotionMatchingState,
		NextUpdateInterruptMode,
		bShouldSearch,
		bShouldUseCachedChannelData
		#if ENABLE_ANIM_DEBUG
		, CVarAnimNodeMotionMatchingDrawQuery.GetValueOnAnyThread()
		, CVarAnimNodeMotionMatchingDrawCurResult.GetValueOnAnyThread()
		#endif // ENABLE_ANIM_DEBUG
	);

	UE::Anim::FNodeFunctionCaller::CallFunction(GetOnUpdateMotionMatchingStateFunction(), Context, *this);

	// If a new pose is requested, blend into the new asset via BlendStackNode
	if (MotionMatchingState.bJumpedToPose)
	{
		const FSearchIndexAsset* SearchIndexAsset = MotionMatchingState.CurrentSearchResult.GetSearchIndexAsset();
		const UPoseSearchDatabase* CurrentResultDatabase = MotionMatchingState.CurrentSearchResult.Database.Get();
		if (SearchIndexAsset && CurrentResultDatabase && CurrentResultDatabase->Schema)
		{
			const FPoseSearchDatabaseAnimationAssetBase* DatabaseAsset = CurrentResultDatabase->GetAnimationAssetBase(*SearchIndexAsset);
			check(DatabaseAsset);

			if (UAnimationAsset* AnimationAsset = Cast<UAnimationAsset>(DatabaseAsset->GetAnimationAsset()))
			{
				FAnimNode_BlendStack_Standalone::BlendTo(Context, AnimationAsset, MotionMatchingState.CurrentSearchResult.AssetTime,
					SearchIndexAsset->IsLooping(), SearchIndexAsset->IsMirrored(), CurrentResultDatabase->Schema->GetMirrorDataTable(DefaultRole), BlendTime,
					BlendProfile, BlendOption, bUseInertialBlend, SearchIndexAsset->GetBlendParameters(), MotionMatchingState.WantedPlayRate);
			}
			else
			{
				checkNoEntry();
			}
		}
	}

	const bool bDidBlendToRequestAnInertialBlend = MotionMatchingState.bJumpedToPose && bUseInertialBlend;
	UE::Anim::TOptionalScopedGraphMessage<UE::Anim::FAnimInertializationSyncScope> InertializationSync(bDidBlendToRequestAnInertialBlend, Context);
	
	FAnimNode_BlendStack_Standalone::UpdatePlayRate(MotionMatchingState.WantedPlayRate);
	FAnimNode_BlendStack_Standalone::UpdateAssetPlayer(Context);

	NextUpdateInterruptMode = EPoseSearchInterruptMode::DoNotInterrupt;
}
C++

First of all, it adjusts the asset time of the motion matching state, which allows the animation node to play the proper frame. Then it calls the method:

void UPoseSearchLibrary::UpdateMotionMatchingState(
const FAnimationUpdateContext& Context,
const TArray<TObjectPtr<const UPoseSearchDatabase>>& Databases,
float BlendTime,
int32 MaxActiveBlends,
const FFloatInterval& PoseJumpThresholdTime,
float PoseReselectHistory,
float SearchThrottleTime,
const FFloatInterval& PlayRate,
FMotionMatchingState& InOutMotionMatchingState,
EPoseSearchInterruptMode InterruptMode,
bool bShouldSearch,
bool bShouldUseCachedChannelData,
bool bDebugDrawQuery,
bool bDebugDrawCurResult)

After this call, the node simply updates its data with the result. So let's focus on the core method, which starts the core algorithm of motion matching. It is also a complicated method, so I have completely removed the debugging, logging and editor-only code here:

PoseSearchLibrary.cpp
void UPoseSearchLibrary::UpdateMotionMatchingState(
	const FAnimationUpdateContext& Context,
	const TArray<TObjectPtr<const UPoseSearchDatabase>>& Databases,
	float BlendTime,
	int32 MaxActiveBlends,
	const FFloatInterval& PoseJumpThresholdTime,
	float PoseReselectHistory,
	float SearchThrottleTime,
	const FFloatInterval& PlayRate,
	FMotionMatchingState& InOutMotionMatchingState,
	EPoseSearchInterruptMode InterruptMode,
	bool bShouldSearch,
	bool bShouldUseCachedChannelData,
	bool bDebugDrawQuery,
	bool bDebugDrawCurResult)
{
	QUICK_SCOPE_CYCLE_COUNTER(STAT_PoseSearch_Update);

	using namespace UE::PoseSearch;

	check(Context.AnimInstanceProxy);

	if (Databases.IsEmpty())
	{
		Context.LogMessage(
			EMessageSeverity::Error,
			LOCTEXT("NoDatabases", "No database assets provided for motion matching."));
		return;
	}

	const float DeltaTime = Context.GetDeltaTime();

	InOutMotionMatchingState.bJumpedToPose = false;

	// used when YawFromAnimationBlendRate is greater than zero, by setting a future (YawFromAnimationTrajectoryBlendTime seconds ahead) root bone to the skeleton default
	const IPoseHistory* PoseHistory = nullptr;
	if (IPoseHistoryProvider* PoseHistoryProvider = Context.GetMessage<IPoseHistoryProvider>())
	{
		PoseHistory = &PoseHistoryProvider->GetPoseHistory();
	}

	FMemMark Mark(FMemStack::Get());
	const UAnimInstance* AnimInstance = Cast<const UAnimInstance>(Context.AnimInstanceProxy->GetAnimInstanceObject());
	check(AnimInstance);

	const UPoseSearchDatabase* CurrentResultDatabase = InOutMotionMatchingState.CurrentSearchResult.Database.Get();
	if (IsInvalidatingContinuingPose(InterruptMode, CurrentResultDatabase, Databases))
	{
		InOutMotionMatchingState.CurrentSearchResult.Reset();
	}

	FSearchContext SearchContext(0.f, &InOutMotionMatchingState.PoseIndicesHistory, InOutMotionMatchingState.CurrentSearchResult, PoseJumpThresholdTime);
	SearchContext.AddRole(DefaultRole, AnimInstance, PoseHistory);

	const bool bCanAdvance = InOutMotionMatchingState.CurrentSearchResult.CanAdvance(DeltaTime);

	// If we can't advance or enough time has elapsed since the last pose jump then search
	const bool bSearch = !bCanAdvance || (bShouldSearch && (InOutMotionMatchingState.ElapsedPoseSearchTime >= SearchThrottleTime));
	if (bSearch)
	{
		InOutMotionMatchingState.ElapsedPoseSearchTime = 0.f;
		const bool bForceInterrupt = IsForceInterrupt(InterruptMode, CurrentResultDatabase, Databases);
		const bool bSearchContinuingPose = !bForceInterrupt && bCanAdvance;

		// calculating if it's worth bUseCachedChannelData (if we potentially have to build query with multiple schemas)
		SearchContext.SetUseCachedChannelData(bShouldUseCachedChannelData && ShouldUseCachedChannelData(bSearchContinuingPose ? CurrentResultDatabase : nullptr, Databases));

		FSearchResult SearchResult;
		// Evaluate continuing pose
		if (bSearchContinuingPose)
		{
			SearchResult = CurrentResultDatabase->SearchContinuingPose(SearchContext);
			SearchContext.UpdateCurrentBestCost(SearchResult.PoseCost);
		}

		bool bJumpToPose = false;
		for (const TObjectPtr<const UPoseSearchDatabase>& Database : Databases)
		{
			if (ensure(Database))
			{
				FSearchResult NewSearchResult = Database->Search(SearchContext);
				if (NewSearchResult.PoseCost.GetTotalCost() < SearchResult.PoseCost.GetTotalCost())
				{
					bJumpToPose = true;
					SearchResult = NewSearchResult;
					SearchContext.UpdateCurrentBestCost(SearchResult.PoseCost);
				}
			}
		}
		if (bJumpToPose)
		{
			InOutMotionMatchingState.JumpToPose(Context, SearchResult, MaxActiveBlends, BlendTime);
		}
		else
		{
			// copying few properties of SearchResult into CurrentSearchResult to facilitate debug drawing
			InOutMotionMatchingState.CurrentSearchResult.PoseCost = SearchResult.PoseCost;
		}
	}
	else
	{
		InOutMotionMatchingState.ElapsedPoseSearchTime += DeltaTime;
	}

	// @todo: consider moving this into if (bSearch) to avoid calling SearchContext.GetCachedQuery if no search is required
	InOutMotionMatchingState.UpdateWantedPlayRate(SearchContext, PlayRate, PoseHistory ? PoseHistory->GetTrajectorySpeedMultiplier() : 1.f);

	InOutMotionMatchingState.PoseIndicesHistory.Update(InOutMotionMatchingState.CurrentSearchResult, DeltaTime, PoseReselectHistory);
}
C++

It begins by getting the pose history and creating an FSearchContext. The pose search history, also known as the pose trajectory, requires a dedicated animation node. If you have watched tutorials made with older versions of Unreal Engine, you may have seen a MotionTrajectory component being used; it is no longer required by Pose Search, since the Pose History node is now independent of the MotionTrajectory plugin:

Using Motion Matching in Animation Blueprint

The implementation of the Pose History node is given in the appendix of this post on page 2.

Then the algorithm enters phase 2: searching for the best result in the database. There are two different methods here, SearchContinuingPose and Search. The former requires that the current animation can still advance, and it grants the continuing animation the cost advantage we set up in the UPoseSearchDatabase.

PoseSearchLibrary.cpp
...
...

const bool bCanAdvance = InOutMotionMatchingState.CurrentSearchResult.CanAdvance(DeltaTime);

// If we can't advance or enough time has elapsed since the last pose jump then search
const bool bSearch = !bCanAdvance || (bShouldSearch && (InOutMotionMatchingState.ElapsedPoseSearchTime >= SearchThrottleTime));
if (bSearch)
{
	InOutMotionMatchingState.ElapsedPoseSearchTime = 0.f;
	const bool bForceInterrupt = IsForceInterrupt(InterruptMode, CurrentResultDatabase, Databases);
	const bool bSearchContinuingPose = !bForceInterrupt && bCanAdvance;

	// calculating if it's worth bUseCachedChannelData (if we potentially have to build query with multiple schemas)
	SearchContext.SetUseCachedChannelData(bShouldUseCachedChannelData && ShouldUseCachedChannelData(bSearchContinuingPose ? CurrentResultDatabase : nullptr, Databases));

	FSearchResult SearchResult;
	// Evaluate continuing pose
	if (bSearchContinuingPose)
	{
		SearchResult = CurrentResultDatabase->SearchContinuingPose(SearchContext);
		SearchContext.UpdateCurrentBestCost(SearchResult.PoseCost);
	}
	
...
...
C++
PoseSearchDatabase.cpp
UE::PoseSearch::FSearchResult UPoseSearchDatabase::SearchContinuingPose(UE::PoseSearch::FSearchContext& SearchContext) const
{
	QUICK_SCOPE_CYCLE_COUNTER(STAT_PoseSearch_ContinuingPose);

	using namespace UE::PoseSearch;

	check(SearchContext.GetCurrentResult().Database.Get() == this);

	FSearchResult Result;
	Result.bIsContinuingPoseSearch = true;

#if WITH_EDITOR
	if (EAsyncBuildIndexResult::Success != FAsyncPoseSearchDatabasesManagement::RequestAsyncBuildIndex(this, ERequestAsyncBuildFlag::ContinueRequest))
	{
		SearchContext.SetAsyncBuildIndexInProgress();
		return Result;
	}
#endif // WITH_EDITOR

	// extracting notifies from the database animation asset at time SampleTime to search for UAnimNotifyState_PoseSearchOverrideContinuingPoseCostBias eventually overriding the database ContinuingPoseCostBias
	const FSearchIndex& SearchIndex = GetSearchIndex();
	const int32 PoseIdx = SearchContext.GetCurrentResult().PoseIdx;
	const FSearchIndexAsset& SearchIndexAsset = SearchIndex.GetAssetForPose(PoseIdx);
	const FPoseSearchDatabaseAnimationAssetBase* DatabaseAnimationAssetBase = GetAnimationAssetStruct(SearchIndexAsset).GetPtr<FPoseSearchDatabaseAnimationAssetBase>();
	check(DatabaseAnimationAssetBase);
	const UAnimationAsset* AnimationAsset = CastChecked<UAnimationAsset>(DatabaseAnimationAssetBase->GetAnimationAsset());
	
	// sampler used only to extract the notify states. RootTransformOrigin can be set as Identity, since will not be relevant
	const FAnimationAssetSampler SequenceBaseSampler(AnimationAsset, FTransform::Identity, SearchIndexAsset.GetBlendParameters());
	const float SampleTime = GetRealAssetTime(PoseIdx);

	float UpdatedContinuingPoseCostBias = ContinuingPoseCostBias;
	SequenceBaseSampler.ExtractPoseSearchNotifyStates(SampleTime, [&UpdatedContinuingPoseCostBias](const UAnimNotifyState_PoseSearchBase* PoseSearchNotify)
		{
			if (const UAnimNotifyState_PoseSearchOverrideContinuingPoseCostBias* ContinuingPoseCostBiasNotify = Cast<const UAnimNotifyState_PoseSearchOverrideContinuingPoseCostBias>(PoseSearchNotify))
			{
				UpdatedContinuingPoseCostBias = ContinuingPoseCostBiasNotify->CostAddend;
				return false;
			}
			return true;
		});

	// since any PoseCost calculated here is at least SearchIndex.MinCostAddend + UpdatedContinuingPoseCostBias,
	// there's no point in performing the search if CurrentBestTotalCost is already better than that
	if (!GetSkipSearchIfPossible() || SearchContext.GetCurrentBestTotalCost() > SearchIndex.MinCostAddend + UpdatedContinuingPoseCostBias)
	{
		const int32 NumDimensions = Schema->SchemaCardinality;
		// FMemory_Alloca is forced 16 bytes aligned
		TArrayView<float> ReconstructedPoseValuesBuffer((float*)FMemory_Alloca(NumDimensions * sizeof(float)), NumDimensions);
		check(IsAligned(ReconstructedPoseValuesBuffer.GetData(), alignof(VectorRegister4Float)));
		const TConstArrayView<float> PoseValues = SearchIndex.IsValuesEmpty() ? SearchIndex.GetReconstructedPoseValues(PoseIdx, ReconstructedPoseValuesBuffer) : SearchIndex.GetPoseValues(PoseIdx);

		const int32 ContinuingPoseIdx = SearchContext.GetCurrentResult().PoseIdx;
		// is the data padded at 16 bytes (and 16 bytes aligned by construction)?
		if (NumDimensions % 4 == 0)
		{
			Result.PoseCost = SearchIndex.CompareAlignedPoses(ContinuingPoseIdx, UpdatedContinuingPoseCostBias, PoseValues, SearchContext.GetOrBuildQuery(Schema));
		}
		// data is not 16 bytes padded
		else
		{
			Result.PoseCost = SearchIndex.ComparePoses(ContinuingPoseIdx, UpdatedContinuingPoseCostBias, PoseValues, SearchContext.GetOrBuildQuery(Schema));
		}

		Result.AssetTime = SearchContext.GetCurrentResult().AssetTime;
		Result.PoseIdx = PoseIdx;
		Result.Database = this;

#if UE_POSE_SEARCH_TRACE_ENABLED
		SearchContext.Track(this, ContinuingPoseIdx, EPoseCandidateFlags::Valid_ContinuingPose, Result.PoseCost);
#endif // UE_POSE_SEARCH_TRACE_ENABLED
	}

	return Result;
}
C++

The method SearchContinuingPose looks complicated but actually does a simple thing: it evaluates the continuing pose and applies the cost-reducing bias to it. The method Search is much more complex: it searches the whole animation database for the best result. Since the continuing pose has already been evaluated, every candidate in the general search is at a relative disadvantage, which makes the system more likely to stay in one animation and avoid switching animations too frequently (editor-only and debugging code removed):

PoseSearchDatabase.cpp
UE::PoseSearch::FSearchResult UPoseSearchDatabase::Search(UE::PoseSearch::FSearchContext& SearchContext) const
{
	using namespace UE::PoseSearch;

	FSearchResult Result;

	if (PoseSearchMode == EPoseSearchMode::BruteForce)
	{
		Result = SearchBruteForce(SearchContext);
	}

	if (PoseSearchMode == EPoseSearchMode::VPTree)
	{
		Result = SearchVPTree(SearchContext);

	}
	else if (PoseSearchMode == EPoseSearchMode::PCAKDTree)
	{
		Result = SearchPCAKDTree(SearchContext);
	}
	return Result;
}
C++

This method chooses the internal search strategy based on the user's option. VPTree and PCA-KDTree are optimized alternatives to the brute-force search. However, to keep the explanation easy to follow, we will walk through the slowest but simplest method. If you don't know what a VP tree or a KD tree is, I have appended some references in the appendix on page 2.

PoseSearchDatabase.cpp
UE::PoseSearch::FSearchResult UPoseSearchDatabase::SearchBruteForce(UE::PoseSearch::FSearchContext& SearchContext) const
{
	SCOPE_CYCLE_COUNTER(STAT_PoseSearch_BruteForce);
	
	using namespace UE::PoseSearch;
	
	FSearchResult Result;

	const FSearchIndex& SearchIndex = GetSearchIndex();

	// since any PoseCost calculated here is at least SearchIndex.MinCostAddend,
	// there's no point in performing the search if CurrentBestTotalCost is already better than that
	if (!GetSkipSearchIfPossible() || SearchContext.GetCurrentBestTotalCost() > SearchIndex.MinCostAddend)
	{
		TConstArrayView<float> QueryValues = SearchContext.GetOrBuildQuery(Schema);

		FSelectableAssetIdx SelectableAssetIdx;
		PopulateSelectableAssetIdx(SelectableAssetIdx, SearchContext.GetAssetsToConsider(), this);

		FNonSelectableIdx NonSelectableIdx;
		PopulateNonSelectableIdx(NonSelectableIdx, SearchContext, this
#if UE_POSE_SEARCH_TRACE_ENABLED
			, QueryValues
#endif // UE_POSE_SEARCH_TRACE_ENABLED
		);

		const int32 NumDimensions = Schema->SchemaCardinality;
		const bool bUpdateBestCandidates = PoseSearchMode == EPoseSearchMode::BruteForce;

		const FSearchFilters SearchFilters(Schema, NonSelectableIdx, FSelectableAssetIdx(), SearchIndex.bAnyBlockTransition);

		if (SelectableAssetIdx.IsEmpty())
		{
			// do we need to reconstruct pose values?
			if (SearchIndex.IsValuesEmpty())
			{
				// FMemory_Alloca is forced 16 bytes aligned
				TArrayView<float> ReconstructedPoseValuesBuffer((float*)FMemory_Alloca(NumDimensions * sizeof(float)), NumDimensions);
				check(IsAligned(ReconstructedPoseValuesBuffer.GetData(), alignof(VectorRegister4Float)));
				for (int32 PoseIdx = 0; PoseIdx < SearchIndex.GetNumPoses(); ++PoseIdx)
				{
					EvaluatePoseKernel<true, false>(Result, SearchIndex, QueryValues, ReconstructedPoseValuesBuffer, PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, PoseIdx);
				}
			}
			// is the data padded at 16 bytes (and 16 bytes aligned by construction)?
			else if (NumDimensions % 4 == 0)
			{
				for (int32 PoseIdx = 0; PoseIdx < SearchIndex.GetNumPoses(); ++PoseIdx)
				{
					EvaluatePoseKernel<false, true>(Result, SearchIndex, QueryValues, TArrayView<float>(), PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, PoseIdx);
				}
			}
			// no reconstruction, but data is not 16 bytes padded
			else
			{
				for (int32 PoseIdx = 0; PoseIdx < SearchIndex.GetNumPoses(); ++PoseIdx)
				{
					EvaluatePoseKernel<false, false>(Result, SearchIndex, QueryValues, TArrayView<float>(), PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, PoseIdx);
				}
			}
		}
		else
		{
			int32 ResultIndex = -1;

			// do we need to reconstruct pose values?
			if (SearchIndex.IsValuesEmpty())
			{
				// FMemory_Alloca is forced 16 bytes aligned
				TArrayView<float> ReconstructedPoseValuesBuffer((float*)FMemory_Alloca(NumDimensions * sizeof(float)), NumDimensions);
				check(IsAligned(ReconstructedPoseValuesBuffer.GetData(), alignof(VectorRegister4Float)));

				for (int32 AssetIdx : SelectableAssetIdx)
				{
					const FSearchIndexAsset& SearchIndexAsset = SearchIndex.Assets[AssetIdx];
					const int32 FirstPoseIdx = SearchIndexAsset.GetFirstPoseIdx();
					const int32 LastPoseIdx = FirstPoseIdx + SearchIndexAsset.GetNumPoses();
					for (int32 PoseIdx = FirstPoseIdx; PoseIdx < LastPoseIdx; ++PoseIdx)
					{
						EvaluatePoseKernel<true, false>(Result, SearchIndex, QueryValues, ReconstructedPoseValuesBuffer, PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, ++ResultIndex);
					}
				}
			}
			// is the data padded at 16 bytes (and 16 bytes aligned by construction)?
			else if (NumDimensions % 4 == 0)
			{
				for (int32 AssetIdx : SelectableAssetIdx)
				{
					const FSearchIndexAsset& SearchIndexAsset = SearchIndex.Assets[AssetIdx];
					const int32 FirstPoseIdx = SearchIndexAsset.GetFirstPoseIdx();
					const int32 LastPoseIdx = FirstPoseIdx + SearchIndexAsset.GetNumPoses();
					for (int32 PoseIdx = FirstPoseIdx; PoseIdx < LastPoseIdx; ++PoseIdx)
					{
						EvaluatePoseKernel<false, true>(Result, SearchIndex, QueryValues, TArrayView<float>(), PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, ++ResultIndex);
					}
				}
			}
			// no reconstruction, but data is not 16 bytes padded
			else
			{
				for (int32 AssetIdx : SelectableAssetIdx)
				{
					const FSearchIndexAsset& SearchIndexAsset = SearchIndex.Assets[AssetIdx];
					const int32 FirstPoseIdx = SearchIndexAsset.GetFirstPoseIdx();
					const int32 LastPoseIdx = FirstPoseIdx + SearchIndexAsset.GetNumPoses();
					for (int32 PoseIdx = FirstPoseIdx; PoseIdx < LastPoseIdx; ++PoseIdx)
					{
						EvaluatePoseKernel<false, false>(Result, SearchIndex, QueryValues, TArrayView<float>(), PoseIdx, SearchFilters, SearchContext, this, bUpdateBestCandidates, ++ResultIndex);
					}
				}
			}
		}
	}
	else
	{
#if UE_POSE_SEARCH_TRACE_ENABLED
		// calling just for reporting non selectable poses
		TConstArrayView<float> QueryValues = SearchContext.GetOrBuildQuery(Schema);
		FNonSelectableIdx NonSelectableIdx;
		PopulateNonSelectableIdx(NonSelectableIdx, SearchContext, this, QueryValues);
#endif // UE_POSE_SEARCH_TRACE_ENABLED
	}

	// finalizing Result properties
	if (Result.PoseIdx != INDEX_NONE)
	{
		Result.AssetTime = GetNormalizedAssetTime(Result.PoseIdx);
		Result.Database = this;
	}

#if UE_POSE_SEARCH_TRACE_ENABLED
	Result.BruteForcePoseCost = Result.PoseCost; 
#endif // UE_POSE_SEARCH_TRACE_ENABLED

	return Result;
}
C++
PoseSearchDatabase.cpp
template<bool bReconstructPoseValues, bool bAlignedAndPadded>
static inline void EvaluatePoseKernel(UE::PoseSearch::FSearchResult& Result, const UE::PoseSearch::FSearchIndex& SearchIndex, TConstArrayView<float> QueryValues, TArrayView<float> ReconstructedPoseValuesBuffer,
	int32 PoseIdx, const UE::PoseSearch::FSearchFilters& SearchFilters, UE::PoseSearch::FSearchContext& SearchContext, const UPoseSearchDatabase* Database, bool bUpdateBestCandidates, int32 ResultIndex = -1)
{
	using namespace UE::PoseSearch;

	const TConstArrayView<float> PoseValues = bReconstructPoseValues ? SearchIndex.GetReconstructedPoseValues(PoseIdx, ReconstructedPoseValuesBuffer) : SearchIndex.GetPoseValues(PoseIdx);

	if (SearchFilters.AreFiltersValid(SearchIndex, PoseValues, QueryValues, PoseIdx
#if UE_POSE_SEARCH_TRACE_ENABLED
		, SearchContext, Database
#endif // UE_POSE_SEARCH_TRACE_ENABLED
	))
	{
		const FPoseSearchCost PoseCost = bAlignedAndPadded ? SearchIndex.CompareAlignedPoses(PoseIdx, 0.f, PoseValues, QueryValues) : SearchIndex.ComparePoses(PoseIdx, 0.f, PoseValues, QueryValues);
		if (PoseCost < Result.PoseCost)
		{
			Result.PoseCost = PoseCost;
			Result.PoseIdx = PoseIdx;

#if UE_POSE_SEARCH_TRACE_ENABLED
			if (bUpdateBestCandidates)
			{
				Result.BestPosePos = ResultIndex;
			}
#endif // UE_POSE_SEARCH_TRACE_ENABLED
		}

#if UE_POSE_SEARCH_TRACE_ENABLED
		if (bUpdateBestCandidates)
		{
			SearchContext.Track(Database, PoseIdx, EPoseCandidateFlags::Valid_Pose, PoseCost);
		}
#endif // UE_POSE_SEARCH_TRACE_ENABLED
	}
}
C++

The code is complex, but to put it simply the process can be summarized as follows. First, it determines which assets and poses are selectable and which can never be chosen, collecting the non-selectable indices so they can be filtered out. Next, it queries the current character's state and builds the query feature vector according to the channel definitions in the Schema. Then it traverses the candidate animation frames in the FSearchIndex and keeps the one with the lowest cost.
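Putting it all together, here is a heavily simplified sketch of the whole selection step in plain C++. All names here are illustrative placeholders, not the real Pose Search API (only the term ContinuingPoseCostBias is borrowed from the plugin): the continuing pose is evaluated first with its usually negative cost bias, then the brute-force search only jumps when it finds a strictly lower cost.

#include <cfloat>
#include <cstddef>
#include <vector>

// Heavily simplified selection logic, in the spirit of UpdateMotionMatchingState + SearchBruteForce.
struct FSimpleResult { int PoseIdx = -1; float Cost = FLT_MAX; };

FSimpleResult SelectPose(const std::vector<float>& Values, std::size_t Cardinality,
                         const std::vector<float>& Query, int ContinuingPoseIdx, float ContinuingPoseCostBias)
{
	// Cost = squared distance to the query, plus an optional bias (negative bias = preferred).
	auto PoseCost = [&](int PoseIdx, float Bias)
	{
		float Cost = Bias;
		for (std::size_t i = 0; i < Cardinality; ++i)
		{
			const float Diff = Values[PoseIdx * Cardinality + i] - Query[i];
			Cost += Diff * Diff;
		}
		return Cost;
	};

	FSimpleResult Best;

	// 1) Evaluate the continuing pose first; its (usually negative) bias makes staying cheaper.
	if (ContinuingPoseIdx >= 0)
	{
		Best.PoseIdx = ContinuingPoseIdx;
		Best.Cost = PoseCost(ContinuingPoseIdx, ContinuingPoseCostBias);
	}

	// 2) Brute-force search over every indexed pose; jump only when a strictly lower cost is found.
	const int NumPoses = static_cast<int>(Values.size() / Cardinality);
	for (int PoseIdx = 0; PoseIdx < NumPoses; ++PoseIdx)
	{
		const float Cost = PoseCost(PoseIdx, 0.f);
		if (Cost < Best.Cost)
		{
			Best.PoseIdx = PoseIdx;
			Best.Cost = Cost;
		}
	}
	return Best;
}
C++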

At this point, we have completed motion matching for a single character. Returning up the call stack, the remaining tasks are simply updating variables and data, then letting the animation node play the correct animation; the loop continues on the next frame. This concludes the discussion of Motion Matching for now. You might feel that my explanation is too brief, and there are indeed many other systems in this plugin, such as how to use the mirror data table and how to choose the rules for constructing feature vectors in a project. I will address these questions in the next article, which will be written after Epic releases their Motion Matching sample project in June... if they release it on time.

By JiahaoLi

Hypergryph, Game Programmer (2023 - now); Shandong University, Bachelor (2019 - 2023)
