You have spent time carefully tracking your footage in After Effects. The 3D Camera Tracker has done its job — the tracking points are clean, the solve looks good, and the camera moves exactly as you wanted. Now comes the question that trips up so many artists: how do you get all of that tracking data out of After Effects and into Blender so you can actually place and render your 3D objects on top of the footage?
The good news is that the Gachoki Studios workflow handles this elegantly — using a free After Effects script to export composition data as JSON, and a free Blender add-on to import it. The entire pipeline, from export to a live camera moving in Blender, takes about five minutes once you know the steps.
This guide walks you through the complete process, step by step, for the latest versions of After Effects and Blender 4.2 and above. It also covers the import options in detail — including ones that most tutorials skip entirely — so you understand exactly what each setting does and how to use it for your specific project.
Stuck at a particular step? Drop your question in the comments at the bottom of this page. This post is actively maintained, and every question gets answered.
What You Need Before You Start
- Adobe After Effects (any recent version — 2022 and later confirmed working)
- Blender 4.2 or newer (download from blender.org)
- The After Effects export script: Export_Composition_Data_to_JSON.jsx — download from https://drive.google.com/file/d/1aMn_qbSPYFx4_4138Pnr4t58Ts1s7gUu/view?usp=sharing
- The Blender import add-on: import-comp-to-blender.py — download from https://drive.google.com/file/d/1okNMnVYFChzW0yIyrrhmXOzKqp-9Mnyw/view?usp=sharing
- Your footage file accessible to both applications
A quick note on Blender versions: the add-on uses Blender’s action slot system, whose API changed significantly in Blender 4.4. The add-on ships both a modern code path for Blender 4.4 and newer and a legacy path for earlier versions, so it handles the API differences, but Blender 4.2 is the version confirmed to work most reliably. Since 4.2 is a Long-Term Support (LTS) release, it is a safe choice for this workflow.
Step 1: Track Your Footage in After Effects
Open After Effects and import your footage. Drag it into the Project panel, then right-click and select New Composition from Footage to create a composition that exactly matches your clip’s dimensions and frame rate — this matters for the export later.
Once your composition is set up:
- In the Effects & Presets panel on the right side, search for 3D Camera Tracker
- Drag it onto your footage layer in the timeline
- After Effects will begin analyzing the footage automatically. You will see a banner in the viewer that says “Analyzing in background”
- In the Effect Controls panel, go to the Advanced section and make sure Detailed Analysis is enabled. This produces a more accurate solve, especially for footage with complex camera motion or limited feature points
Give the computer time to complete the analysis. Depending on your footage length and complexity, this can take anywhere from a few seconds to several minutes. The progress percentage is visible in the viewer.
Once the analysis is complete, After Effects will show coloured tracking markers scattered across your footage. The solve error value will also appear in the Effect Controls panel — a solve error below 1.0 pixel is generally considered good. If your error is higher, try enabling Detailed Analysis or reducing the footage length.
Pro tip: For best results, use footage that has clear, distinct features and contains some parallax — meaning the camera actually moves through 3D space rather than just panning or tilting in place. Footage with lots of motion blur or lens distortion will produce a less accurate solve.
Step 2: Create a Plane on the Spot Where You Will Place Your 3D Model
This step creates the reference plane in your After Effects composition that will become your placement anchor in Blender. Getting this right determines how accurately your 3D objects sit on or against surfaces in your footage.
- With the 3D Camera Tracker effect active on your footage layer, hover your cursor over the viewer — you will see colored dots appear as you hover near the tracking markers. These dots indicate which markers the tracker is grouping as a potential surface
- Hover over the area where you want to place your 3D model. Look for a set of dots that forms a relatively flat triangular plane indicator — this represents the tracked surface in that area
- Select at least three tracking points in the area you want to use. Hold Ctrl (Windows) / Cmd (Mac) to select additional points, or hold the left mouse button and drag to select multiple points at once
- Right-click on the selected markers and choose Create Solid and Camera
This creates two things in your composition timeline: a Track Solid (a coloured solid layer positioned in 3D space on the surface you selected) and a 3D Tracker Camera (a camera layer that carries the full tracked motion of the original footage camera). Both of these layers will be exported to Blender.
Why select three or more points? Three points define a plane. The more points you select from a consistent surface, the more stable your placement plane will be. If you notice the Track Solid drifting or rotating incorrectly when you scrub through the timeline, try re-creating it with different or additional tracking markers.
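The geometry behind the three-point rule can be sketched in a few lines of Python: three non-collinear points give two edge vectors, and their cross product is the plane’s normal. The coordinates below are made up for illustration, not real solve data.

```python
# Why three points define a plane: two edge vectors between the points,
# crossed, give the plane's normal. This mirrors the geometry the tracker
# relies on when it builds a surface from your selected markers.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def plane_normal(p1, p2, p3):
    """Unnormalized normal of the plane through three points."""
    return cross(sub(p2, p1), sub(p3, p1))

# Three markers lying on a flat ground plane (z = 0):
print(plane_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0, 0, 1), facing up
```

A fourth marker that sits off this plane tilts the fitted surface, which is why adding points from the same physical surface stabilizes the Track Solid while mixing points from different surfaces makes it drift.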
Step 3: Install the After Effects Export Script
The script — Export_Composition_Data_to_JSON.jsx — reads the selected layers from your composition and writes their animation data (position, rotation, scale, camera zoom, and more) into a JSON file that the Blender add-on can read.
To install the script:
- Download Export_Composition_Data_to_JSON.jsx from the link at the top of this article
- In After Effects, go to File → Scripts → Install Script File…
- Navigate to the downloaded .jsx file and select it
- You will see a confirmation message. Click OK
- Save your project, then quit and restart After Effects fully — the script will only appear in the Scripts menu after a restart
After restarting, the script will be available under File → Scripts → Export Composition Data to JSON.
Note: The script was created by the developer adroitwhiz and is released under the CPAL (Common Public Attribution License). It handles cameras, lights, and solid layers, exporting their transforms, keyframes (including Bezier easing data), and properties with full frame-by-frame accuracy. You do not need to do anything special to prepare your composition for the export — just make sure the layers you want to export are selected.
Step 4: Export Camera Tracking Data from After Effects
With the script installed and After Effects restarted:
- In your composition timeline, select both the 3D Tracker Camera layer and the Track Solid layer. You can click one, then Shift-click the other, or Ctrl/Cmd-click to select both
- Go to File → Scripts → Export Composition Data to JSON
- A dialog box will appear with export settings:
Time Range — check this box to export only the frames within your composition’s work area. This is important if your footage is long but you only tracked a portion of it. Make sure your work area in After Effects covers exactly the frames you want to transfer to Blender.
Export Selected Layers Only — check this box. This ensures only your camera and tracking solid are exported, not every layer in your composition (which could include background footage, adjustment layers, and other elements you do not need in Blender).
Bake Transforms — leave this unchecked in most cases. Baking flattens all transform data into frame-by-frame matrix data, which loses the Bezier easing from your keyframes. Leave it unchecked to preserve the smooth animation curves from After Effects. Only enable it if you have complex parenting relationships or expressions driving transforms that the script cannot otherwise capture accurately.
- Click Browse and navigate to a folder where you want to save the JSON file. Give it a descriptive name (for example, camera_tracking_data.json) and click Save
- Click Export
The script will process your selected layers and write the JSON file to the location you chose. The file is plain text — if you open it, you will see the composition settings (width, height, frame rate, work area), followed by the layer data for your camera and tracking solid, including all keyframes with their timing and easing information.
If the export appears to save as a .txt file instead of .json: Open the file in a text editor and use Save As to resave it with the .json extension. The content is JSON regardless of the extension — the Blender add-on reads the content, not the file extension.
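If you want to sanity-check an export before moving to Blender, the file parses with any JSON library. The field names in this snippet are simplified stand-ins, not the script’s exact schema; open your own export in a text editor to see the real structure.

```python
# Minimal sanity check of an exported file. The structure shown here is a
# simplified stand-in for the script's real schema -- inspect your own
# export to see the actual key names it writes.
import json

# Stand-in for the contents of camera_tracking_data.json:
exported = """
{
  "width": 1920,
  "height": 1080,
  "frameRate": 25,
  "layers": [
    {"name": "3D Tracker Camera", "type": "camera"},
    {"name": "Track Solid 1", "type": "solid"}
  ]
}
"""

data = json.loads(exported)  # for a real file: json.load(open(path))
print(data["width"], data["height"], data["frameRate"])  # 1920 1080 25
print([layer["name"] for layer in data["layers"]])
```

If `json.loads` raises an error on your exported file, the export is truncated or not valid JSON, which is worth knowing before you blame the Blender add-on.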
Step 5: Install the Blender Import Add-On
The Blender add-on — import-comp-to-blender.py (also named __init__.py in its packaged form) — reads the JSON file exported from After Effects and reconstructs the camera and objects in Blender with matching animation keyframes.
To install it:
- Download the add-on file from the link at the top of this article
- Open Blender
- Go to Edit → Preferences → Add-ons
- Click the dropdown arrow at the top right of the Add-ons panel and choose Install from Disk… (in Blender 4.1 and earlier, this is an Install… button in the same spot)
- Navigate to the downloaded .py file and select it
- Confirm the installation in the file browser
- The add-on will appear in the list. Check the checkbox next to it to activate it
- Click Save Preferences so it remains active in future Blender sessions
Once installed and activated, you will find the new import option under File → Import → After Effects composition data, converted (.json).
Blender 4.2 compatibility note: The add-on uses Blender’s action slot API, which introduced significant changes between Blender 4.2 and 4.4. The version of the add-on available on the Gachoki Studios page is the most current build — always use the latest version to ensure compatibility with your Blender version.
Step 6: Import the Camera Tracking Data into Blender
Before importing, prepare your Blender scene:
- Delete all default objects: select everything with A, then press X and confirm Delete. You want a completely empty scene so that only the imported camera and plane come in, and so the default camera cannot conflict with the imported one
Now import:
- Go to File → Import → After Effects composition data, converted (.json)
- Navigate to the JSON file you exported from After Effects and select it
- Before clicking Import, look at the import options panel on the left side of the file browser (expand it if it is not visible). These settings are important:
Understanding the Import Options
Scale Factor — default is 0.01. This maps one After Effects pixel to one centimeter in Blender. After Effects works in pixel coordinates, which can be enormous numbers in Blender’s unit system. The 0.01 scale factor keeps things manageable. If your imported scene appears extremely tiny or enormous, adjust this value. For example, use 0.001 to make things ten times smaller, or 0.1 to make them ten times larger.
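The scale factor is a straight multiplication of every coordinate. A minimal model of that mapping, with made-up pixel coordinates (this illustrates the idea, not the add-on’s actual code):

```python
# Conceptual model of the Scale Factor option: every After Effects pixel
# coordinate is multiplied by the factor to get Blender units.

def ae_to_blender_units(position_px, scale_factor=0.01):
    return tuple(axis * scale_factor for axis in position_px)

# A camera 2666 px back from the center of a 1920x1080 comp lands at a
# manageable distance with the default 0.01 factor, roughly (9.6, 5.4, -26.66):
print(ae_to_blender_units((960.0, 540.0, -2666.0)))
```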
Handle FPS (Frame Rate) — this controls what happens when your After Effects composition frame rate differs from Blender’s scene frame rate. There are three options:
- Preserve Frame Numbers: Keeps keyframes at the same frame numbers without adjusting Blender’s frame rate. Use this if you want to manually set Blender’s frame rate and accept that timing may shift slightly.
- Use Comp Frame Rate: Automatically sets Blender’s scene frame rate to match the exported composition. This is the recommended option for most users — it ensures everything plays back at exactly the correct speed without any manual adjustments.
- Remap Frame Times: If Blender’s frame rate is already set differently, this remaps the keyframe timing to preserve the real-time speed of the animation. Useful if you need Blender at a specific frame rate (for example, 30fps) but your footage was tracked at 25fps.
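The difference between these options comes down to one line of arithmetic. Here is a sketch of the remapping the third option performs, with illustrative numbers (a conceptual model, not the add-on’s code):

```python
# "Remap Frame Times" in one line: rescale each keyframe's frame number so
# it stays at the same point in real time when the scene fps differs from
# the comp fps.

def remap_frame(frame, comp_fps, scene_fps):
    """Keep a keyframe at the same real-time position."""
    return frame * scene_fps / comp_fps

# A keyframe on frame 125 of a 25 fps comp marks the 5-second point.
# In a 30 fps Blender scene, the 5-second point is frame 150:
print(remap_frame(125, comp_fps=25, scene_fps=30))  # 150.0

# "Preserve Frame Numbers" would leave it on frame 125, which a 30 fps
# scene reaches at about 4.17 seconds -- visibly early against the footage.
```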
Comp Center to Origin — when enabled, shifts everything so the composition’s center aligns with Blender’s world origin at (0, 0, 0). After Effects places its coordinate origin at the top-left corner of the frame rather than the center, so without this option the imported layout sits offset from Blender’s origin. Leave this off unless you specifically need the composition centered.
Use Comp Resolution — automatically sets Blender’s render resolution (X and Y) to match the imported composition’s dimensions. Enable this — it ensures your renders match the original footage size without manual adjustment. If your footage was vertical (for example, 1080×1920 for Instagram Stories), this will correctly set Blender’s resolution to match.
Create New Collection — places all imported objects into a new collection named after your After Effects composition. Useful for organization if you are importing into an existing scene. Leave off for a fresh scene.
Adjust Frame Start/End — automatically sets Blender’s timeline start and end frames to match the composition’s work area. Enable this to save having to manually adjust the timeline range.
Cameras to Markers — if your After Effects composition had multiple camera layers switching between them at different points, this option creates timeline markers in Blender bound to each camera at its in/out point. Leave off for single-camera setups (which is the standard camera tracking scenario).
- Click Import AE Comp
Blender will read the JSON file and create the camera and tracking solid as objects in your scene. If everything went correctly, you will see a camera and a flat plane object in the viewport. Press Numpad 0 to enter the camera view and you should see the camera positioned as it was in After Effects.
Step 7: Make Adjustments in Blender
Match the Frame Rate
Even with the Use Comp Frame Rate option enabled during import, it is worth double-checking. Go to Output Properties and confirm that the frame rate matches your footage. Common footage frame rates are 23.976, 24, 25, 29.97, and 30 fps. A mismatch here means the camera animation will play at the wrong speed relative to the footage.
Fix the Frame Offset
This is the most commonly missed adjustment. After Effects timelines start at frame 0, but Blender timelines start at frame 1. The imported keyframes keep their After Effects frame numbers, so the first keyframe lands on frame 0 while Blender’s playback begins at frame 1. If you do not fix this, the camera and footage will be out of sync by exactly one frame.
To fix it:
- In the Timeline or Dope Sheet, select all keyframes of the camera (click on the camera object, go to the Dope Sheet, select all with A)
- Press G to grab, then type 1 and press Enter to shift all keyframes one frame forward
- The first keyframe should now sit on frame 1
If the Adjust Frame Start/End import option was enabled, the work area is already accounted for. But the frame-zero-to-frame-one shift still needs to be applied manually.
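The shift itself is trivial arithmetic, modeled here on plain frame numbers; in Blender it is the G, 1, Enter move described above applied to the selected keyframes.

```python
# The one-frame offset as arithmetic: After Effects counts from frame 0,
# Blender plays from frame 1, so every imported keyframe moves forward by
# one. A model of the manual fix, not the add-on's code.

def shift_keyframes(frames, offset=1):
    return [f + offset for f in frames]

# Keyframes exported from a comp whose timeline starts at frame 0:
print(shift_keyframes([0, 1, 2, 3]))  # [1, 2, 3, 4]
```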
Add the Footage as a Camera Background
To see your footage playing behind your 3D objects while working in Blender:
- Select the camera in the viewport
- Go to Object Data Properties (the camera icon in the Properties panel)
- Enable Background Images
- Click Add Image and set the source to Movie Clip
- Click Open and navigate to your original footage file
- Increase the Opacity slider to make the footage more visible (the default is quite dim)
Now when you press Numpad 0 to enter camera view and play the timeline, you will see the footage playing behind your scene. The imported plane should stay locked to the same surface in the footage — if the tracking was accurate, it will not drift.
Set Blender’s Render Resolution
If you did not enable Use Comp Resolution during import, set the resolution manually now. Go to Output Properties and set X and Y to match your footage:
- Standard 1080p horizontal: 1920 × 1080
- 4K horizontal: 3840 × 2160
- Vertical (Instagram Stories, TikTok): 1080 × 1920 — note that you set X to 1080 and Y to 1920
- Vertical 4K: 2160 × 3840
For non-square pixel footage, also check the Aspect X and Y values under the Format section and match them to your footage’s pixel aspect ratio.
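The displayed frame shape is the resolution scaled by the pixel aspect on each axis. A quick model with a few common formats (illustrative numbers only):

```python
# How resolution and pixel aspect combine into the displayed frame shape:
# display aspect = (X * Aspect X) / (Y * Aspect Y). Square-pixel footage
# reduces to X / Y.

def display_aspect(res_x, res_y, aspect_x=1.0, aspect_y=1.0):
    return (res_x * aspect_x) / (res_y * aspect_y)

print(round(display_aspect(1920, 1080), 4))  # 1.7778, i.e. 16:9
print(round(display_aspect(1080, 1920), 4))  # 0.5625, a vertical frame
# 1440x1080 footage with 4/3 (non-square) pixels also displays as 16:9:
print(round(display_aspect(1440, 1080, aspect_x=4 / 3), 4))  # 1.7778
```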
Step 8: Place Your 3D Model and Composite the Result
With the camera moving correctly and the footage visible in the background, you are ready to add your 3D content:
- Snap your 3D cursor to the tracking plane: select the imported Track Solid plane, then press Shift+S → Cursor to Selected. This places the 3D cursor exactly where your tracking plane is in 3D space
- Add your 3D model: press Shift+A → Mesh (or whichever type of object you need) — it will appear at the cursor position, which is on the tracking plane
- Scale and position the model as needed. Because the camera has the correct real-world-derived motion from the tracker, your model will appear to stick to the surface as the camera moves — as long as you place it correctly in 3D space relative to the plane
You can delete the Track Solid plane once your model is positioned correctly. It was only needed as a spatial reference. Select it and press X to delete.
Set Up the Compositor for the Final Render
To composite your Blender render over the real footage:
- In Blender’s header, switch to the Compositing workspace (or enable Use Nodes in the Compositor)
- Add the following nodes with Shift+A:
- Input → Render Layers — your Blender 3D render
- Input → Movie Clip — your footage file (select your clip from the dropdown)
- Color → Alpha Over — combines the two
- Output → Composite — final output
- Optionally: Output → Viewer — lets you preview the composite in real time
- Connect them as follows:
- Movie Clip Image output → Alpha Over top Image input
- Render Layers Image output → Alpha Over bottom Image input
- Alpha Over Image output → Composite Image input and Viewer Image input
This places your footage as the background with your 3D render composited on top. Press F12 to render and check the result.
For a more realistic result, consider adding a Shadow Catcher — an invisible plane that receives shadows from your 3D model, which makes the model appear to be casting a shadow on the real surface. This single step dramatically improves how grounded and believable your composite looks.
Troubleshooting Common Issues
Camera is not animating / no keyframes after import Make sure both the 3D Tracker Camera and the Track Solid were selected in After Effects before running the export script. The most common cause of missing animation is accidentally running the script with only one layer selected, or with no layers selected at all. Also confirm that “Export Selected Layers Only” was checked in the export dialog.
The camera goes below the grid / objects appear underground This is expected. After Effects and Blender use different coordinate systems, and the imported camera inherits its position and orientation from the After Effects tracking coordinates, which often places the solve below Blender’s world grid. Arrange your 3D models relative to the imported plane and camera path rather than relative to Blender’s grid.
Frame rate mismatch causing the camera to drift out of sync Go to Output Properties and set the frame rate to exactly match your footage. For 29.97fps footage, Blender has an NTSC preset (29.97). For 23.976fps footage, use the 23.98 preset. Do not round 29.97 to 30 — even this small difference will cause noticeable drift over a 10-second clip.
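The drift from rounding is easy to quantify: the slip in frames is simply elapsed time multiplied by the fps difference. Illustrative arithmetic:

```python
# Why 29.97 vs 30 matters: the sync error grows linearly with time, at a
# rate of (scene_fps - footage_fps) frames per second of playback.

def drift_frames(seconds, footage_fps, scene_fps):
    """Frames of sync error accumulated after `seconds` of playback."""
    return seconds * (scene_fps - footage_fps)

# Rounding 29.97 up to 30 costs about a third of a frame over a
# 10-second clip -- enough to see contact points start to slide:
print(round(drift_frames(10, 29.97, 30), 2))  # 0.3
```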
Footage imported as vertical but showing horizontal in Blender Go to Output Properties → Format and set X to 1080 and Y to 1920 (for 1080×1920 vertical footage). Also check the Aspect X and Y values. If the footage still appears letterboxed, the issue is with the camera background image display — select the camera, go to Object Data Properties → Background Images, and check the display settings there.
Export file saves as .txt instead of .json The content of the file is still valid JSON. Open it in any text editor (Notepad, TextEdit, VS Code) and use Save As to resave it with a .json extension before importing into Blender.
Error message on import in Blender 4.4 or 4.5+ There was a breaking change in Blender’s animation action API between versions 4.2 and 4.4. The add-on includes both a modern slot-based path (for Blender 4.4+) and a legacy path (for earlier versions), but the two code paths must match your Blender version. Download the latest version of the add-on from the Gachoki Studios page — updates to support newer Blender versions are released as the API changes are confirmed.
The camera imports correctly but the tracking feels laggy or jittery compared to After Effects This can happen if there is a frame rate mismatch, or if the keyframe interpolation is not matching between the two applications. The export script preserves Bezier easing data from After Effects keyframes, but Blender’s interpolation engine may interpret the handles slightly differently. In the Dope Sheet, select all the camera keyframes, press T, and try switching from Bezier to Linear interpolation to see if the jitter improves. Alternatively, enable Bake Transforms in the After Effects export dialog to export a frame-by-frame matrix transform instead, which removes interpolation as a variable entirely.
What the Script and Add-On Actually Export and Import
For those who want to understand what is happening under the hood:
The After Effects script (Export_Composition_Data_to_JSON.jsx) reads the selected layers and exports:
- Composition metadata: width, height, pixel aspect ratio, frame rate, work area start and end times
- Per-layer data: name, type (camera, solid/null, light), in/out points, parent relationships, enabled state
- Transform properties: position, X/Y/Z rotation, orientation — with full keyframe data including Bezier easing (speed and influence values for handles)
- For cameras specifically: the zoom property (which controls focal length in After Effects) with keyframe data
- For cameras and lights: the Point of Interest property when applicable
The Blender add-on (__init__.py) takes this JSON and:
- Creates a Blender camera object and sets its sensor fit to Vertical (matching After Effects’ composition-height-based field of view calculation). The focal length is derived from the zoom value using the formula: lens = zoom * (24 / comp_height) — this correctly maps After Effects’ zoom values to Blender’s millimeter lens values based on the default 24mm sensor height
- Creates empty objects or mesh planes for solid/null layers
- Remaps After Effects’ left-handed coordinate system to Blender’s right-handed system by swizzling Y and Z axes and negating the Y scale
- Handles parent-child relationships between layers
- For cameras with a Point of Interest: creates a special constraint rig — a parent empty with a Track To constraint targeting the Point of Interest location — which accurately reproduces how After Effects cameras track their focus point
- Imports all keyframes as F-curves, preserving Bezier easing data from After Effects
- Sets scene resolution, pixel aspect ratio, and frame range if those import options are enabled
Understanding this helps when things go wrong — for example, knowing that the coordinate system is swizzled explains why objects imported from After Effects do not align to Blender’s world grid.
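Two of the conversions above can be modeled in a few lines. The zoom-to-lens formula is the one quoted above; the axis swizzle is an illustrative reading of “swap Y and Z, negate” (After Effects is Y-down, Blender is Z-up), not a verbatim copy of the add-on’s math, so check the add-on source for the authoritative mapping.

```python
# Zoom-to-lens mapping and an illustrative axis swizzle. The lens formula
# matches the one described above; the exact swizzle in the add-on may
# differ in sign conventions, so treat it as a sketch.

def ae_zoom_to_lens_mm(zoom_px, comp_height, sensor_height=24.0):
    """AE zoom (pixels) -> Blender focal length (mm), vertical sensor fit."""
    return zoom_px * sensor_height / comp_height

def ae_to_blender_axes(x, y, z):
    """One plausible Y-down -> Z-up swizzle: AE's depth axis becomes
    Blender's forward axis, and AE's downward Y becomes negative Z."""
    return (x, z, -y)

# An AE camera with zoom 2074 px in a 1080-high comp is about a 46.1 mm lens:
print(round(ae_zoom_to_lens_mm(2074, 1080), 1))  # 46.1
print(ae_to_blender_axes(10, 100, 50))           # (10, 50, -100)
```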
Going Further: What to Do After Compositing
Once your 3D object is tracked and compositing is working, here are the natural next steps for your project:
Add realistic shadows using a Shadow Catcher plane. This is a Blender-native feature that makes an invisible object receive and cast shadows without appearing in the render itself. Combined with your Alpha Over node in the compositor, it makes your 3D objects look like they are genuinely resting on the real-world surface. Check out our guide: Blender Shadow Catcher
Match lighting to your footage. The 3D camera tracker gives you the camera motion, but lighting your 3D model to match the real-world lighting in the footage is its own art. Using an HDRI that approximates the lighting conditions in your footage, or manually placing lights based on shadows visible in the footage, will make your composite significantly more convincing.
Add motion blur to your 3D objects. If your footage has motion blur (most real footage does), your 3D render should have it too. Enable motion blur in Render Properties to ensure your objects blur naturally as the camera or objects move.
Use the compositor for color grading. After you composite your 3D render over the footage, add a Color Balance or Curves node to match the color tone and contrast of your render to the footage. Footage has lens characteristics, color science, and grain that a clean 3D render does not — closing that gap in the compositor makes a huge difference.
Related Tutorials to Build Your VFX Workflow
- How to Eliminate Noise, Grain, and Fireflies From Blender Renders
- How to Speed Up Blender Cycles Renders — 40+ Tips
- Blender Shadow Catcher — Complete Guide
- How to Use Motion Blur in Blender
- How to Render Transparent Animation Videos in Blender
- An Easy and Quick Way to Post-Process Your Work in Blender
- Blender Viewport Render — Fast Preview Techniques
- How to Bake Animations and Simulations to Keyframes in Blender
- How to Render Transparent PNG Images in Blender’s Cycles and Eevee
- How to Set Up a Video or Image Sequence Texture in Blender
- Best Blender Addons for Animation
- AI-Powered Motion Capture — What It Can Do for Your Workflow
- Animating Giant Creatures That Feel Real
- All About Visual Effects (VFX)
Frequently Asked Questions
Why use After Effects for tracking instead of Blender’s built-in tracker? After Effects’ 3D Camera Tracker is one of the most refined and accessible camera solvers available. It handles a wide variety of footage reliably, requires very little manual setup, and produces clean solves quickly. Blender’s built-in motion tracker is capable but requires more manual effort to get comparable results. For many production workflows, it makes sense to track in the tool that does it best, then composite and render in Blender.
Can I transfer other things besides the tracked camera and plane? Yes. The script and add-on handle all layer types that After Effects exports, including null objects, solid layers, lights, and 3D layers. This means you can also use this workflow to transfer complex After Effects motion graphics rigs — animated position, rotation, scale, and opacity — into Blender as animated objects. It is not limited to camera tracking.
Does this work for vertical footage (Instagram, TikTok)? Yes. Enable Use Comp Resolution during import, and Blender’s output resolution will automatically be set to match the composition. For example, a 1080×1920 composition in After Effects will correctly set Blender to 1080×1920. You may also need to adjust the Aspect X and Y ratio in Output Properties if the footage uses non-square pixels.
The camera animates but my 3D objects drift away from the surface over time — what is happening? This is usually a frame rate mismatch. Even a difference of 29.97 vs 30fps will cause gradual drift. Go to Output Properties and set the exact frame rate, including using the correct presets (NTSC for 29.97, Film for 23.976). Also check that you have applied the one-frame offset fix described in Step 7.
Can I use this for 2D tracking (point tracking / planar tracking)? The workflow is designed for 3D Camera Tracker data, which produces a 3D solved camera. Standard 2D point tracking in After Effects does not produce 3D camera data, so it cannot be transferred in the same way. However, if you use a 3D camera solve even for scenes that feel like 2D (like a flat wall), the workflow will still produce a usable result.
Will this work for a camera that does not move? Yes. A locked-off (static) camera still exports correctly — it will simply have no animation keyframes on position or rotation. This is actually the simplest case and works reliably.
Keep Building and Share What You Make
This workflow bridges two of the most powerful tools in visual effects production, and once you have it down, it opens up a huge range of creative possibilities — product visualization on real footage, creature placement, architectural previsualization, and much more.
If this guide helped you get your tracking data into Blender, share it with another artist who is working through the same problem. And if you hit a step that this article did not cover well, drop your question in the comments below — it might become the next addition to this guide.
Subscribe to the Gachoki Studios blog to get new tutorials as soon as they are published, and check back on this page regularly — it is updated as the tools and workflows evolve.
Comments
Hey, thanks for this add-on! I would like to know: if my AE camera is in vertical format, like Instagram Stories, when I import into Blender it defaults to 1920×1080. How can I configure that?
Hi Juan. After importing camera data into Blender, go to ‘Output Properties’ and set the X and Y resolution to match your camera’s resolution.
I shot my footage in vertical 4K. I did all the steps and it went well, except my footage in Blender is still horizontal until I set my X & Y to 1080×1920. Kindly help, Gachoki
Under ‘Output Properties’ tab, go to ‘Format’. Adjust the x and y values under ‘Aspect’ until you get desired results.
thanks this is really good
Thank you. Glad we could help.
When I export the After Effects data, it comes out as a text document, not JSON format, so I couldn’t import the data into Blender
Hi Christopher
Open the file you exported with Notepad++. In Notepad++, go to File > Save As. In the popup window, under ‘Save as type:’, select JSON file. You can now import the JSON file into Blender.
Thank you so much !!!
One more question: after I import the camera data, the camera goes under the grid. How do I align the imported camera orientation to the grid?
The camera inherits its coordinates from the After Effects tracking data. Try arranging your assets along the camera path.
Thanks a lot !!! For the quick response
Any time.
When trying to import the file into Blender, the actual export isn’t appearing. The timestamp on the folder it should be in updates as if something is in there, but when I click on the folder it’s empty.
This could be something on my end.
Could it be because I’m on a Mac?
I haven’t given it a try on a Mac. It works on Windows just fine. Try on Windows to see if you get the same results.
Hi! Great plugins! However, I have encountered an issue after weeks of debugging, and I still can’t solve it. I’m hoping to get some help.
After importing JSON from After Effects into Blender, all the transformations are mostly accurate, but the position is shaking around the tracking spot. This issue seems to be more noticeable when scaling up.
I’m wondering if it’s because I’m using the AE plugin Geolayers, and the scale and position values are quite huge (around 200000 for position and 75000 for scale). I’m not sure if these values exceed the maximum allowable limit.
Is it possible to fix it?
I am not familiar with the Geolayers plugin. However, the principle used in our setup is simply transferring the camera, with all its animation, from After Effects into Blender.
After using the camera tracker in AE to set up the camera data, I use your script and it brings everything into Blender, which is great, but the camera animations are empty; they do not move at all.
I just gave it a try in Blender 3.5.1 and works fine.
My dialogue box says “Bake Transforms” instead of “Comp camera is centered”. Maybe I installed it wrong? I will try downloading again.
Nothing I have tried has worked. Also, the AE menu says Export Composition Data instead of Export Camera Data. It brings everything in, but never with any keyframes.
Okay. I will do further tests and advise.
I made a whole new scene and the camera came in to Blender with keyframes. Not sure why my other scene does not work, but if I find out I will let you know.
Thank you for the update. I’m glad it finally worked.
The dialogue box doesn’t have the “comp camera is centered” checkbox. I wish there was because for the moment I can’t tell blender where the center is 🙁
We’ll check it out.
Does it work on Blender 3.5? It doesn’t seem to, tbh
Yeah, it works with Blender 3.5
Just checked on Blender 3.6 and it worked. Everybody should look at the right frames! Say you’re exporting an animated camera that in After Effects starts at frame 1300 and ends at frame 1800: you should check in Blender at THOSE FRAMES.
I feel like there’s a step missing here as to how I track the footage’s camera motion in AE, and what I do with that / the camera object I create.
Thank you Gachoki!!!!
You’re welcome Aldo!
But that’s exporting the composition, not the camera, from After Effects
Sorry, I understand your question now
Why can’t I export the files to Blender? It won’t let me select the file.
Have you installed the addon on Blender?
Yes, I can see it in Blender, but it won’t let me select. Is it because I’m on a Mac, or do I have to tweak a setting?
I am not sure if that is the reason. I have tested it again on windows and works as expected.
Hey, it doesn’t work in Blender 4.1? It just gives an error message
It works in Blender 4.1. Download and install the latest version of the addon from the link above.
Hi Mr. Gachoki, when I try to import the Adobe After Effects tracking data .json file, Blender does not respond and the operation fails. What should I do?
Hi,
We are working on an update for the addon.
I will let you know when it is ready
Where can I find the script to use in Aftereffects?
The link is in this article.
Does this work for Blender 4.5? It worked perfectly fine for a year while I was using Blender 4.2, but I switched to 4.5 and it does not work properly: it does not import the keyframes. I’ve tried some stuff but nothing worked. What should I do, except going back to 4.2? 😭
And one more issue: in AE the tracking looks good and smooth, but after importing into Blender the keyframes seem to be a bit off and more laggy. Is there any advice on how to solve this? (happens in Blender 4.2)
We are aware and working on a fix for this.
For now you can import the data using a Blender version that works then open and continue using the file in Blender 4.5
I’m gonna try this, thank you very much
I am encountering this error every time I import my data from After Effects 26 into Blender version 5. I would like to know whether this issue is caused by compatibility problems with the latest versions of both applications, or if I may be doing something incorrectly.
I would appreciate your assistance in resolving this issue. The installation process was clearly explained and completed successfully, so I am unsure where the problem may be occurring.
We are working on a solution for this