Hello,
As an enthusiastic soon-to-be junior developer who has been studying Swift for roughly 40 hours a week, I am thrilled by the opportunity to build a native iOS AVFoundation-powered feature into your Flutter project. I have gone through all of the starter project code, and I am confident I can be useful in implementing, testing, and debugging whatever functionality is needed.
An Overview of the Project:
The project centers on the following key features:
1. Video Recording: Capture and manage video data using an AVCaptureSession.
2. Frame Handling: Process frames as `CMSampleBuffer`s, attach timestamps, and dispatch work onto a dedicated `DispatchQueue` for thread-safe processing.
3. Video Packaging: Use an AVAssetWriter to encode frames into video files for efficient storage.
4. S3 Upload Integration: Asynchronously upload video files with the AWS SDK for Swift, with retry logic to handle unreliable, high-latency network conditions.
5. Debugging and Testing: Ensure every component runs correctly in the background alongside the Flutter app and handles edge cases appropriately.
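To show how I picture points 1–3 fitting together, here is a rough sketch of the capture-to-file pipeline. The class and property names are my own illustrative assumptions, not taken from the starter project, and real code would also need camera-permission handling and error reporting back to Flutter:

```swift
import AVFoundation

// Illustrative sketch: capture frames with AVCaptureSession, receive them as
// CMSampleBuffers on a serial DispatchQueue, and package them with AVAssetWriter.
final class VideoRecorder: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    // Serial queue: all frame callbacks arrive here, giving thread-safe access to the writer.
    private let sampleQueue = DispatchQueue(label: "video.sample.queue")
    private var assetWriter: AVAssetWriter?
    private var writerInput: AVAssetWriterInput?
    private var sessionStarted = false

    func startRecording(to url: URL) throws {
        session.beginConfiguration()

        // 1. Attach the default camera as input.
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else {
            session.commitConfiguration()
            throw NSError(domain: "VideoRecorder", code: 1)
        }
        session.addInput(input)

        // 2. Deliver CMSampleBuffers to our serial queue.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: sampleQueue)
        if session.canAddOutput(output) { session.addOutput(output) }
        session.commitConfiguration()

        // 3. Set up the AVAssetWriter that packages frames into an .mp4 file.
        let writer = try AVAssetWriter(outputURL: url, fileType: .mp4)
        let settings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
        ]
        let wInput = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
        wInput.expectsMediaDataInRealTime = true
        writer.add(wInput)
        writer.startWriting()
        assetWriter = writer
        writerInput = wInput

        session.startRunning()
    }

    // Called on `sampleQueue` for every captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let writer = assetWriter, let input = writerInput else { return }
        // Use the buffer's presentation timestamp to anchor the writer's timeline.
        let timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        if !sessionStarted {
            writer.startSession(atSourceTime: timestamp)
            sessionStarted = true
        }
        if input.isReadyForMoreMediaData {
            input.append(sampleBuffer)
        }
    }

    func stopRecording(completion: @escaping (URL?) -> Void) {
        session.stopRunning()
        sampleQueue.async { [weak self] in
            guard let self = self, let writer = self.assetWriter else {
                return completion(nil)
            }
            self.writerInput?.markAsFinished()
            writer.finishWriting {
                completion(writer.status == .completed ? writer.outputURL : nil)
            }
        }
    }
}
```

The finished file's URL handed to the completion handler would then be the natural input for the S3 upload step in point 4.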
I am confident I can deliver quality results in work involving AVFoundation, multi-threading, and AWS integration.
Best Regards,
Farhin B.