When filter_descr is "format=yuv420p,scale=640:480", most of the time the filt_frame data/linesize is not aligned the way I expect, and CheckDataYUV() throws an exception. A few times CheckDataYUV() passes, which is very strange.
If filter_descr is only "format=yuv420p", CheckDataYUV() always passes. But sending filt_frame to an H.264 encoder to make an MP4 file produces only I-frames, no B-frames or P-frames, and the video quality is very blurry.
The good news is that if you use the VideoFrameConverter class from FFmpeg.AutoGen.Example to convert filt_frame one more time, e.g. with this code:
var size = new Size(filt_frame->width, filt_frame->height);
using var vfc = new VideoFrameConverter(size, AVPixelFormat.AV_PIX_FMT_YUV420P, size, AVPixelFormat.AV_PIX_FMT_YUV420P);
var frame_final = vfc.Convert(*filt_frame);
then frame_final has no problems at all: the YUV data alignment validates, sending it to the H.264 encoder works normally, I-frames, B-frames, and P-frames are all produced, and the quality is not blurry. No matter how filter_descr is configured, the converted frame_final always has properly aligned YUV data and the H.264 encoder always works correctly.
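For what it's worth, I think I can see why frame_final is always fine. From my reading of the FFmpeg.AutoGen.Example sources (treat the exact calls below as my reconstruction, not a verbatim quote; dstPixelFormat and dstSize are placeholder names), the VideoFrameConverter constructor allocates one destination buffer and fills the plane pointers with align = 1, i.e. tightly packed:

var bufferSize = ffmpeg.av_image_get_buffer_size(dstPixelFormat, dstSize.Width, dstSize.Height, 1);
_convertedFrameBufferPtr = Marshal.AllocHGlobal(bufferSize); // System.Runtime.InteropServices
_dstData = new byte_ptrArray4();
_dstLinesize = new int_array4();
// align = 1: planes laid out back to back in one buffer, linesize == plane width
ffmpeg.av_image_fill_arrays(ref _dstData, ref _dstLinesize, (byte*)_convertedFrameBufferPtr, dstPixelFormat, dstSize.Width, dstSize.Height, 1);

If that is right, every converted frame lives in one contiguous buffer with linesize equal to the plane width, which is exactly the layout my CheckDataYUV() expects.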
I've been researching this for days and have searched Google, GPT, and Gemini. No matter how I configure the parameters of the format and scale filters (color space, color range, and so on), or even the parameters of the buffer filter, nothing helps. In the end I can only call vfc.Convert(*filt_frame) one extra time to the same resolution and pixel format; I cannot get the filter graph alone to directly handle the resolution conversion, pixel format conversion, flipping/mirroring, and the rest of the work.
So there must be something wrong with the data format of the filt_frame the filters output; it looks like a data-alignment problem. But I can't tell whether this is simply how ffmpeg's avfilter behaves (so you have to fix up the data alignment yourself after receiving filt_frame), whether something is missing in the FFmpeg.AutoGen wrapper, or whether I've just misconfigured something.
I studied the VideoFrameConverter class; in its Convert function there are two lines of critical code.
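(The snippet didn't paste properly; from memory the two lines are roughly the following, so treat this as my reconstruction rather than a verbatim quote:)

data.UpdateFrom(_dstData);         // byte_ptrArray4 -> byte_ptrArray8
linesize.UpdateFrom(_dstLinesize); // int_array4 -> int_array8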
As far as I can tell, this UpdateFrom function does not exist anywhere in the ffmpeg C source code; it must come from the FFmpeg.AutoGen wrapper.
AVFrame's data and linesize are byte_ptrArray8 and int_array8 respectively, but _dstData and _dstLinesize are byte_ptrArray4 and int_array4.
The corresponding UpdateFrom functions seem to do pointer-conversion work. I'm not familiar with C and C++, so I can't follow it. Is it because of .NET that a filt_frame obtained through FFmpeg.AutoGen's AVFilter bindings needs extra pointer fix-up work on data and linesize to keep the data aligned?
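My best guess (an assumption, not something I verified in the generated sources) is that UpdateFrom is just an array-widening copy and does no realignment at all, something like this hypothetical equivalent:

// Hypothetical equivalent of the generated UpdateFrom: copy the 4 plane
// pointers into the first 4 of AVFrame's 8 slots; nothing is realigned.
static byte_ptrArray8 WidenToArray8(byte_ptrArray4 src)
{
    var dst = new byte_ptrArray8();
    for (uint i = 0; i < 4; i++)
        dst[i] = src[i];
    return dst;
}

If so, the "fix" doesn't come from UpdateFrom at all; it comes from the tightly packed destination buffer the converter allocated in the first place.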
Here are my log records and test source code:
ffmpeg log:
Running in 64-bit mode.
FFmpeg version info: N-116716-g211c88b9d5-20240816
[swscaler @ 000002b8c2382100] deprecated pixel format used, make sure you did set range correctly
[in @ 000002b89ed51480] Changing video frame properties on the fly is not supported by all filters.
[in @ 000002b89ed51480] filter context - w: 2592 h: 1944 fmt: 13 csp: unknown range: unknown, incoming frame - w: 2592 h: 1944 fmt: 13 csp: bt470bg range: pc pts_time: 25080.955295
[swscaler @ 000002b8c23359c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002b8c23359c0] deprecated pixel format used, make sure you did set range correctly
[swscaler @ 000002b8c23359c0] deprecated pixel format used, make sure you did set range correctly
Y Size: 307200, U Size: 76800, V Size: 76800
Y Data pointer: 2992562114304, U Data pointer: 2992555990336, V Data pointer: 2992554987008
frame->data[1] - frame->data[0] - _ySize:-6431168,_ySize:307200
frame->data[2] - frame->data[1] - _uSize:-1080128,_uSize:76800
Y plane linesize: 640
U plane linesize: 320
V plane linesize: 320
Unhandled exception. System.Exception: Invalid
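One more observation about the log: data[1] is actually lower in memory than data[0], so the planes of filt_frame aren't even laid out in Y/U/V order, which CheckDataYUV() assumes. A repacking workaround I sketched (an assumption that this is equivalent to the extra VideoFrameConverter pass; I haven't confirmed it behaves identically) uses ffmpeg itself:

AVFrame* packed = ffmpeg.av_frame_alloc();
packed->format = filt_frame->format;
packed->width = filt_frame->width;
packed->height = filt_frame->height;
ffmpeg.av_frame_get_buffer(packed, 1);          // align = 1: minimal linesize padding, planes in order in one buffer
ffmpeg.av_frame_copy(packed, filt_frame);       // copies pixels row by row, honoring each source linesize
ffmpeg.av_frame_copy_props(packed, filt_frame); // carry over pts and other metadata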
My test code, program.cs:
using FFmpeg.AutoGen;
using FFmpeg.AutoGen.Example;
using FFmpeg.OSDepends;
using System;
using System.Drawing;
using System.IO;

namespace filtering_video
{
    internal unsafe class Program
    {
        //const string filter_descr = "scale=78:24,transpose=cclock";
        const string filter_descr = "format=yuv420p,scale=640:480";
        /* other way:
           scale=78:24 [scl]; [scl] transpose=cclock // assumes "[in]" and "[out]" to be input output pads respectively
         */

        static AVFormatContext* fmt_ctx;
        static AVCodecContext* dec_ctx;
        static AVFilterContext* buffersink_ctx;
        static AVFilterContext* buffersrc_ctx;
        static AVFilterGraph* filter_graph;
        static int video_stream_index = -1;
        static long last_pts = ffmpeg.AV_NOPTS_VALUE;

        static Size frameSize = new Size(640, 480);
        static long _linesizeY = frameSize.Width;
        static long _linesizeU = frameSize.Width / 2;
        static long _linesizeV = frameSize.Width / 2;
        static long _ySize = _linesizeY * frameSize.Height;
        static long _uSize = _linesizeU * frameSize.Height / 2;

        // Checks that the U plane starts at least ySize bytes after Y, and V at
        // least uSize bytes after U -- i.e. planes in Y/U/V order, non-overlapping.
        public static void CheckDataYUV(AVFrame* frame)
        {
            int ySize = frame->linesize[0] * frame->height;
            int uSize = frame->linesize[1] * (frame->height / 2);
            int vSize = frame->linesize[2] * (frame->height / 2);

            Console.WriteLine($"Y Size: {ySize}, U Size: {uSize}, V Size: {vSize}");
            Console.WriteLine($"Y Data pointer: {(long)frame->data[0]}, U Data pointer: {(long)frame->data[1]}, V Data pointer: {(long)frame->data[2]}");
            Console.WriteLine($"frame->data[1] - frame->data[0] - _ySize:{frame->data[1] - frame->data[0] - _ySize},_ySize:{_ySize}");
            Console.WriteLine($"frame->data[2] - frame->data[1] - _uSize:{frame->data[2] - frame->data[1] - _uSize},_uSize:{_uSize}");
            Console.WriteLine($"Y plane linesize: {frame->linesize[0]}");
            Console.WriteLine($"U plane linesize: {frame->linesize[1]}");
            Console.WriteLine($"V plane linesize: {frame->linesize[2]}");

            if (frame->data[1] - frame->data[0] < ySize || frame->data[2] - frame->data[1] < uSize)
            {
                throw new Exception("Invalid");
            }
        }
        static unsafe int Main(string[] args)
        {
            FFmpegBinariesHelper.RegisterFFmpegBinaries();

#if DEBUG
            Console.WriteLine("Current directory: " + Environment.CurrentDirectory);
            Console.WriteLine("Running in {0}-bit mode.", Environment.Is64BitProcess ? "64" : "32");
            Console.WriteLine($"FFmpeg version info: {ffmpeg.av_version_info()}");
            Console.WriteLine();
#endif

            int ret;
            AVPacket* packet;
            AVFrame* frame;
            AVFrame* filt_frame;

            frame = ffmpeg.av_frame_alloc();
            filt_frame = ffmpeg.av_frame_alloc();
            packet = ffmpeg.av_packet_alloc();
            if (frame == null || filt_frame == null || packet == null)
            {
                Console.WriteLine("Could not allocate frame or packet");
                return 1;
            }

            string dirPath = Path.GetDirectoryName(typeof(Program).Assembly.Location) ?? "";
            // https://file-examples.com/index.php/sample-video-files/sample-mp4-files/
            string inputfile = Path.Combine(dirPath, "..", "..", "..", "..", "Samples", "file_example_MP4_1920_18MG.mp4");

            if ((ret = open_input_file(inputfile)) < 0)
            {
                goto end;
            }

            if ((ret = init_filters(filter_descr)) < 0)
            {
                goto end;
            }

            while (true)
            {
                if ((ret = ffmpeg.av_read_frame(fmt_ctx, packet)) < 0)
                {
                    break;
                }

                if (packet->stream_index == video_stream_index)
                {
                    ret = ffmpeg.avcodec_send_packet(dec_ctx, packet);
                    if (ret < 0)
                    {
                        ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Error while sending a packet to the decoder");
                        break;
                    }

                    while (ret >= 0)
                    {
                        ret = ffmpeg.avcodec_receive_frame(dec_ctx, frame);
                        if (ret == ffmpeg.AVERROR(ffmpeg.EAGAIN) || ret == ffmpeg.AVERROR_EOF)
                        {
                            break;
                        }
                        else if (ret < 0)
                        {
                            ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Error while receiving a frame from the decoder");
                            goto end;
                        }

                        frame->pts = frame->best_effort_timestamp;

                        // Push the decoded frame into the filtergraph.
                        if (ffmpeg.av_buffersrc_add_frame_flags(buffersrc_ctx, frame, (int)AV_BUFFERSRC_FLAG.KEEP_REF) < 0)
                        {
                            ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Error while feeding the filtergraph");
                            break;
                        }

                        // Pull filtered frames from the filtergraph.
                        while (true)
                        {
                            ret = ffmpeg.av_buffersink_get_frame(buffersink_ctx, filt_frame);
                            if (ret == ffmpeg.AVERROR(ffmpeg.EAGAIN) || ret == ffmpeg.AVERROR_EOF)
                            {
                                break;
                            }
                            if (ret < 0)
                            {
                                goto end;
                            }

                            CheckDataYUV(filt_frame);
                            //display_name(filt_frame, buffersink_ctx->inputs[0]->time_base);
                            ffmpeg.av_frame_unref(filt_frame);
                        }

                        ffmpeg.av_frame_unref(frame);
                    }
                }

                ffmpeg.av_packet_unref(packet);
            }

        end:
            fixed (AVFilterGraph** pfilter = &filter_graph)
            {
                ffmpeg.avfilter_graph_free(pfilter);
            }

            fixed (AVCodecContext** pdec_ctx = &dec_ctx)
            {
                ffmpeg.avcodec_free_context(pdec_ctx);
            }

            fixed (AVFormatContext** pfmt_ctx = &fmt_ctx)
            {
                ffmpeg.avformat_close_input(pfmt_ctx);
            }

            ffmpeg.av_frame_free(&frame);
            ffmpeg.av_frame_free(&filt_frame);
            ffmpeg.av_packet_free(&packet);

            if (ret < 0 && ret != ffmpeg.AVERROR_EOF)
            {
                Console.WriteLine($"Error occurred: {FFmpegHelper.av_strerror(ret)}");
                return 1;
            }

            return 0;
        }
        static unsafe int open_input_file(string filename)
        {
            ffmpeg.avdevice_register_all();

            AVCodec* dec;
            int ret;

            AVInputFormat* pInputFormat = ffmpeg.av_find_input_format("dshow");
            // NOTE: the file path passed in is discarded here; the actual input
            // is a DirectShow camera device.
            filename = "video=XHWS-179-AF1";

            fixed (AVFormatContext** pfmt_ctx = &fmt_ctx)
            {
                if ((ret = ffmpeg.avformat_open_input(pfmt_ctx, filename, pInputFormat, null)) < 0)
                {
                    ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot open input file");
                    return ret;
                }
            }

            if ((ret = ffmpeg.avformat_find_stream_info(fmt_ctx, null)) < 0)
            {
                ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot find stream information");
                return ret;
            }

            ret = ffmpeg.av_find_best_stream(fmt_ctx, AVMediaType.AVMEDIA_TYPE_VIDEO, -1, -1, &dec, 0);
            if (ret < 0)
            {
                ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot find a video stream in the input file");
                return ret;
            }

            video_stream_index = ret;

            dec_ctx = ffmpeg.avcodec_alloc_context3(dec);
            if (dec_ctx == null)
            {
                return ffmpeg.AVERROR(ffmpeg.ENOMEM);
            }

            ffmpeg.avcodec_parameters_to_context(dec_ctx, fmt_ctx->streams[video_stream_index]->codecpar);

            if ((ret = ffmpeg.avcodec_open2(dec_ctx, dec, null)) < 0)
            {
                ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot open video decoder");
                return ret;
            }

            return ret;
        }
        static unsafe int init_filters(string filter_descr)
        {
            int ret = 0;

            AVFilter* buffersrc = ffmpeg.avfilter_get_by_name("buffer");
            AVFilter* buffersink = ffmpeg.avfilter_get_by_name("buffersink");
            AVFilterInOut* outputs = ffmpeg.avfilter_inout_alloc();
            AVFilterInOut* inputs = ffmpeg.avfilter_inout_alloc();
            AVRational time_base = fmt_ctx->streams[video_stream_index]->time_base;

            Span<int> pix_fmts = stackalloc int[]
            {
                (int)AVPixelFormat.AV_PIX_FMT_YUV420P,
            };

            filter_graph = ffmpeg.avfilter_graph_alloc();
            if (outputs == null || inputs == null || filter_graph == null)
            {
                ret = ffmpeg.AVERROR(ffmpeg.ENOMEM);
                goto end;
            }

            string args = $"video_size={dec_ctx->width}x{dec_ctx->height}:pix_fmt={(int)dec_ctx->pix_fmt}:"
                + $"time_base={time_base.num}/{time_base.den}:pixel_aspect={dec_ctx->sample_aspect_ratio.num}/{dec_ctx->sample_aspect_ratio.den}";

            fixed (AVFilterContext** pbuffersrc_ctx = &buffersrc_ctx)
            {
                ret = ffmpeg.avfilter_graph_create_filter(pbuffersrc_ctx, buffersrc, "in", args, null, filter_graph);
                if (ret < 0)
                {
                    ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot create buffer source");
                    goto end;
                }
            }

            fixed (AVFilterContext** pbuffersink_ctx = &buffersink_ctx)
            {
                ret = ffmpeg.avfilter_graph_create_filter(pbuffersink_ctx, buffersink, "out", null, null, filter_graph);
                if (ret < 0)
                {
                    ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot create buffer sink");
                    goto end;
                }
            }

            fixed (int* pfmts = pix_fmts)
            {
                ret = ffmpeg.av_opt_set_bin(buffersink_ctx, "pix_fmts", (byte*)pfmts, pix_fmts.Length * sizeof(int), ffmpeg.AV_OPT_SEARCH_CHILDREN);
                if (ret < 0)
                {
                    ffmpeg.av_log(null, ffmpeg.AV_LOG_ERROR, "Cannot set output pixel format");
                    goto end;
                }
            }

            outputs->name = ffmpeg.av_strdup("in");
            outputs->filter_ctx = buffersrc_ctx;
            outputs->pad_idx = 0;
            outputs->next = null;

            inputs->name = ffmpeg.av_strdup("out");
            inputs->filter_ctx = buffersink_ctx;
            inputs->pad_idx = 0;
            inputs->next = null;

            if ((ret = ffmpeg.avfilter_graph_parse_ptr(filter_graph, filter_descr, &inputs, &outputs, null)) < 0)
            {
                goto end;
            }

            if ((ret = ffmpeg.avfilter_graph_config(filter_graph, null)) < 0)
            {
                goto end;
            }

        end:
            ffmpeg.avfilter_inout_free(&inputs);
            ffmpeg.avfilter_inout_free(&outputs);

            return ret;
        }
        static unsafe void display_name(AVFrame* frame, AVRational time_base)
        {
            int x, y;
            byte* p0;
            byte* p;
            long delay;
            string drawing = " .-+#";

            if (frame->pts != ffmpeg.AV_NOPTS_VALUE)
            {
                if (last_pts != ffmpeg.AV_NOPTS_VALUE)
                {
                    delay = ffmpeg.av_rescale_q(frame->pts - last_pts, time_base, FFmpegHelper.AV_TIME_BASE_Q);
                    if (delay > 0 && delay < 1_000_000)
                    {
                        NativeMethods.uSleep(delay); // https://www.sysnet.pe.kr/2/0/12980
                    }
                }

                last_pts = frame->pts;
            }

            p0 = frame->data[0];
            Console.Clear();
            for (y = 0; y < frame->height; y++)
            {
                p = p0;
                for (x = 0; x < frame->width; x++)
                {
                    Console.Write(drawing[*(p++) / 52]);
                }

                Console.WriteLine();
                p0 += frame->linesize[0];
            }
        }
    }
}
This AVFilter example code is based on https://github.com/stjeong/ffmpeg_autogen_cs/tree/master/filtering_video. My FFmpeg lib is 7.0.2.