Drawing pixels to the screen via the Linux framebuffer

I was recently struck by the odd idea of taking input from /dev/urandom, converting the relevant characters to random integers, and then using those integers as the RGB and x/y values for pixels to paint onto the screen.
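To make the idea concrete, here is a rough sketch of just the conversion step: reading bytes from /dev/urandom and mapping them to x/y coordinates and RGB values. The 640x480 target size is only an assumption for illustration.

    /* Sketch: turn bytes from /dev/urandom into pixel coordinates and colours.
     * The 640x480 resolution is a placeholder, not a real screen query. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        FILE *rnd = fopen("/dev/urandom", "rb");
        if (!rnd) { perror("open /dev/urandom"); return 1; }

        unsigned char buf[5];
        for (int i = 0; i < 10; i++) {
            if (fread(buf, 1, sizeof buf, rnd) != sizeof buf) break;
            int x = buf[0] * 640 / 256;   /* map 0..255 onto 0..639 */
            int y = buf[1] * 480 / 256;   /* map 0..255 onto 0..479 */
            printf("pixel (%3d,%3d) = rgb(%3u,%3u,%3u)\n",
                   x, y, buf[2], buf[3], buf[4]);
        }
        fclose(rnd);
        return 0;
    }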

I've done some research (on StackOverflow and elsewhere), and many people suggest you can simply write directly to /dev/fb0, since it is the file representation of the device. Unfortunately, this doesn't seem to produce any visually noticeable result.
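For reference, the kind of direct write suggested there looks roughly like this. It is only a sketch; as the answer below explains, whether anything actually shows up depends on what currently owns the display (and you generally need root or membership in the video group to open /dev/fb0).

    /* Sketch: copy random bytes straight into /dev/fb0 as an ordinary file. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int rnd = open("/dev/urandom", O_RDONLY);
        int fb  = open("/dev/fb0", O_WRONLY);
        if (rnd == -1 || fb == -1) { perror("open"); return 1; }

        char buf[4096];
        /* Fill roughly 1 MiB of the framebuffer with noise. */
        for (int i = 0; i < 256; i++) {
            ssize_t n = read(rnd, buf, sizeof buf);
            if (n <= 0) break;
            if (write(fb, buf, n) != n) break;  /* stop at end of device */
        }
        close(rnd);
        close(fb);
        return 0;
    }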

I found an example C program from a Qt tutorial (no longer available) that uses mmap to write to the buffer. The program runs successfully, but again nothing shows up on the screen. Interestingly, when I put my laptop into suspend and later resume it, I see a momentary flash of the image (a red square) that had been written to the framebuffer earlier. Does writing to the framebuffer still work for painting to the screen in Linux? Ideally I'd like to write a (ba)sh script, but C or similar would also be fine. Thanks!

Edit: here is the example program… it may look familiar to veterans.

#include <stdlib.h>
#include <unistd.h>
#include <stdio.h>
#include <fcntl.h>
#include <linux/fb.h>
#include <sys/mman.h>
#include <sys/ioctl.h>

int main()
{
    int fbfd = 0;
    struct fb_var_screeninfo vinfo;
    struct fb_fix_screeninfo finfo;
    long int screensize = 0;
    char *fbp = 0;
    int x = 0, y = 0;
    long int location = 0;

    // Open the file for reading and writing
    fbfd = open("/dev/fb0", O_RDWR);
    if (fbfd == -1) {
        perror("Error: cannot open framebuffer device");
        exit(1);
    }
    printf("The framebuffer device was opened successfully.\n");

    // Get fixed screen information
    if (ioctl(fbfd, FBIOGET_FSCREENINFO, &finfo) == -1) {
        perror("Error reading fixed information");
        exit(2);
    }

    // Get variable screen information
    if (ioctl(fbfd, FBIOGET_VSCREENINFO, &vinfo) == -1) {
        perror("Error reading variable information");
        exit(3);
    }

    printf("%dx%d, %dbpp\n", vinfo.xres, vinfo.yres, vinfo.bits_per_pixel);

    // Figure out the size of the screen in bytes
    screensize = vinfo.xres * vinfo.yres * vinfo.bits_per_pixel / 8;

    // Map the device to memory
    fbp = (char *)mmap(0, screensize, PROT_READ | PROT_WRITE, MAP_SHARED, fbfd, 0);
    if (fbp == MAP_FAILED) {
        perror("Error: failed to map framebuffer device to memory");
        exit(4);
    }
    printf("The framebuffer device was mapped to memory successfully.\n");

    x = 100; y = 100;    // Where we are going to put the pixel

    // Figure out where in memory to put the pixel
    for (y = 100; y < 300; y++)
        for (x = 100; x < 300; x++) {
            location = (x + vinfo.xoffset) * (vinfo.bits_per_pixel / 8) +
                       (y + vinfo.yoffset) * finfo.line_length;

            if (vinfo.bits_per_pixel == 32) {
                *(fbp + location)     = 100;                  // Some blue
                *(fbp + location + 1) = 15 + (x - 100) / 2;   // A little green
                *(fbp + location + 2) = 200 - (y - 100) / 5;  // A lot of red
                *(fbp + location + 3) = 0;                    // No transparency
                //location += 4;
            } else {  // assume 16bpp
                int b = 10;
                int g = (x - 100) / 6;        // A little green
                int r = 31 - (y - 100) / 16;  // A lot of red
                unsigned short int t = r << 11 | g << 5 | b;
                *((unsigned short int *)(fbp + location)) = t;
            }
        }

    munmap(fbp, screensize);
    close(fbfd);
    return 0;
}

Answer:

If you're running X11, you must go through the X11 API to draw on the screen. Going around the X server is a really bad idea (and, as you've seen, it usually doesn't work). It can also cause crashes, or just generally corrupt the display.

If you want something that can run anywhere (both on the console and under X), look into SDL or GGI. If you only care about X11, you can use GTK+, Qt, or even plain Xlib. There are many, many options…
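For example, here is a minimal sketch of the original /dev/urandom idea done through SDL2 (one of the options above). It assumes the SDL2 development package is installed and builds with: gcc noise.c $(sdl2-config --cflags --libs). The window size and pixel count are arbitrary.

    /* Sketch: plot random pixels through SDL2 instead of poking the framebuffer. */
    #include <SDL.h>
    #include <stdlib.h>

    int main(void)
    {
        if (SDL_Init(SDL_INIT_VIDEO) != 0) {
            SDL_Log("SDL_Init failed: %s", SDL_GetError());
            return 1;
        }
        SDL_Window *win = SDL_CreateWindow("noise", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480, 0);
        SDL_Renderer *ren = SDL_CreateRenderer(win, -1, SDL_RENDERER_ACCELERATED);
        if (!win || !ren) {
            SDL_Log("window/renderer creation failed: %s", SDL_GetError());
            return 1;
        }

        /* Paint 100000 random pixels with random colours. */
        for (int i = 0; i < 100000; i++) {
            SDL_SetRenderDrawColor(ren, rand() % 256, rand() % 256, rand() % 256, 255);
            SDL_RenderDrawPoint(ren, rand() % 640, rand() % 480);
        }
        SDL_RenderPresent(ren);
        SDL_Delay(3000);   /* keep the window up for three seconds */

        SDL_DestroyRenderer(ren);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }

The pixel-plotting loop stays essentially the same as in the framebuffer version; only the output target changes, and the library negotiates with the display server for you.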

