I'm just in the planning phase of an app I might make, and I came across a challenge. To accomplish my goal, it seems to me that my program will need to act as a microphone and a speaker to the OS. I'm considering making my program for both Windows and OS X, so my question is as follows:
What libraries do I need on the two operating systems to make them think my program is an audio device? I need to be able to emulate both audio input and output. On Windows I have a feeling it might be DirectX or something, whereas on OS X I have no clue.
Also, if such a library exists and you have one to suggest, please also suggest a source of documentation. =)
Thanks in advance.
You make your program appear to the OS as an actual device, by writing a device driver for it.
For device driver development on Windows, start at http://msdn.microsoft.com/en-us/library/windows/hardware/gg487428.aspx
For device driver development on OS X, start at https://developer.apple.com/library/mac/navigation/index.html#section=Topics&topic=Drivers%2C%20Kernel%2C%20%26amp%3B%20Hardware
Related
I'm working on a project to investigate the possibility of adding touch support to an application, and so far the findings have been somewhat disappointing. My company uses Scientific Linux 6.4 (Linux kernel 2.6.32). I've found information suggesting that 2.6.30+ supports multi-touch HID, but I've also seen information suggesting that multi-touch in this kernel doesn't work with the Xorg interfaces.
Putting aside market availability of touchscreens that are compatible with Linux, is there a way we can verify whether or not multi-touch inputs are generated on the system? We have an older ViewSonic touchscreen that has multi-touch capability, and after looking at the output from the evtest tool, I didn't notice any multi-touch events, but I don't know whether evtest is reading the touch events from X or the hardware level.
I have no experience with hardware programming or device drivers, so if anyone could give me some guidance on how to verify multi-touch HID compatibility with our version of Linux (whether we have to write our own driver, read raw data from somewhere, etc.), any information you could provide would be great.
EDIT: The evtest program lists supported events for the device and I don't see anything related to multi-touch, so it doesn't seem like it's supported, but is this an issue with the kernel, the specific device, or something else? The specific monitor I'm testing is a ViewSonic, which is listed as a "Quanta Optical Touchscreen" device. I saw somewhere that a driver for Quanta was added in 2.6.34. Am I just out of luck (for this particular device at least)?
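For what it's worth, evtest reads from the /dev/input/event* nodes, i.e. the kernel's evdev layer below X, so the same capability bits can be checked directly. Here's a minimal sketch of that check (the device path is a guess and will differ per machine; it probably needs root):

    /* Minimal sketch: query an evdev node's ABS capability bits and check
     * for the multi-touch axes (ABS_MT_POSITION_X/Y, added in 2.6.30).
     * The default device path is a placeholder; run as root. */
    #include <fcntl.h>
    #include <linux/input.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <unistd.h>

    #define NBITS (8 * sizeof(unsigned long))
    #define TEST_BIT(bit, arr) (((arr)[(bit) / NBITS] >> ((bit) % NBITS)) & 1)

    int main(int argc, char **argv)
    {
        unsigned long absbits[ABS_MAX / NBITS + 1];
        const char *path = argc > 1 ? argv[1] : "/dev/input/event5"; /* guess */
        int fd = open(path, O_RDONLY);

        if (fd < 0) {
            perror(path);
            return 1;
        }
        memset(absbits, 0, sizeof(absbits));
        if (ioctl(fd, EVIOCGBIT(EV_ABS, sizeof(absbits)), absbits) < 0) {
            perror("EVIOCGBIT");
            return 1;
        }
        printf("ABS_MT_POSITION_X: %s\n",
               TEST_BIT(ABS_MT_POSITION_X, absbits) ? "present" : "absent");
        printf("ABS_MT_POSITION_Y: %s\n",
               TEST_BIT(ABS_MT_POSITION_Y, absbits) ? "present" : "absent");
        close(fd);
        return 0;
    }

If ABS_MT_POSITION_X/Y don't show up here, the kernel/driver isn't producing multi-touch events for that device at all, regardless of what X does with them.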
I went ahead and tested the monitor with a laptop that had Linux kernel 4.4.0, and it worked right away, so it seems that it's definitely the kernel. I don't know if there are any touchscreens that WILL work with kernel 2.6.32, but the ones based on the Quanta display definitely don't work.
I have a USB oscilloscope, a Velleman PCSU 1000, which is Windows-only, and I'm thinking of ways to make it work in Linux.
I've come across an interesting publication on driver reverse-engineering, so, at least in theory, I could reverse-engineer the driver for the oscilloscope and then write a Linux driver for it. But I have no idea how to make the proprietary software work in Wine with my new driver. Is that possible?
If not, then it seems I have only one option: not only write the driver, but reimplement the software as well. That is an awful amount of work, unfortunately.
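If it comes to that, I suppose the "driver" part wouldn't even have to live in the kernel: a userspace program using libusb could drive the device directly. A minimal sketch of what I have in mind (the VID/PID and the bulk-in endpoint below are placeholders, not values I've confirmed for the PCSU 1000):

    /* Minimal sketch: talk to a USB device from userspace with libusb-1.0.
     * The VID/PID and endpoint address are hypothetical placeholders. */
    #include <stdio.h>
    #include <libusb-1.0/libusb.h>

    int main(void)
    {
        libusb_context *ctx = NULL;
        libusb_device_handle *h;
        unsigned char buf[64];
        int n, err;

        if (libusb_init(&ctx) != 0)
            return 1;
        h = libusb_open_device_with_vid_pid(ctx, 0x10cf, 0x1000); /* guess */
        if (!h) {
            fprintf(stderr, "device not found\n");
            libusb_exit(ctx);
            return 1;
        }
        libusb_claim_interface(h, 0);
        /* Read one block of samples from a (hypothetical) bulk-in endpoint. */
        err = libusb_bulk_transfer(h, 0x81, buf, sizeof(buf), &n, 1000);
        if (err == 0)
            printf("got %d bytes\n", n);
        libusb_release_interface(h, 0);
        libusb_close(h);
        libusb_exit(ctx);
        return 0;
    }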
And, as far as I know, there is no common device class for this kind of device, so I can't write some generic driver that would make other, similar software work with the oscilloscope. Right?
Anyway, any suggestions and ideas are highly appreciated.
Context
Debian 64-bit, kernel 3.18.x.
I'm struggling to understand how a network driver is initialized; I mean how to choose which flags to set. I've been digging in the kernel for days now to train myself, and the card setup is the only point I'm missing.
I'm taking the Intel 82574 as an example. I downloaded the card's datasheet and saw lots of information, but not a clue about how to set up the hardware.
Question
Where do I start to learn which flags to set? The datasheet didn't help me (I am not very experienced, but willing to learn).
Please give me a starting point, a tip, or anything to help me understand what is going on in the already-written open-source driver.
How does a developer know how to initialize his NIC? (Yes, I'm reinventing the wheel, just long enough to understand.)
You'll need to read the source code of the kernel module that handles your specific NIC.
EDIT: Of course, to develop such a module, you'd usually just use a register map as specified in a datasheet or application note; often, manufacturers develop their Linux drivers themselves, so the driver developers might even be the same people who developed the chipset (it's really handy to have a platform to test against -- you can't exercise hardware without something like a driver, so you might as well write a proper one).
Furthermore, devices often come with code examples -- no one is going to build a device based on an IC that they have not seen in action.
If you have access to neither proper documentation nor source, you can only reverse-engineer, and that's an incredibly large field.
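To make the register-map point concrete with the 82574 from your question: the bottom of such a driver is a probe function that maps the device's BAR and then reads/writes registers at offsets taken from the datasheet. A minimal sketch (the device ID and the register offset are illustrative; verify them against the datasheet before trusting them):

    /* Minimal sketch: how a PCI driver reaches the memory-mapped registers
     * that a datasheet's register map describes. IDs/offsets illustrative. */
    #include <linux/io.h>
    #include <linux/module.h>
    #include <linux/pci.h>

    #define DEMO_REG_STATUS 0x00008  /* STATUS register in the e1000 family */

    static int demo_probe(struct pci_dev *pdev, const struct pci_device_id *id)
    {
        void __iomem *regs;
        int err;

        err = pci_enable_device(pdev);
        if (err)
            return err;

        regs = pci_ioremap_bar(pdev, 0);  /* BAR 0 holds the register block */
        if (!regs) {
            pci_disable_device(pdev);
            return -ENOMEM;
        }

        /* "Setting up the hardware" ultimately boils down to
         * read/modify/write of registers like this one. */
        dev_info(&pdev->dev, "STATUS = 0x%08x\n",
                 ioread32(regs + DEMO_REG_STATUS));

        pci_set_drvdata(pdev, regs);
        return 0;
    }

    static void demo_remove(struct pci_dev *pdev)
    {
        iounmap(pci_get_drvdata(pdev));
        pci_disable_device(pdev);
    }

    static const struct pci_device_id demo_ids[] = {
        { PCI_DEVICE(0x8086, 0x10d3) },  /* 82574L, check the datasheet */
        { }
    };
    MODULE_DEVICE_TABLE(pci, demo_ids);

    static struct pci_driver demo_driver = {
        .name     = "demo82574",
        .id_table = demo_ids,
        .probe    = demo_probe,
        .remove   = demo_remove,
    };
    module_pci_driver(demo_driver);

    MODULE_LICENSE("GPL");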
Using your example of the Intel 82574 network adapter: Intel provides a zip file of the source code used to build the Linux driver. Like all drivers, it hooks into the OS's networking API.
The Linux networking API is documented on the linux.org site and discussed on popular Linux sites like lwn.net. Below is a link to LWN's chapter (from Linux Device Drivers, 3rd edition) on network drivers, which covers the driver API, including the polling interface called NAPI.
https://static.lwn.net/images/pdf/LDD3/ch17.pdf
You'll notice in the Intel igb driver source code that setting up the net_device data structure (and its NAPI context) is one of the first things done. Registering it is what ties the driver to the OS: this way the OS knows which igb functions to call when loading/unloading the driver, or when it needs to send/receive data.
The igb functions read/modify/write the necessary bits in the 82574's memory-mapped registers, which control and monitor the device. The device registers are all documented in the 82574 datasheet available on Intel's site, and this is usually the case for almost any networking company: Broadcom, Chelsio, Mellanox, Marvell, and so on.
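In sketch form, the setup described above looks roughly like this; all demo_* names are placeholders rather than real igb symbols, and it uses the 3.x-era API to match your kernel:

    /* Sketch of the net_device + NAPI registration a NIC driver performs. */
    #include <linux/etherdevice.h>
    #include <linux/module.h>
    #include <linux/netdevice.h>

    struct demo_priv {
        struct napi_struct napi;
    };

    /* The kernel calls this to poll for received frames (up to 'budget'). */
    static int demo_poll(struct napi_struct *napi, int budget)
    {
        /* A real driver drains its RX ring here and hands skbs to
         * napi_gro_receive()/netif_receive_skb(). */
        napi_complete(napi);
        return 0;
    }

    static int demo_open(struct net_device *dev)
    {
        struct demo_priv *priv = netdev_priv(dev);

        napi_enable(&priv->napi);
        netif_start_queue(dev);
        return 0;
    }

    static int demo_stop(struct net_device *dev)
    {
        struct demo_priv *priv = netdev_priv(dev);

        netif_stop_queue(dev);
        napi_disable(&priv->napi);
        return 0;
    }

    static netdev_tx_t demo_xmit(struct sk_buff *skb, struct net_device *dev)
    {
        /* A real driver posts the skb to the TX ring; this one drops it. */
        dev_kfree_skb(skb);
        return NETDEV_TX_OK;
    }

    static const struct net_device_ops demo_netdev_ops = {
        .ndo_open       = demo_open,
        .ndo_stop       = demo_stop,
        .ndo_start_xmit = demo_xmit,
    };

    static struct net_device *demo_dev;

    static int __init demo_init(void)
    {
        struct demo_priv *priv;
        int err;

        demo_dev = alloc_etherdev(sizeof(*priv));
        if (!demo_dev)
            return -ENOMEM;
        priv = netdev_priv(demo_dev);
        demo_dev->netdev_ops = &demo_netdev_ops;
        eth_hw_addr_random(demo_dev);
        netif_napi_add(demo_dev, &priv->napi, demo_poll, 64);
        err = register_netdev(demo_dev);  /* hooks the driver into the OS */
        if (err)
            free_netdev(demo_dev);
        return err;
    }

    static void __exit demo_exit(void)
    {
        unregister_netdev(demo_dev);
        free_netdev(demo_dev);
    }

    module_init(demo_init);
    module_exit(demo_exit);
    MODULE_LICENSE("GPL");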
Hope that helps a little more.
I have started to write Linux drivers and I am confident with them now, but my interest now is in writing a lower-level driver (a platform driver) for an SPI, USB, or I2C controller. Is there anything I can start writing to practice platform drivers on a Linux PC? Can someone suggest how to start writing a platform driver on Linux?
Thank you
A good way is to look at the existing drivers, look at the list of open bugs, and start to fix them. That will give you a good introduction to the kernel; you will learn to work as part of a huge, distributed team (which will look good on your CV), and you will help to make the world better, one line of code at a time.
The next step is then to find some unknown, unsupported hardware and write a driver for it. The starting point here is to copy an existing driver or extend it (depending on how "different" the hardware is); a bare-bones skeleton is sketched below.
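To give an idea of the scale, here is a minimal platform-driver skeleton; every name in it, including the compatible string, is made up:

    /* Minimal platform-driver skeleton; all names are placeholders. */
    #include <linux/module.h>
    #include <linux/of.h>
    #include <linux/platform_device.h>

    static int demo_probe(struct platform_device *pdev)
    {
        dev_info(&pdev->dev, "probed\n");
        return 0;
    }

    static int demo_remove(struct platform_device *pdev)
    {
        dev_info(&pdev->dev, "removed\n");
        return 0;
    }

    static const struct of_device_id demo_of_match[] = {
        { .compatible = "acme,demo" },  /* hypothetical device-tree binding */
        { }
    };
    MODULE_DEVICE_TABLE(of, demo_of_match);

    static struct platform_driver demo_driver = {
        .probe  = demo_probe,
        .remove = demo_remove,
        .driver = {
            .name           = "demo",
            .of_match_table = demo_of_match,
        },
    };
    module_platform_driver(demo_driver);

    MODULE_LICENSE("GPL");

On a PC with no device tree, the platform bus also matches by driver name, so for practice you can trigger probe() by registering a matching device yourself, e.g. platform_device_register_simple("demo", -1, NULL, 0).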
This is one of my coding projects. I'm fairly new to Linux, so I need some pointers and thoughts from you before I get started. I know screen-sharing software already exists, but I want to make my own! (=
Specifically, I want to clone my laptop screen to my TV over WLAN, via a Linux box that is connected to the TV through a VGA cable:
The laptop streams its screen
The Linux box reads the stream
The Linux box outputs the stream to the TV (through the VGA cable)
First of all, how do I record the screen and send the stream in real time in Linux?
Secondly, I must write a program that reads the stream being sent. The program must listen on some port and collect the data being streamed from the laptop. Any thoughts?
Then I must output that data in real time to the TV. Do you have any ideas on how to solve this?
Thanks!
Edit: Regarding programming languages, I'm most comfortable with Python.
Sharing your screen can be done via the various flavors of VNC (e.g. RealVNC, TightVNC, UltraVNC). Most of them are open source, so you might want to:
Stick with the VNC protocol, for later compatibility
Look at how the established solutions do screen-hooking
In Linux, the graphics are all processed by Xorg (the current X server implementation), which was designed with networking built in. This is why you can ssh -X into a machine, run a graphical application on it, and see it on your local computer. I recommend reading about hooking into Xorg to achieve what you need.
You need a client-server topology to achieve this. You don't say which programming language you plan to use, though; some languages may be harder than others to start with. Furthermore, this kind of code is already well understood in every major programming language, so you should at least use a framework that simplifies the networking portion of the project.
Sharing a screen on the TV can be done by your video card driver in Linux. Just check your desktop environment (KDE and GNOME offer video configuration panels, for example) or your video card configuration (the nVidia and ATI Linux drivers offer multiple-screen support).
It seems to me like you're trying to reinvent the wheel and are not too sure how to begin. I recommend beginning simply with one of the proven VNC packages and seeing how it goes from there. If a feature is missing, you have the source code of both the server and the client, so you can continue developing those projects. Once you've got your setup working, start thinking about replacing a single piece of the puzzle with your own code, and see how it goes.
Do not expect good video quality (full HD, for instance) on your TV without a very capable CPU/GPU and an 802.11n wireless network free of other users, and be ready to accept some lag while the codecs kick in.
You should try to take as small steps as possible. If I were taking up such a project, my first step would be to implement a solution using standard Unix tools (e.g. netcat or socat for the network part, mplayer or VLC for the playback, and maybe ffmpeg for the capture). Then replace each component with custom-written ones as needed.
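To illustrate the receiving end (your second question), here is a minimal sketch in C of a program that listens on a port and copies whatever arrives to stdout, so it can be piped straight into a player; the port number is arbitrary and the error handling is deliberately thin:

    /* Minimal sketch: accept one TCP connection and copy everything to
     * stdout, so the stream can be piped into a player. Port is arbitrary. */
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        struct sockaddr_in addr;
        char buf[65536];
        ssize_t n;
        int one = 1, cli;
        int srv = socket(AF_INET, SOCK_STREAM, 0);

        if (srv < 0) {
            perror("socket");
            return 1;
        }
        setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &one, sizeof(one));
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port = htons(9000);           /* arbitrary port choice */
        if (bind(srv, (struct sockaddr *)&addr, sizeof(addr)) < 0 ||
            listen(srv, 1) < 0) {
            perror("bind/listen");
            return 1;
        }
        cli = accept(srv, NULL, NULL);
        if (cli < 0) {
            perror("accept");
            return 1;
        }
        while ((n = read(cli, buf, sizeof(buf))) > 0)
            if (write(STDOUT_FILENO, buf, (size_t)n) < 0)
                break;                         /* player went away */
        close(cli);
        close(srv);
        return 0;
    }

On the Linux box you would run something like ./recv | mplayer -, and on the laptop capture the screen with ffmpeg and push the encoded stream to the box's port with netcat.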