Talk:Problem with 3Com 10/100 Ethernet card not being recognized
Why does the script assume that the IO port numbers in the output of "lspci -v" are in decimal? In the versions I checked (2.1.10 and 2.1.99-test8), they're in hex.
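To illustrate the point: `lspci -v` prints base addresses like `I/O ports at ec00`, and that value is hexadecimal, so a script has to parse it with base 16. A minimal sketch (the sample line is illustrative, not from any particular device):

```python
import re

# Example line in the style of `lspci -v` output; real output varies by device.
sample = "        I/O ports at ec00 [size=128]"

match = re.search(r"I/O ports at ([0-9A-Fa-f]+)", sample)
if match:
    # The address is printed in hex; int(..., 16) parses it correctly.
    port = int(match.group(1), 16)
    print(port)  # 0xec00 == 60416, not 60416 == "ec00 read as decimal"
```

Reading `ec00` as decimal would fail outright (or, for an all-digit address like `d000`... silently yield the wrong port).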