TCP and IP have always been discussed together because of the close relationship between them. At the same time, we know that IP works at the network layer while TCP works at the transport layer, so their packet formats are not the same.

TCP packet format

Let us take a look at the format of a TCP packet:

This time I will not capture a sample for each part of the TCP packet; I only intend to briefly discuss their names and definitions:

UDP or TCP?

In a TCP/IP network, IP packets rely on the ICMP protocol to detect whether the other side exists, to ensure the greatest possible likelihood of correct delivery. At the transport layer, however, besides TCP we also use another transport protocol: UDP (User Datagram Protocol). Its biggest difference from TCP is that it sends data directly to the other side without first detecting whether the other side exists, assuming that the receiver will handle reception on its own. For programs that need to move large amounts of data but do not require reliable delivery, such as audio transmission, this saves the negotiation and acknowledgement time between the two sides and thereby increases throughput. Examples of protocols that use UDP include DNS, SNMP, NFS and BOOTP.
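As a sketch of UDP's fire-and-forget behaviour described above, here is a small Python loopback toy example (the port and payload are made up for illustration):

```python
import socket

# A receiver bound to an ephemeral loopback port (toy example).
recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv_sock.bind(("127.0.0.1", 0))
port = recv_sock.getsockname()[1]

# UDP sends immediately: no handshake, no check that the peer exists,
# and no acknowledgement that the datagram arrived.
send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_sock.sendto(b"example query", ("127.0.0.1", port))

data, addr = recv_sock.recvfrom(1024)
print(data)
send_sock.close()
recv_sock.close()
```

Compare this with TCP, where `connect()` performs a three-way handshake before any data moves; with UDP the datagram is simply handed to the network.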
Source: http://www.study-area.org/network/network_tcp.htm
Sunday, October 7, 2012
TCP packets
Tuesday, July 31, 2012
streaming
Streaming media is multimedia that is constantly received by and presented to an end-user while being delivered by a provider. Its verb form, "to stream", refers to the process of delivering media in this manner; the term refers to the delivery method of the medium rather than the medium itself.
A client media player can begin playing the data (such as a movie) before the entire file has been transmitted. Distinguishing delivery method from the media distributed applies specifically to telecommunications networks, as most other delivery systems are either inherently streaming (e.g., radio, television) or inherently nonstreaming (e.g., books, video cassettes, audio CDs). For example, in the 1930s, Muzak was among the earliest popularly available streaming media; nowadays Internet television is a common form of streamed media. The term "streaming media" can apply to media other than video and audio such as live closed captioning, stock ticker, and real-time text, which are all considered "streaming text".
Live streaming, the delivery of media live over the Internet, involves a camera for the media, an encoder to digitize the content, a media publisher, and a content delivery network to distribute and deliver the content.
Streaming video is content sent in compressed form over the Internet and displayed by the viewer in real time. With streaming video or streaming media, a Web user does not have to wait to download a file to play it. Instead, the media is sent in a continuous stream of data and is played as it arrives. The user needs a player, which is a special program that uncompresses and sends video data to the display and audio data to speakers. A player can be either an integral part of a browser or downloaded from the software maker's Web site.
Major streaming video and streaming media technologies include RealSystem G2 from RealNetworks, Microsoft Windows Media Technologies (including its NetShow Services and Theater Server), and VDO. Microsoft's approach uses the standard MPEG compression algorithm for video. The other approaches use proprietary algorithms. (The program that does the compression and decompression is sometimes called the codec.) Microsoft's technology offers streaming audio at up to 96 Kbps and streaming video at up to 8 Mbps (for the NetShow Theater Server). However, for most Web users, the streaming video will be limited to the data rates of the connection (for example, up to 128 Kbps with an ISDN connection). Microsoft's streaming media files are in its Advanced Streaming Format (ASF).
Streaming video is usually sent from prerecorded video files, but can be distributed as part of a live broadcast "feed." In a live broadcast, the video signal is converted into a compressed digital signal and transmitted from a special Web server that is able to multicast, sending the same file to multiple users at the same time.
Saturday, July 21, 2012
3GPP
The term "3GPP specification" covers all GSM (including GPRS and EDGE), W-CDMA and LTE (including LTE-Advanced) specifications. The following terms are also used to describe networks using the 3G specifications: UTRAN, UMTS (in Europe) and FOMA (in Japan).
Sources: http://www.3gpp.org/
http://en.wikipedia.org/wiki/3GPP
The 3rd Generation Partnership Project (3GPP) is a collaboration between groups of telecommunications associations, known as the Organizational Partners. The initial scope of 3GPP was to make a globally applicable third-generation (3G) mobile phone system specification based on evolved Global System for Mobile Communications (GSM) specifications within the scope of the International Mobile Telecommunications-2000 project of the International Telecommunication Union (ITU). The scope was later enlarged[1] to include the development and maintenance of:
- the Global System for Mobile Communications (GSM) including GSM evolved radio access technologies (e.g. General Packet Radio Service (GPRS) and Enhanced Data Rates for GSM Evolution (EDGE))
- an evolved third Generation and beyond Mobile System based on the evolved 3GPP core networks, and the radio access technologies supported by the Partners (i.e., UTRA both FDD and TDD modes).
- an evolved IP Multimedia Subsystem (IMS) developed in an access independent manner
3GPP standardization encompasses Radio, Core Network and Service architecture.[2] The project was established in December 1998 and should not be confused with 3rd Generation Partnership Project 2 (3GPP2), which specifies standards for another 3G technology based on IS-95 (CDMA), commonly known as CDMA2000. The 3GPP support team (also known as the "Mobile Competence Centre") is located at the ETSI headquarters in Sophia-Antipolis (France).[3]
Wednesday, July 18, 2012
CCIR vs EIA
CCIR stands for Comité Consultatif International des Radiocommunications (International Radio Consultative Committee). This is the committee that recommended the standards for B/W television accepted by most of Europe, Australia and others, which is why equipment that complies with those B/W TV standards is called CCIR compatible. The same type of standard, later extended to colour signals, was called PAL. The name comes from the technique used for colour reproduction: the phase of the colour carrier alternates on each new line, hence Phase Alternating Line (PAL).
EIA stands for Electronic Industries Association, the association that created the standard for B/W television in the USA, Canada and Japan, where it is often referred to as RS-170, that being the recommendation code of the EIA proposal. When B/W TV was upgraded to colour, the colour standard was named after the group that created the recommendation: the National Television System Committee, abbreviated NTSC.
Tuesday, July 17, 2012
parallel releases
Need for parallel releases
If you need to develop multiple versions of your system in parallel, consider using separate projects, one for each version. For example, your organization may need to work on a patch release and a new release at the same time. In this situation, both projects use mostly the same set of components. (Note that multiple projects can modify the same set of components.) When work on the patch release project is complete, you integrate it with the new release project.
If you anticipate that your team will develop and release numerous versions of your system over time, you may want to create a mainline project. A mainline project serves as a single point of integration for related projects over a period of time.
Figure 1 shows the initial set of components planned for the Transaction Builder system. A team of 30 developers works on the system. Because a high degree of integration between components is required, and most developers work on several components, the project manager included all components in one project. You can also use multiple UCM projects for your development.
Tuesday, July 10, 2012
Pelco D/P protocols
Introduction
This is GPL software. Do with it as you want, but feed us back any improvements.
This is a full set of C# classes to control Pelco PTZ cameras, matrix switching systems, receiver devices and more via the RS-422/485 'P' and 'D' protocols.
It supports all of the commands, including UP, DOWN, IN, OUT, LEFT, RIGHT, NEAR and FAR, as well as other extended commands.
To use this, you need to put an RS-232 to RS-422 adapter on the output of your desired serial port.
The Pelco device does not return any useful information, so you really only need 2-wire (one-way) communication out. However, I advise reading the response in order to know whether the command was received by the device.
This section describes the protocol used when sending commands to an Intercept Dome with the “P” version protocol and Coaxitron series equipment, and to Pelco’s “D” version receivers. These protocols use no parity, one start bit, eight data bits, and one stop bit. The recommended baud rate is 4800 (4800, 8, N, 1).
Theory
The two protocols have different message structures, but both use an RS-485 port to send and receive messages.
All values below are shown in hexadecimal (base 16).
Pelco P message structure
| Byte | Value       | Function                 |
|------|-------------|--------------------------|
| 1    | $A0         | STX (start transmission) |
| 2    | $00 to $1F  | Address                  |
| 3    | Data byte 1 | (see below)              |
| 4    | Data byte 2 | (see below)              |
| 5    | Data byte 3 | (see below)              |
| 6    | Data byte 4 | (see below)              |
| 7    | $AF         | ETX (end transmission)   |
| 8    | $00 to $FF  | Check sum                |
Byte 1 is always $A0.
Byte 2 is the receiver address, set by a DIP switch on the receiver.
Bytes 3-6 are the data bytes (see below).
Byte 7 is always $AF.
Byte 8 is the XOR of bytes 1-7.
The protocol is “zero indexed” so that the hexadecimal address sent in the protocol for the first receiver is $00 which corresponds to address 1.
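As an illustration of the framing above, here is a small Python sketch (a hypothetical helper, not part of the original C# classes) that assembles a Pelco P frame with its XOR check sum; the data bytes are arbitrary example values:

```python
def pelco_p_frame(address, data1, data2, data3, data4):
    """Build an 8-byte Pelco P frame for a zero-indexed receiver address."""
    frame = [0xA0, address & 0x1F, data1, data2, data3, data4, 0xAF]
    checksum = 0
    for byte in frame:       # byte 8 is the XOR of bytes 1-7
        checksum ^= byte
    frame.append(checksum)
    return bytes(frame)

# Frame for receiver 1 (address $00) with example data bytes $00 $02 $20 $00.
print(pelco_p_frame(0x00, 0x00, 0x02, 0x20, 0x00).hex())
```

Because $A0 XOR $AF is $0F, an all-zero data frame always ends with check sum $0F, which is a quick sanity check when debugging on the wire.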
Pelco D message structure
The “D” protocol has some added overhead to improve the reliability of transmissions. The format for a message is:
| Word 1     | Word 2  | Word 3    | Word 4    | Word 5 | Word 6 | Word 7    |
|------------|---------|-----------|-----------|--------|--------|-----------|
| Synch Byte | Address | Command 1 | Command 2 | Data 1 | Data 2 | Check Sum |
The synchronization byte is always $FF.
The address is the logical address of the receiver/driver being controlled.
|           | Bit 7     | Bit 6     | Bit 5     | Bit 4              | Bit 3           | Bit 2      | Bit 1     | Bit 0      |
|-----------|-----------|-----------|-----------|--------------------|-----------------|------------|-----------|------------|
| Command 1 | Sense     | Reserved  | Reserved  | Auto / Manual Scan | Camera On / Off | Iris Close | Iris Open | Focus Near |
| Command 2 | Focus Far | Zoom Wide | Zoom Tele | Down               | Up              | Left       | Right     | Always 0   |
The sense bit (command 1, bit 7) indicates the meaning of bits 4 and 3. If the sense bit is on and bits 4 and 3 are on, the command enables autoscan and turns the camera on. If the sense bit is off and bits 4 and 3 are on, the command enables manual scan and turns the camera off. If either bit 4 or bit 3 is off, no action is taken for that feature.
The reserved bits (6 and 5) should be set to 0.
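The sense-bit arithmetic can be sketched in Python (the constant names are mine, for illustration only):

```python
# Command 1 bit positions, per the bit table above.
SENSE      = 1 << 7   # selects the meaning of bits 4 and 3
SCAN_BIT   = 1 << 4   # auto scan (sense on) / manual scan (sense off)
CAMERA_BIT = 1 << 3   # camera on (sense on) / camera off (sense off)

cmd1_autoscan_on = SENSE | SCAN_BIT | CAMERA_BIT  # autoscan + camera on
cmd1_manual_off  = SCAN_BIT | CAMERA_BIT          # manual scan + camera off
print(hex(cmd1_autoscan_on), hex(cmd1_manual_off))
```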
Word 5 contains the pan speed. Pan speed is in the range $00 (stop) to $3F (high speed) and $FF for “turbo” speed. Turbo speed is the maximum speed the device can obtain and is considered separately because it is not generally a smooth step from high speed to turbo. That is, going from one speed to the next usually looks smooth and will provide for smooth motion with the exception of going into and out of turbo speed.
Word 6 contains the tilt speed. Tilt speed is in the range $00 (stop) to $3F (maximum speed).
Word 7 is the check sum. The check sum is the sum of bytes (excluding the synchronization byte) modulo 256.
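Putting words 1-7 together, a minimal Python sketch of the Pelco D framing (a hypothetical helper, not Pelco code); note that the check sum excludes the $FF synchronization byte:

```python
def pelco_d_frame(address, cmd1, cmd2, data1, data2):
    """Build a 7-byte Pelco D frame; word 7 is the sum of words 2-6 mod 256."""
    body = [address, cmd1, cmd2, data1, data2]
    checksum = sum(body) % 256            # sync byte is excluded from the sum
    return bytes([0xFF] + body + [checksum])

# The worked check sum example later in this post:
# address $0A, bytes $88 $90 $00 $40 give a check sum of $62.
print(pelco_d_frame(0x0A, 0x88, 0x90, 0x00, 0x40).hex())
```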
Extended Command Set
In addition to the “PTZ” commands shown above, there are control commands that give you access to the more advanced features of some equipment. For the Pelco P protocol, an extended command has bit 0 of data byte 2 set and follows the format in the following table:
| Command              | Data byte 1 | Data byte 2 | Data byte 3 | Data byte 4 |
|----------------------|-------------|-------------|-------------|-------------|
| Set Preset XX        | 00          | 03          | 00          | 01 to FF    |
| Clear Preset XX      | 00          | 05          | 00          | 01 to FF    |
| Go To Preset XX      | 00          | 07          | 00          | 01 to FF    |
| Flip (rotate 180º)   | 00          | 07          | 00          | 21          |
| Zero Pan Position    | 00          | 07          | 00          | 22          |
| Auto scan            | 00          | 09          | 00          | 00          |
| Stop auto scan       | 00          | 0B          | 00          | 00          |
| Remote Reset         | 00          | 0F          | 00          | 00          |
| Zone Start           | 00          | 11          | 00          | 01 to 08    |
| Zone End             | 00          | 13          | 00          | 01 to 08    |
| Write char to screen | 00          | 15          | 00 to 28    | 00 to 7F    |
| Clear Screen         | 00          | 17          | 00          | 00          |
| Alarm Ack            | 00          | 19          | 00          | 01 to 08    |
| Zone Scan On         | 00          | 1B          | 00          | 00          |
| Zone Scan Off        | 00          | 1D          | 00          | 00          |
| Pattern Start        | 00          | 1F          | 00          | 00          |
| Pattern Stop         | 00          | 21          | 00          | 00          |
| Run Pattern          | 00          | 23          | 00          | 00          |
| Zoom Lens Speed      | 00          | 25          | 00          | 00 to 03    |
| Focus Lens Speed     | 00          | 27          | 00          | 00 to 03    |
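Reading one row of the P extended command table, here is a self-contained Python sketch (hypothetical helper names) that frames the "Go To Preset" command with its XOR check sum:

```python
from functools import reduce

def p_extended(address, d1, d2, d3, d4):
    """Wrap four data bytes in Pelco P framing: $A0 .. $AF plus XOR check sum."""
    frame = [0xA0, address, d1, d2, d3, d4, 0xAF]
    frame.append(reduce(lambda a, b: a ^ b, frame))
    return bytes(frame)

# "Go To Preset XX" row: data bytes 00, 07, 00, preset number (01 to FF).
go_to_preset_5 = p_extended(0x00, 0x00, 0x07, 0x00, 0x05)
print(go_to_preset_5.hex())
```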
In the Pelco D implementation they are as follows:
| Command               | Word 3 | Word 4 | Word 5               | Word 6      |
|-----------------------|--------|--------|----------------------|-------------|
| Set Preset            | 00     | 03     | 00                   | 01 to 20    |
| Clear Preset          | 00     | 05     | 00                   | 01 to 20    |
| Go To Preset          | 00     | 07     | 00                   | 01 to 20    |
| Flip (180° about)     | 00     | 07     | 00                   | 21          |
| Go To Zero Pan        | 00     | 07     | 00                   | 22          |
| Set Auxiliary         | 00     | 09     | 00                   | 01 to 08    |
| Clear Auxiliary       | 00     | 0B     | 00                   | 01 to 08    |
| Remote Reset          | 00     | 0F     | 00                   | 00          |
| Set Zone Start        | 00     | 11     | 00                   | 01 to 08    |
| Set Zone End          | 00     | 13     | 00                   | 01 to 08    |
| Write Char. To Screen | 00     | 15     | X Position 00 to 28  | ASCII Value |
| Clear Screen          | 00     | 17     | 00                   | 00          |
| Alarm Acknowledge     | 00     | 19     | 00                   | Alarm No.   |
| Zone Scan On          | 00     | 1B     | 00                   | 00          |
| Zone Scan Off         | 00     | 1D     | 00                   | 00          |
| Set Pattern Start     | 00     | 1F     | 00                   | 00          |
| Set Pattern Stop      | 00     | 21     | 00                   | 00          |
| Run Pattern           | 00     | 23     | 00                   | 00          |
| Set Zoom Speed        | 00     | 25     | 00                   | 00 to 03    |
| Set Focus Speed       | 00     | 27     | 00                   | 00 to 03    |
Please note that in the Pelco P implementation the check sum is the XOR of bytes 1-7, whereas in Pelco D it is the modulo-256 sum of bytes 2-6 (every byte except the synchronization byte). For example, for a D message with address $0A and bytes $88, $90, $00, $40:
    0A        00001010
    88        10001000
    Subtotal  10010010  (92)
    90        10010000
    Subtotal  00100010  (22; modulo 256 lets the high bit roll off)
    00        00000000
    Subtotal  00100010  (22)
    40        01000000
    Final     01100010  (62)

The final check sum value is $62.
The response of the device is not really important. Just note that in Pelco P the response is an ACK command, while in Pelco D the response to one of these commands is four bytes long: the first byte is the synchronization character (FF), the second byte is the receiver address, the third byte contains the alarm information, and the fourth byte is the check sum.