Thanks to Netty's excellent design and encapsulation, developing a high-performance network program has become very simple. This article briefly introduces several of Netty's core components by way of a simple server-side implementation, and I hope you find it helpful.

Quickly implement a server

We want to quickly implement a simple master-slave reactor model with Netty: the thread group corresponding to the master reactor accepts connections and its acceptor creates them, while the read and write events of the connected clients are handled by the thread pool corresponding to the slave reactor. Based on this design we write the following code with Netty; as the sketch below shows, it does little more than configure the two reactor thread groups, choose the channel type, and assemble the handler pipeline for each accepted connection.
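The original listing is not reproduced in this extract, so the following is a minimal sketch of what such a bootstrap might look like. The handler names InboundHandlerA, InboundHandlerB and ServerHandler follow the article; the port number and log messages are illustrative.

```java
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.ChannelFutureListener;
import io.netty.channel.ChannelInitializer;
import io.netty.channel.EventLoopGroup;
import io.netty.channel.nio.NioEventLoopGroup;
import io.netty.channel.socket.SocketChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;

public class NettyServer {
    public static void main(String[] args) {
        // Master reactor: accepts new connections.
        EventLoopGroup bossGroup = new NioEventLoopGroup(1);
        // Slave reactor: handles read/write events of the accepted channels.
        EventLoopGroup workerGroup = new NioEventLoopGroup();

        ServerBootstrap bootstrap = new ServerBootstrap();
        bootstrap.group(bossGroup, workerGroup)
                .channel(NioServerSocketChannel.class)
                .childHandler(new ChannelInitializer<SocketChannel>() {
                    @Override
                    protected void initChannel(SocketChannel ch) {
                        // Inbound handlers fire in the order added: A -> B -> ServerHandler.
                        // (The article's outbound handlers are omitted from this sketch.)
                        ch.pipeline()
                          .addLast(new InboundHandlerA())
                          .addLast(new InboundHandlerB())
                          .addLast(new ServerHandler());
                    }
                });

        // Bind asynchronously and observe the result through a listener.
        bootstrap.bind(8080).addListener((ChannelFutureListener) future -> {
            if (future.isSuccess()) {
                System.out.println("server started on port 8080");
            } else {
                System.err.println("server failed to start");
                future.cause().printStackTrace();
            }
        });
        // Graceful shutdown of the two groups is omitted for brevity.
    }
}
```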
Finally, we call bind to start the server and listen for the bind result asynchronously through addListener. For the data sent by the client, we add sequential processing through ChannelInboundHandlerAdapter; as shown in the code, the execution order is InboundHandlerA -> InboundHandlerB -> ServerHandler. Below are sketches of InboundHandlerA and ServerHandler (InboundHandlerB is essentially identical to InboundHandlerA, so it is not shown).
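The original handler listings are not included in this extract; the following are minimal sketches of what they might look like, with class names taken from the article and log messages illustrative.

```java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

// Sketch of InboundHandlerA: it logs the read event and passes the message on.
// InboundHandlerB is assumed to be identical apart from its name.
public class InboundHandlerA extends ChannelInboundHandlerAdapter {
    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        System.out.println("InboundHandlerA.channelRead");
        // Propagate the event so that InboundHandlerB and ServerHandler also see it.
        ctx.fireChannelRead(msg);
    }
}
```

And a sketch of ServerHandler, which parses the client's data and writes the reply:

```java
import io.netty.buffer.ByteBuf;
import io.netty.buffer.Unpooled;
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;
import io.netty.util.CharsetUtil;

// Sketch of ServerHandler: reads the client's bytes and replies with "Hello Netty client".
public class ServerHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        // Callback invoked by Netty once the connection is established
        // (see the section on callbacks below).
        System.out.println("ServerHandler.channelActive");
    }

    @Override
    public void channelRead(ChannelHandlerContext ctx, Object msg) {
        ByteBuf in = (ByteBuf) msg;
        try {
            System.out.println("received: " + in.toString(CharsetUtil.UTF_8));
        } finally {
            in.release();
        }
        // The reply travels back through the outbound handlers,
        // in the reverse of the order in which they were added.
        ctx.writeAndFlush(Unpooled.copiedBuffer("Hello Netty client", CharsetUtil.UTF_8));
    }
}
```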
ServerHandler parses the client's data and replies with "Hello Netty client". We tested this with telnet, and the server output was consistent with what we described above.
When we then send the message 1, we can see that the channelRead methods of all inbound handlers are triggered in order; when we reply hello netty client, the outbound handlers are triggered in the reverse of the order in which they were added.

Detailed explanation of the core components in Netty

Channel interface

Channel is Netty's encapsulation of primitives such as bind, connect, read, and write on the underlying socket, which greatly simplifies network programming. Netty also provides a variety of ready-made Channel implementations that we can pick according to our needs; the TCP and UDP channels I use most often are listed in the sketch below.
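The article's original list of channel implementations is not preserved in this extract; the snippet below shows a few common NIO channel classes and where each one is typically used, which is my own hedged reconstruction rather than the author's exact list.

```java
import io.netty.bootstrap.Bootstrap;
import io.netty.bootstrap.ServerBootstrap;
import io.netty.channel.socket.nio.NioDatagramChannel;
import io.netty.channel.socket.nio.NioServerSocketChannel;
import io.netty.channel.socket.nio.NioSocketChannel;

public class ChannelChoices {
    static void examples() {
        // TCP server: NioServerSocketChannel listens for and accepts connections.
        ServerBootstrap tcpServer = new ServerBootstrap();
        tcpServer.channel(NioServerSocketChannel.class);

        // TCP client (the accepted connections on the server side are also NioSocketChannels).
        Bootstrap tcpClient = new Bootstrap();
        tcpClient.channel(NioSocketChannel.class);

        // UDP: NioDatagramChannel.
        Bootstrap udp = new Bootstrap();
        udp.channel(NioDatagramChannel.class);
    }
}
```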
EventLoop interface

In Netty, every channel is registered to an EventLoop. Each EventLoopGroup contains one or more EventLoops, and each EventLoop is bound to a single thread that is responsible for processing the events of one or more channels. Here we also take a brief look at the run method of NioEventLoop, which inherits from SingleThreadEventExecutor. Roughly speaking, its core logic is to poll all the channels (socket abstractions) registered to this NioEventLoop for ready events, dispatch those events to the corresponding handlers, and then run the tasks queued on the event loop.
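The actual run method quoted in the original article is long; the class below is only a heavily simplified, illustrative paraphrase of its core loop, not the real Netty source.

```java
// Heavily simplified paraphrase of the core loop in NioEventLoop.run()
// (illustrative only; the real method handles wakeups, timeouts, and I/O ratios).
final class EventLoopSketch {
    void run() {
        for (;;) {
            select();               // 1. poll the Selector for channels with ready I/O events
            processSelectedKeys();  // 2. dispatch ready events (accept/connect/read/write) to handlers
            runAllTasks();          // 3. run tasks queued on this event loop's task queue
        }
    }

    private void select()              { /* Selector.select(...) in the real code */ }
    private void processSelectedKeys() { /* hand each ready key to its channel's pipeline */ }
    private void runAllTasks()         { /* drain and execute the task queue */ }
}
```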
Pipeline, ChannelHandler, and ChannelHandlerContext

Every channel event is handled by a ChannelHandler, and the ChannelHandlers responsible for the same channel are connected into a logical chain by a ChannelPipeline. The relationship between the two is encapsulated in a ChannelHandlerContext, which is mainly responsible for the interaction between the current ChannelHandler and the other ChannelHandlers on the same ChannelPipeline. For example, when we receive data written by the client, the data is processed by the ChannelHandlers on the pipeline: after the first ChannelHandler finishes, its ChannelHandlerContext forwards the message to the next ChannelHandler on the pipeline. Suppose our handler has just executed channelActive; if we want the event to continue propagating, we call fireChannelActive, as in the sketch below.
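A minimal sketch of that pattern; the handler name and log message are hypothetical.

```java
import io.netty.channel.ChannelHandlerContext;
import io.netty.channel.ChannelInboundHandlerAdapter;

// Illustrative handler: after doing its own work in channelActive,
// it calls fireChannelActive so the event keeps propagating along the pipeline.
public class PropagatingHandler extends ChannelInboundHandlerAdapter {
    @Override
    public void channelActive(ChannelHandlerContext ctx) {
        System.out.println("channelActive handled here");
        // Without this call the event stops at this handler.
        ctx.fireChannelActive();
    }
}
```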
Looking at its internal logic, we can see that it obtains the next inbound ChannelHandler on the pipeline through AbstractChannelHandlerContext and executes its channelActive method, as the paraphrase below illustrates.
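The excerpt quoted in the original article is not preserved here; the following is a simplified paraphrase of the idea inside AbstractChannelHandlerContext (method names exist in Netty, but the bodies are abridged and should not be read as the actual source).

```java
// Simplified paraphrase of AbstractChannelHandlerContext.fireChannelActive():
public ChannelHandlerContext fireChannelActive() {
    // Find the next inbound context on the pipeline and invoke its channelActive.
    AbstractChannelHandlerContext next = findContextInbound();
    invokeChannelActive(next);
    return this;
}

static void invokeChannelActive(AbstractChannelHandlerContext next) {
    EventExecutor executor = next.executor();
    if (executor.inEventLoop()) {
        next.invokeChannelActive();               // ends up calling handler().channelActive(next)
    } else {
        executor.execute(next::invokeChannelActive); // hop onto the channel's event loop first
    }
}
```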
The idea of callbacks

Callback is really a design idea. Netty performs connection and read/write operations asynchronously and without blocking, yet we usually want to run some logic of our own when, for example, a connection is established. Netty therefore exposes a callback method at that point for users to implement their personalized logic. Concretely, when our channel's connection is established, the underlying layer calls invokeChannelActive, which obtains the bound ChannelInboundHandler and executes its channelActive method; as a result, the channelActive method of our server-side ServerHandler is called.

Asynchronous monitoring with Future

To keep the server efficient, most of Netty's network I/O operations are asynchronous. Take the listener we registered for the bind result as an example: bind immediately returns a future (Netty's ChannelFuture, which extends java.util.concurrent.Future), and the future passed to our listener lets us check whether the operation ultimately succeeded. Stepping into addListener of DefaultPromise, we find that it records the listener and checks whether the asynchronous task is already complete; if it is, notifyListeners is called to invoke our listener's logic, roughly as in the paraphrase below.
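A simplified paraphrase of that logic (argument checking and the other overloads of the real DefaultPromise are omitted):

```java
// Simplified paraphrase of DefaultPromise.addListener (not the actual source):
@Override
public Promise<V> addListener(GenericFutureListener<? extends Future<? super V>> listener) {
    synchronized (this) {
        addListener0(listener);   // remember the listener
    }
    if (isDone()) {
        // The asynchronous task has already completed:
        // call back every registered listener right away.
        notifyListeners();
    }
    return this;
}
```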