Prostate particle implantation path visualization method and system based on augmented reality

Document No.: 1029506 · Publication date: 2020-10-30

Reading note: This technology, "Prostate particle implantation path visualization method and system based on augmented reality", was designed and created by Zhang Yongde, Yang Jianzhi, and Zuo Sihao on 2020-07-10. Its main content is as follows: The invention relates to the field of medical navigation and discloses an augmented reality-based prostate particle implantation path visualization method and system, comprising the following steps: constructing a virtual two-dimensional section according to preoperative data; completing instrument calibration; setting up the system and acquiring the external parameters of the camera; establishing the relationship between the patient and the two-dimensional section, and extracting the two-dimensional section; shooting the surgical area scene in real time and extracting the markers in the image; completing virtual-real registration of the two-dimensional section and the image according to the markers; projecting the registered two-dimensional section onto the body surface of the patient's operation area; and detecting the position of the particle implanter and projecting light spots to complete the visualization. The invention visualizes the particle implantation path on the coronal plane of the patient's operation area and projects the visualized image onto the patient's body surface, which preserves the doctor's hand-eye consistency, prevents the implantation process from depending excessively on the doctor's personal experience and imagination, and improves the success rate of the operation.

1. An augmented reality-based prostate particle implantation path visualization method, comprising:

constructing a virtual two-dimensional section according to preoperative data;

completing instrument calibration;

setting a system, and acquiring external parameters of a camera;

establishing the relationship between the patient and the two-dimensional tangent plane, and extracting the two-dimensional tangent plane;

shooting a surgical area scene in real time, and extracting markers in the image;

completing virtual-real registration of the two-dimensional section and the image according to the marker;

projecting the registered two-dimensional tangent plane to the body surface of the operation area of the patient;

and detecting the position of the particle implanter, and projecting a light spot to complete visualization.

2. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the preoperative data includes:

the preoperative data are a series of CT or MRI images of the organ tissues in the patient's operation area; the images include markers adhered to the anterior body surface of the operation area, where the operation area refers to the part of the patient's body from the perineum to the diaphragm.

3. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the virtual two-dimensional slice comprises:

the virtual two-dimensional sections are a series of coronal-plane sections of the organ tissues in the patient's operation area. On each section at a different height in the operation area, an identical virtual marker is drawn according to the shape and position of the body surface marker, and the corresponding particle implantation target points and implantation paths are drawn on the sections at different heights in combination with the preoperative planning scheme. The division into different heights follows the coronal planes at the different positions where the series of particle implantation target points lie in the preoperative plan.

4. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the instrument calibration comprises:

calibrating a binocular camera, and acquiring internal parameters and distortion parameters of the binocular camera by using Zhang's calibration method;

calibrating the particle implanter, namely unifying needle point coordinates of an outer needle of the particle implanter in an electromagnetic positioning coordinate system by utilizing electromagnetic positioning;

calibrating the light spot projector, namely unifying the center coordinates of the light spot projector in an electromagnetic positioning coordinate system by utilizing electromagnetic positioning;

calibrating the projector: because the projector is attached to the light spot projector, the coordinates of the projector's center in the electromagnetic positioning coordinate system can be obtained indirectly through the calibration of the light spot projector.

5. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the establishing the relationship between the patient and the two-dimensional slice comprises:

a doctor moves the particle implanter to the first radioactive particle implantation position in front of the guide plate according to the preoperative planning scheme, and the real world coordinates of the needle point of the implanter's outer needle are detected. The real world coordinate information of the needle point is thereby associated with the virtual two-dimensional section corresponding to the first implantation target point. During the subsequent operation, the corresponding virtual two-dimensional section can be obtained by detecting the change in the vertical distance of the needle point between coronal planes at different heights.

6. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the surgical field scene comprises:

the range of the operation area scene should cover the body surface of the operation area on the front side of the patient and the chessboard pattern calibration plate used for calibration.

7. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the virtual-real registration comprises:

Before virtual-real registration, a quaternion method is used for solving a conversion relation between a virtual world coordinate system and a real world coordinate system according to real world coordinates of feature points in the operation area scene image and virtual world coordinates of feature points in a two-dimensional tangent plane;

virtual-real registration is achieved once the two-dimensional section has been successively converted from the virtual world coordinate system to the real world coordinate system, then to the camera coordinate system, and finally to the imaging coordinate system.

8. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the projecting onto the surface of the patient's operative region comprises:

and the host controls the projector to finish projection according to a coordinate system conversion result obtained by the two-dimensional section in virtual-real registration.

9. The augmented reality-based prostate particle implantation path visualization method according to claim 1, wherein the projection light points complete visualization, comprising:

each light spot is projected by a laser diode in the light spot projector, and a series of light spots form a traveling path of the needle point of the outer needle of the particle implanter on the coronal plane.

10. An augmented reality based prostate particle implantation path visualization system, comprising:

the light spot projector is used for projecting light spots onto the body surface of the front operation area of the patient and using a series of light spots to express the real target points and paths of the implantation process of the prostate particles;

an electromagnetic transmitter for transmitting electromagnetic waves;

the binocular camera is used for acquiring images of the surgical field scene in real time;

the electromagnetic receiver is used for receiving the electromagnetic waves transmitted by the electromagnetic transmitter and then generating currents with corresponding magnitudes and phases;

a particle implanter for implanting radioactive particles (seeds) into a lesion of the prostate;

the guide plate is used for assisting and guiding a doctor to insert an outer needle of the particle implanter;

the sickbed is used for fixing the lithotomy position posture of the patient;

the electromagnetic positioner is used for connecting the electromagnetic receivers, analyzing the magnitude and the phase of current generated by the electromagnetic receivers and determining the position relationship between each electromagnetic receiver and each electromagnetic transmitter;

the checkerboard calibration board is used for assisting in obtaining the internal parameters and distortion parameters of the binocular camera before the operation, and for assisting in obtaining the external parameters of the binocular camera during the operation;

the computer display screen is used for browsing preoperative data and a virtual two-dimensional section;

the projector is used for projecting a virtual two-dimensional section onto the body surface of the front operation area of the patient;

The host computer is used for processing preoperative data, drawing and storing a virtual two-dimensional section, receiving, identifying and processing an intraoperative scene in real time, completing virtual-real registration, receiving and analyzing coordinate data determined by electromagnetic positioning, controlling the projector to complete image projection and controlling the light spot projector to complete light spot projection.

Technical Field

The invention relates to the field of medical navigation, in particular to a prostate particle implantation path visualization method and system based on augmented reality.

Background

Relevant data show that the number of prostate cancer patients in China has been increasing rapidly in recent years, so effective treatment of prostate cancer faces increasingly serious challenges. In the currently common clinical treatment, a doctor places an ultrasonic probe into the rectum, manually adjusts the particle implanter with the assistance of an ultrasound image and a guide plate, and implants radioactive particles into the target. Because the patient lies on the bed in the lithotomy position during treatment, the doctor can only obtain image information on the transverse and sagittal planes of the operation area by switching the working mode of the ultrasonic probe, and cannot obtain image information on the coronal plane. The doctor therefore cannot know the implantation target points and implantation paths on the coronal plane during prostate brachytherapy, which affects surgical precision, makes the particle implantation process depend excessively on the doctor's personal experience and imagination, increases the doctor's decision-making pressure, and easily causes fatigue.

To solve the above problems, the prior art proposes using virtual reality to present a three-dimensional virtual image of the patient's operation area to help the doctor achieve more accurate particle implantation. However, virtual reality requires the doctor to spend extra time before the operation constructing a realistic three-dimensional virtual image; the image is displayed in a non-intuitive, non-in-situ manner; and while observing the ultrasound image and the patient's operation area, the doctor must also switch visual fields to watch the display screen showing the three-dimensional virtual image.

Disclosure of Invention

To overcome the defects of the prior art, the invention provides an augmented reality-based prostate particle implantation path visualization method and system. Using augmented reality technology, the system provides the doctor with visual information about the organ tissues and the particle implantation path on the coronal plane of the patient's operation area, so the doctor can acquire more information during the operation. The visualization also preserves the doctor's hand-eye consistency, reduces the doctor's pressure, and improves the success rate of the operation.

To achieve this purpose, the technical scheme of the invention is realized as follows.

An augmented reality-based prostate particle implantation path visualization method, comprising:

constructing a virtual two-dimensional section according to preoperative data;

completing instrument calibration;

setting a system, and acquiring external parameters of a camera;

establishing the relationship between the patient and the two-dimensional tangent plane, and extracting the two-dimensional tangent plane;

shooting a surgical area scene in real time, and extracting markers in the image;

completing virtual-real registration of the two-dimensional section and the image according to the marker;

projecting the registered two-dimensional tangent plane to the body surface of the operation area of the patient;

and detecting the position of the particle implanter, and projecting a light spot to complete visualization.

Preferably, the preoperative data includes:

the preoperative data are a series of CT or MRI images of the organ tissues in the patient's operation area; the images include markers adhered to the anterior body surface of the operation area, where the operation area refers to the part of the patient's body from the perineum to the diaphragm.

Preferably, the virtual two-dimensional section includes:

the virtual two-dimensional sections are a series of coronal-plane sections of the organ tissues in the patient's operation area. On each section at a different height in the operation area, an identical virtual marker is drawn according to the shape and position of the body surface marker, and the corresponding particle implantation target points and implantation paths are drawn on the sections at different heights in combination with the preoperative planning scheme. The division into different heights is preferably performed according to the coronal planes at the different positions where the series of particle implantation target points lie in the preoperative plan.

Preferably, the instrument calibration comprises:

calibrating a binocular camera, and acquiring internal parameters and distortion parameters of the binocular camera by using Zhang's calibration method;

calibrating the particle implanter, namely unifying needle point coordinates of an outer needle of the particle implanter in an electromagnetic positioning coordinate system by utilizing electromagnetic positioning;

calibrating the light spot projector, namely unifying the center coordinates of the light spot projector in an electromagnetic positioning coordinate system by utilizing electromagnetic positioning;

calibrating the projector: because the projector is attached to the light spot projector, the coordinates of the projector's center in the electromagnetic positioning coordinate system can be obtained indirectly through the calibration of the light spot projector.

Preferably, the establishing of the relationship between the patient and the two-dimensional section comprises:

a doctor moves the particle implanter to the first radioactive particle implantation position in front of the guide plate according to the preoperative planning scheme, and the real world coordinates of the needle point of the implanter's outer needle are detected. The real world coordinate information of the needle point is thereby associated with the virtual two-dimensional section corresponding to the first implantation target point. During the subsequent operation, the corresponding virtual two-dimensional section can be obtained by detecting the change in the vertical distance of the needle point between coronal planes at different heights.

Preferably, the surgical field scene includes:

the range of the operation area scene should cover the body surface of the operation area on the front side of the patient and the chessboard pattern calibration plate used for calibration.

Preferably, the virtual-real registration comprises:

before virtual-real registration, a quaternion method is used for solving a conversion relation between a virtual world coordinate system and a real world coordinate system according to real world coordinates of feature points in the operation area scene image and virtual world coordinates of feature points in a two-dimensional tangent plane;

the two-dimensional section achieves virtual-real registration once it has been successively converted from the virtual world coordinate system to the real world coordinate system, then to the camera coordinate system, and finally to the imaging coordinate system.

Preferably, the projection onto the body surface of the surgical field of the patient comprises:

and the host controls the projector to finish projection according to a coordinate system conversion result obtained by the two-dimensional section in virtual-real registration.

Preferably, the projected light spot performs visualization, including:

each light spot is projected by a laser diode in the light spot projector, and a series of light spots form a traveling path of the needle point of the outer needle of the particle implanter on the coronal plane.

An augmented reality based prostate particle implantation path visualization system, comprising:

the light spot projector is used for projecting light spots onto the body surface of the front operation area of the patient and using a series of light spots to express the real target points and paths of the implantation process of the prostate particles;

An electromagnetic transmitter for transmitting electromagnetic waves;

the binocular camera is used for acquiring images of the surgical field scene in real time;

the electromagnetic receiver is used for receiving the electromagnetic waves transmitted by the electromagnetic transmitter and then generating currents with corresponding magnitudes and phases;

a particle implanter for implanting radioactive particles (seeds) into a lesion of the prostate;

the guide plate is used for assisting and guiding a doctor to insert an outer needle of the particle implanter;

the sickbed is used for fixing the lithotomy position posture of the patient;

the electromagnetic positioner is used for connecting the electromagnetic receivers, analyzing the magnitude and the phase of current generated by the electromagnetic receivers and determining the position relationship between each electromagnetic receiver and each electromagnetic transmitter;

the checkerboard calibration board is used for assisting in obtaining the internal parameters and distortion parameters of the binocular camera before the operation, and for assisting in obtaining the external parameters of the binocular camera during the operation;

the computer display screen is used for browsing preoperative data and a virtual two-dimensional section;

the projector is used for projecting a virtual two-dimensional section onto the body surface of the front operation area of the patient;

the host computer is used for processing preoperative data, drawing and storing a virtual two-dimensional section, receiving, identifying and processing an intraoperative scene in real time, completing virtual-real registration, receiving and analyzing coordinate data determined by electromagnetic positioning, controlling the projector to complete image projection and controlling the light spot projector to complete light spot projection.

The invention provides an augmented reality-based prostate particle implantation path visualization method and system. A virtual two-dimensional section is constructed from preoperative data and instrument calibration is carried out, completing the preoperative preparation of the visualization system. During the operation, the system is started to acquire the external parameters of the camera; the relationship between the patient and the two-dimensional section is established and the two-dimensional section is extracted; the surgical area scene is shot and the markers in the image are extracted; virtual-real registration of the two-dimensional section and the image is completed according to the markers; the registered two-dimensional section is projected onto the body surface of the patient's operation area; finally, the position of the particle implanter is detected and light spots are projected to complete the visualization. The doctor can thus see the organ tissues and the particle implantation path on the coronal plane of the patient's operation area during the operation, avoiding the loss of information that would otherwise affect surgical precision. Because the visual information is projected directly onto the body surface of the patient's frontal operation area using augmented reality technology, it remains intuitive and in situ, the operation depends less on the doctor's personal experience and imagination, the doctor's mental stress is relieved, and the success rate of the operation is improved.

Drawings

For ease of illustration, the invention is described in detail below with reference to the detailed description and the accompanying drawings.

FIG. 1 is a schematic flow chart of the visualization method of the present invention.

Fig. 2 is a schematic diagram of the visualization system of the present invention.

Fig. 3 is a flow chart of the preoperative preparation of the present invention.

Fig. 4 is a flow chart for visualizing the intraoperative particle implantation path of the present invention.

Fig. 5 is a flow chart of virtual-real registration of the present invention.

Detailed Description

In order to make the technical solutions of the embodiments of the present invention better understood and make the above objects, features and advantages of the present invention more comprehensible, it is described in detail below with reference to the accompanying drawings in the embodiments of the present invention. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.

As shown in fig. 1, it is a schematic flow chart of the visualization method of the present invention, which specifically includes:

step S101, constructing a virtual two-dimensional section according to preoperative data;

Step S102, completing instrument calibration;

step S103, setting a system, and acquiring external parameters of the camera;

step S104, establishing the relationship between the position of the patient and the two-dimensional section;

step S105, extracting a two-dimensional section according to the relation and the real-time position of the particle implanter;

step S106, shooting a surgical area scene in real time, and extracting a marker in the image;

step S107, completing virtual-real registration of the two-dimensional section and the image according to the marker;

step S108, projecting the two-dimensional section after registration to the body surface of the operation area of the patient;

step S109, detecting the position of the particle implanter, and projecting a light spot to complete visualization.

In the embodiment of the present invention, the prostate particle implantation path visualization workflow can be divided into two stages: a preoperative preparation stage and an intraoperative navigation stage. Steps S101 and S102 belong to the preoperative preparation stage; the remaining steps belong to the intraoperative navigation stage.

In step S101, the acquired preoperative data are CT or MRI images, i.e., tomographic data; the virtual two-dimensional sections constructed from them therefore inherit the same uniform spacing as the tomographic slices.

Further, the instrument calibration to be completed in step S102 has two main parts: completing the calibration and correction of the binocular camera to acquire its internal parameters and distortion parameters; and unifying the coordinate information of the needle point of the particle implanter's outer needle, the light spot projector, and the projector under the electromagnetic positioning coordinate system.
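The internal parameters recovered by a Zhang-style checkerboard calibration are typically a camera matrix K and distortion coefficients, which together map 3-D points in the camera frame to pixels. A minimal sketch of the resulting pinhole model with two radial distortion terms is shown below; the numeric values of K, k1, and k2 are illustrative placeholders, not values from the patent.

```python
import numpy as np

# Placeholder intrinsics and radial distortion coefficients of the kind a
# Zhang-style checkerboard calibration would return (illustrative values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
k1, k2 = -0.12, 0.03

def project(point_cam):
    """Project a 3-D point in camera coordinates to distorted pixel coordinates."""
    x, y = point_cam[0] / point_cam[2], point_cam[1] / point_cam[2]
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2   # radial distortion factor
    xd, yd = x * d, y * d
    u = K[0, 0] * xd + K[0, 2]
    v = K[1, 1] * yd + K[1, 2]
    return u, v
```

A point on the optical axis lands at the principal point (320, 240); off-axis points are pulled inward here because k1 is negative (barrel distortion).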

Further, the system setup in step S103 mainly sets the detection range and the working range of the light spot projector, and determines the positional relationship between the binocular camera and the patient by calculating the extrinsic parameters of the binocular camera.

Further, the relationship between the patient position and the two-dimensional section in step S104 is established by relating the position information of the patient's first implantation target during the operation to the one two-dimensional section in the repository that carries the same implantation target information. Once this relationship is established, that section serves as a reference section: after the needle point of the implanter's outer needle moves a certain distance in the real world, the two-dimensional section matching that distance relative to the reference section is extracted from the repository.
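The reference-section lookup described above can be sketched as a small helper that maps the tracked needle-tip depth to the index of the nearest stored coronal section. This is an illustrative sketch assuming a uniformly spaced slice stack; the function name and parameters are not from the patent.

```python
def slice_index(tip_z, ref_tip_z, ref_index, spacing, n_slices):
    """Map the needle-tip depth to the stored coronal section nearest to it.

    ref_tip_z / ref_index: tip depth and section index at the first
    implantation target (the reference established in step S104).
    spacing: coronal-plane spacing of the preoperative slice stack
    (assumed uniform, matching the tomographic slice interval).
    """
    offset = round((tip_z - ref_tip_z) / spacing)
    # Clamp to the valid index range of the repository.
    return min(max(ref_index + offset, 0), n_slices - 1)
```

For example, with a 1 mm spacing, a tip 2 mm deeper than the reference selects the section two indices past the reference one.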

Further, in step S105, the two-dimensional slice needs to be extracted from the storage library of the host.

Further, in step S106, the binocular camera shoots the surgical area scene in real time and transmits the images to the host, which identifies and extracts the markers in the image.

Further, in step S107, a quaternion algorithm is used to solve the conversion of the two-dimensional section from the virtual world coordinate system to the real world coordinate system; the section is then converted from the real world coordinate system to the camera coordinate system, and from the camera coordinate system to the pixel coordinate system, thereby completing the virtual-real registration.
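The quaternion solve for the virtual-to-real rigid transform can be sketched with Horn's closed-form absolute-orientation method, a standard quaternion-based algorithm for aligning corresponding point sets; the patent does not specify which quaternion variant is used, so this is an assumption.

```python
import numpy as np

def horn_quaternion_registration(P, Q):
    """Estimate R, t such that Q ~= R @ P + t (Horn's quaternion method).

    P, Q: (3, n) arrays of corresponding points, e.g. virtual marker
    coordinates and their measured real world coordinates.
    """
    p_bar = P.mean(axis=1, keepdims=True)
    q_bar = Q.mean(axis=1, keepdims=True)
    S = (P - p_bar) @ (Q - q_bar).T            # 3x3 cross-covariance
    Sxx, Sxy, Sxz = S[0]
    Syx, Syy, Syz = S[1]
    Szx, Szy, Szz = S[2]
    # Symmetric 4x4 matrix whose top eigenvector is the optimal quaternion.
    N = np.array([
        [Sxx + Syy + Szz, Syz - Szy,         Szx - Sxz,         Sxy - Syx],
        [Syz - Szy,       Sxx - Syy - Szz,   Sxy + Syx,         Szx + Sxz],
        [Szx - Sxz,       Sxy + Syx,        -Sxx + Syy - Szz,   Syz + Szy],
        [Sxy - Syx,       Szx + Sxz,         Syz + Szy,        -Sxx - Syy + Szz],
    ])
    _, v = np.linalg.eigh(N)
    w, x, y, z = v[:, -1]                      # quaternion of largest eigenvalue
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    t = q_bar - R @ p_bar
    return R, t
```

The recovered R and t give the virtual-to-real step of the chain; the remaining real-to-camera and camera-to-pixel steps use the camera extrinsics and intrinsics respectively.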

Further, in step S108, the projection of the two-dimensional slice is performed by the projector, and the projection range and the projection angle of the projector are controlled by the host according to the information obtained after the virtual-real registration.

Further, in step S109, highly focused laser light is used to present the path and target information of the particle implantation.

As shown in fig. 2, it is a schematic diagram of the visualization system of the present invention, and specifically includes:

the light spot projector 201 is used for projecting light spots onto the body surface of the front operation area of the patient, and a series of light spots are used for representing the real target points and paths of the implantation process of the prostate particles;

an electromagnetic transmitter 202 for transmitting electromagnetic waves;

the binocular camera 203 is used for acquiring images of the surgical field scene in real time;

the electromagnetic receiver 204 is used for receiving the electromagnetic waves emitted by the electromagnetic transmitter 202 and then generating a current of corresponding magnitude and phase, thereby assisting the system in tracking and positioning the particle implanter 205;

a particle implanter 205 for implanting radioactive particles (seeds) into a lesion of the prostate;

a guide plate 206 for assisting and guiding a doctor to insert an outer needle of the particle implanter 205;

a patient bed 207 for fixing the posture of the patient in the lithotomy position;

The checkerboard calibration board 208 is used for assisting in acquiring internal parameters and distortion parameters of the binocular camera 203 before operation and acquiring external parameters of the binocular camera 203 during operation;

the host computer 209 is used for processing preoperative data, drawing and storing a virtual two-dimensional section, receiving, processing and identifying the intraoperative scene in real time, completing virtual-real registration of the virtual two-dimensional section and the intraoperative scene, receiving and analyzing the position relationships determined by the electromagnetic positioner 210, controlling the projector 212 to complete augmented projection, and controlling the light spot projector 201 to complete light spot projection;

the electromagnetic positioner 210 is used for connecting the electromagnetic receiver 204 and the electromagnetic receiver 213, analyzing the magnitude and the phase of the current generated by the electromagnetic receiver 204 and the electromagnetic receiver 213 and determining the position relationship among the electromagnetic receiver 204, the electromagnetic receiver 213 and the electromagnetic transmitter 202;

a computer display screen 211 for browsing preoperative data and virtual two-dimensional sections;

a projector 212 for projecting a virtual two-dimensional section onto a body surface of the operative region on the front of the patient;

and the electromagnetic receiver 213 is used for receiving the electromagnetic waves emitted by the electromagnetic transmitter 202 and then generating a current of corresponding magnitude and phase, thereby assisting the system in tracking and positioning the light spot projector 201 and, indirectly, the projector 212.

In the embodiment of the invention, the light spot projector 201 is a cuboid whose length in the horizontal plane is slightly greater than the patient's body width, so that its projection range can completely cover the body surface of the patient's frontal operation area. Inside the light spot projector 201 are a number of laser diodes arranged uniformly in a rectangular grid. The power of each laser diode is less than 5 mW, so the emitted laser causes no burning sensation to the patient, and the size of each emitted light spot matches the size of a particle implantation target point. During the operation, the light spot projector 201 is suspended directly above the patient's frontal operation area, and its suspension height is adjustable to ensure that the projection range just covers the body surface of the operation area. The light spot projector 201 also carries an electromagnetic receiver 213; after instrument calibration, the change in position of the light spot projector 201 in the real world coordinate system can be determined from the current change generated when the electromagnetic receiver 213 receives electromagnetic waves. In addition, the light spot projector 201 is connected to the host 209 through a VGA interface or a wireless network; during spot projection, the host 209 receives the position information of the particle implanter 205 in real time and switches the laser diodes in the light spot projector 201 on and off according to this information.
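Switching diodes according to the implanter position implies mapping a planned target point on the body surface to a diode in the rectangular grid. A minimal sketch of that mapping is given below; the grid geometry, function name, and parameters are illustrative assumptions, not taken from the patent.

```python
def nearest_diode(target_xy, origin_xy, pitch, rows, cols):
    """Pick the diode in a rows x cols rectangular grid whose spot lies
    closest to a planned target point on the body surface.

    origin_xy: position of diode (0, 0) in the projection plane.
    pitch: (row spacing, column spacing), e.g. in millimetres.
    """
    r = round((target_xy[1] - origin_xy[1]) / pitch[0])
    c = round((target_xy[0] - origin_xy[0]) / pitch[1])
    # Clamp to the physical grid so out-of-range targets light an edge diode.
    r = min(max(r, 0), rows - 1)
    c = min(max(c, 0), cols - 1)
    return r, c
```

The host could call this per target point and switch the returned diode on while extinguishing the rest.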

In surgery, the binocular camera 203 needs to be placed on the left or right side of the patient to acquire images of the intraoperative scene in real time. The height of the binocular camera 203 can be adjusted, the adjusted height should ensure that the binocular camera 203 can completely shoot the scene of the surgical area, and the binocular camera 203 sends the image to the host 209 for processing after acquiring the image of the scene of the surgical area. In addition, an electromagnetic transmitter 202 is fixed above the binocular camera, and in the present system, the electromagnetic transmitter 202 is set as an origin in real world coordinates. Also, after the extrinsic parameters of the binocular camera 203 are acquired intraoperatively, the positions of the binocular camera 203 and the electromagnetic transmitter 202 relative to the patient will not change any further.

The seed implanter 205 is used to implant radioactive seeds into the lesion of the prostate, and all structures of the seed implanter 205 except the outer needle and the inner needle are made of non-metallic materials in order to reduce the influence of metal on the electromagnetic field. The electromagnetic receiver 204 is fixed at the tail end of the particle implanter 205, and after the instrument calibration, the position change of the needle point of the outer needle of the particle implanter 205 in the real world coordinate system can be determined through the current change generated after the electromagnetic wave is received by the electromagnetic receiver 204.

A thigh support is fixed on the patient bed 207 for ensuring the posture of the patient at the lithotomy position, and the guide plate 206 is fixed on the patient bed 207 through a connecting device and is vertically placed in front of the perineum of the patient for helping a doctor to insert the outer needle of the particle implanter 205 through the perineum of the patient to a preset target position.

In addition, a checkerboard calibration plate 208 is fixed on the side of the patient bed 207 opposite the direction in which the binocular camera 203 is placed; the checkerboard calibration plate 208 is a square opaque cardboard painted with alternating black and white squares. The checkerboard calibration plate 208 is detachable: before surgery it may be taken off the patient bed 207 to assist in acquiring the intrinsic and distortion parameters of the binocular camera 203, and during surgery it may be removed from the patient bed 207 once the extrinsic parameters have been acquired.

The electromagnetic positioner 210 is positioned above the host 209 and is connected to the electromagnetic receiver 204, the electromagnetic receiver 213, and the host 209. The current generated by the electromagnetic receiver 204 and the electromagnetic receiver 213 is transmitted to the electromagnetic positioner 210 as input information, the electromagnetic positioner 210 can determine the position relationship among the electromagnetic receiver 204, the electromagnetic receiver 213 and the electromagnetic transmitter 202 by analyzing the magnitude and the phase of the current, then the electromagnetic positioner 210 transmits the position relationship to the host 209 as output information, and the host 209 can know the position relationship among the binocular camera 203, the particle implanter 205, the light spot projector 201 and the projector 212 in the real world coordinate system according to the position information and the instrument calibration result.

The physician may use the computer screen 211 to view preoperative data and complete preoperative planning, then process the preoperative data and render a virtual two-dimensional slice in the software operation interface displayed by the computer screen 211 and store the virtual two-dimensional slice in the repository of the host 209.

Projector 212 is the same size as spot projector 201 and is connected with spot projector 201, so when instrument calibration is performed on spot projector 201 and electromagnetic receiver 213, instrument calibration of projector 212 can be indirectly completed, and tracking and positioning of projector 212 are realized. In addition, the projector 212 is connected to the host 209 through a VGA interface or a wireless network, and when projecting, the host 209 sends a control signal to control the projector 212 to project an image to a specific area.

Further, in addition to performing the above functions, the host 209 uses its CPU unit to carry out virtual-real registration according to the operation program stored in the repository and the camera extrinsic parameters and image features obtained during the operation, so as to obtain the transformation matrix after virtual-real registration.

As shown in fig. 3, it is a preoperative preparation flowchart of the present invention, specifically including:

step S301, calibrating a camera and correcting distortion;

Step S302, acquiring a two-dimensional medical image of a surgical area of a patient;

step S303, preprocessing an image, drawing and rendering the image into a three-dimensional model;

step S304, drawing an implantation target point and a path in the three-dimensional model;

step S305, converting the three-dimensional model into a two-dimensional tangent plane on a coronal plane;

step S306, calibrate the particle implanter, the light spot projector, and the projector.

The embodiment of the invention comprises three independent preoperative preparation contents in total, and these three contents need not be completed in any particular order.

In step S301, Zhang's calibration method is used to obtain the intrinsic parameters and distortion parameters of the binocular camera 203. During calibration and correction, the binocular camera 203 shoots several images of the checkerboard calibration board 208 from different angles; to ensure a stable result, no fewer than eight images should be taken. In addition, after the checkerboard calibration board 208 is placed, one corner point on the board is set as the origin of the real world coordinate system, the four vertex corners of the checkerboard calibration board 208 are identified as feature points, and the coordinate values of these four feature points in the real world coordinate system are measured.

Further, an angular point detection method is used for extracting image coordinates of the feature points in an image coordinate system, then the real coordinates of the feature points are combined, the internal parameters and the external parameters of the binocular camera 203 under the ideal distortion-free condition are solved, and the precision is improved by using maximum likelihood estimation.

Specifically, the intrinsic parameters refer to a focal length of the binocular camera 203, coordinates of an optical center in a pixel coordinate system, a physical length of one pixel in the X direction, and a physical length of one pixel in the Y direction. In addition, the extrinsic parameters are used to represent the position relationship between the binocular camera 203 and the checkerboard calibration board 208 in the real world coordinate system, the extrinsic parameters are calculated from three rotation parameters and three translation parameters, but the extrinsic parameters vary with the position of the binocular camera 203, so the extrinsic parameters obtained in step S301 are only used to optimize the intrinsic parameters and distortion parameters of the binocular camera 203.

Further, a least square method is used for solving radial distortion parameters of the binocular camera 203, then the inner parameters and the outer parameters of the binocular camera 203 are combined, the precision of the inner parameters and the precision of the radial distortion parameters are improved again by using a maximum likelihood method, and therefore calibration and distortion correction of the binocular camera 203 in preoperative preparation are completed.
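The intrinsic parameters listed above define the pinhole projection used throughout the method. As a minimal numerical sketch of that model (all parameter values here are illustrative assumptions, not values from the patent):

```python
import numpy as np

# Illustrative intrinsic parameters (assumed values, not from the patent).
f = 8.0                    # focal length, mm
dp_u, dp_v = 0.01, 0.01    # physical length of one pixel in X and Y, mm
u0, v0 = 320.0, 240.0      # optical centre in pixel coordinates
a_x, a_y = f / dp_u, f / dp_v   # focal length expressed in pixels

# Intrinsic matrix (zero skew assumed for simplicity).
B = np.array([[a_x, 0.0, u0],
              [0.0, a_y, v0],
              [0.0, 0.0, 1.0]])

def project(p_cam):
    """Project a 3-D point in the camera coordinate system to pixel coordinates."""
    uvw = B @ p_cam
    return uvw[:2] / uvw[2]

uv = project(np.array([0.02, -0.01, 1.0]))   # -> [336., 232.]
```

Calibration estimates the entries of this matrix (plus distortion) from the checkerboard images; the projection itself is what the later registration steps invert.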

Step S302 to step S305 belong to the image acquisition and model drawing part. In step S302, the doctor pastes markers on the body surface of the operation area, between the navel and the line connecting the tops of the two groins: more than five and fewer than fifteen black hemispherical opaque plastic markers, each one centimeter in diameter, arranged horizontally in two rows. A series of CT or MRI images of the operation area of the patient is then acquired and stored as the preoperative data in DICOM format.

Specifically, the operation region is a body part of the patient from the perineum to the diaphragm, and furthermore, the sticking position of the marker cannot block the prostate when viewed from the coronal plane.

Further, in step S303, the series of images obtained in step S302 is denoised using image enhancement techniques, and the organ tissues in the images are then segmented using a threshold segmentation method, completing the image preprocessing. The doctor then imports the preprocessed images into the 3D Slicer software and draws a virtual three-dimensional model using a surface rendering method. All organ tissues in the three-dimensional model are rendered in light tones, with different organ tissues in different colors; white must not be used, and the markers in the image are rendered in black.

Further, in step S304, combining the virtual three-dimensional model with the preoperative planning scheme, additional implantation target points and implantation paths are drawn in the 3D Slicer software. The implantation target points and implantation paths are rendered in mutually distinct colors; each implantation target point has a corresponding implantation path, each implantation target point and its corresponding implantation path lie on the same coronal plane, and rendering uses dark colors other than black. The preoperative planning scheme is the series of particle implantation target points and implantation paths determined by the doctor from the preoperative data before the virtual three-dimensional model is rendered.

Specifically, the thickness of the mapped implantation path should be consistent with the outer diameter of the outer needle of the particle implanter 205, while the size of the mapped target implantation point should be consistent with the inner diameter of the outer needle of the particle implanter 205.

Step S305 converts the rendered virtual three-dimensional model rendered in step S304 into a series of two-dimensional slices according to the coronal plane where each implantation target point is located, renders a marker having the same size, color, and position for each two-dimensional slice according to the coordinate information of the marker in the virtual three-dimensional model on the coronal plane, and finally stores the two-dimensional slices in the storage library of the host 209, thereby completing image acquisition and model rendering in the preoperative preparation.

Since the present system sets the electromagnetic transmitter 202 as the origin of the real-world coordinate system, the positions of all electromagnetic receivers in the electromagnetic positioning coordinate system can be considered as the positions in the real-world coordinate system.

Specifically, in step S306, the electromagnetic receiver 204 is fixed at the end of the particle implanter 205, after the electromagnetic transmitter 202 transmits electromagnetic waves outwards, the electromagnetic receiver 204 receives the electromagnetic waves and generates a current with a specific magnitude and phase, the magnitude and phase information of the current varies with the position of the electromagnetic receiver 204, the information is transmitted to the electromagnetic positioner 210 connected to the electromagnetic receiver 204, the electromagnetic positioner 210 can determine the position relationship between the electromagnetic transmitter 202 and the electromagnetic receiver 204 according to the received information, the position relationship is transmitted to the host 209 by the electromagnetic positioner 210, and the host 209 determines the real world coordinates of the needle point outside the particle implanter 205 by combining the received position relationship and the distance relationship between the electromagnetic receiver 204 and the needle point outside the particle implanter 205.

In addition, an electromagnetic receiver 213 is fixed at a corner above the light spot projector 201, the operation principle of the electromagnetic receiver 213 is the same as that of the electromagnetic receiver 204, except that after the positional relationship between the electromagnetic receiver 213 and the electromagnetic transmitter 202 is received by the host 209, the host 209 determines the real world coordinates of the light spot projector 201 by combining the received positional relationship and the distance relationship between the center position of the light spot projector 201 and the electromagnetic receiver 213, and since the projector 212 is also fixedly connected with the light spot projector, the host 209 determines the real world coordinates of the projector 212 by combining the distance relationship between the center position of the light spot projector 201 and the center of the projector 212, so that the calibration of the particle implanter 205, the light spot projector 201 and the projector 212 in the preoperative preparation is completed.

As shown in fig. 4, it is a flow chart for visualizing the implantation path of the particles in the operation of the present invention, which specifically includes:

step S401, setting a detection range and a working range of a light spot projector;

step S402, a doctor controls a particle implanter to obtain world coordinates of a calibration plate;

step S403, calibrating and acquiring camera external parameters, and determining a conversion matrix;

step S404, implanting particles in sequence by a doctor;

Step S405, whether the detection range is entered;

step S406, if yes, associating the two-dimensional tangent plane with the information of the patient, and extracting the tangent plane;

step S407, acquiring a real scene of an operating area;

step S408, processing the image and identifying the marker in the image;

step S409, virtual and real registration;

step S410, projecting the registered two-dimensional tangent plane on the body surface of the operation area of the patient;

step S411, determining whether the projection has been displayed for one second; if yes, returning to step S407;

step S412, whether to enter a working range;

step S413, projecting a light spot to the epidermis of the patient according to the position of the needle point of the outer needle;

step S414, combining a series of light spots into an implantation path of the needle point of the outer needle;

step S415, whether the implantation of the particles is completed, if the implantation of the particles is completed, the visualization of the implantation path of the particles in the operation is finished, otherwise, the step S404 is returned to, and the implantation of the next particles is continued.

In the embodiment of the present invention, the doctor needs to set the detection range and the working range of the light spot projector in step S401. After the visualization system is started in the operation, the doctor can obtain from the host 209 the positional relationship of the binocular camera 203, the particle implanter 205, the light spot projector 201 and the projector 212 in the real world coordinate system. In the software operation interface on the computer display 211, the doctor then sets the region extending from the center of the light spot projector 201 out to its outermost circle of laser diodes as the working range of the light spot projector 201. In addition, the doctor manipulates the needle point of the outer needle of the particle implanter 205 to touch the guide plate 206, thereby indirectly obtaining the real world coordinates of the guide plate 206, and then sets, in the software operation interface, the region extending from the outermost row of laser diodes on the side of the spot projector 201 nearest the guide plate 206 to the guide plate 206 as the detection range of the spot projector 201.

Further, before step S402, the checkerboard calibration plate 208 needs to be fixed on the patient bed 207 on one side of the patient, and then the binocular camera 203 is placed on the ground opposite to the checkerboard, and the height of the binocular camera 203 and the distance between the binocular camera 203 and the patient bed 207 are adjusted to ensure that the binocular camera 203 can completely shoot the surgical field scene within the front body surface including the checkerboard calibration plate 208 and the surgical field of the patient. In addition, the physician needs to attach the same marker to the patient as in the preoperative preparation, and the position of the marker to be attached is also consistent with that in the preoperative preparation.

Further, in step S402, the doctor manipulates the needle point of the outer needle of the particle implanter 205 to touch the four corners of the checkerboard calibration plate 208 in turn, and the real world coordinates of the four corners are determined indirectly from the real world coordinates of the needle point of the outer needle of the particle implanter 205.

Further, in step S403, Zhang's calibration method is applied, in combination with the intrinsic parameters and distortion parameters of the binocular camera 203 obtained in the preoperative preparation, to determine the extrinsic parameters of the binocular camera 203.

Specifically, the extrinsic parameters represent the positional relationship between the binocular camera 203 and the checkerboard calibration plate 208 in the real world coordinate system during the operation, that is, the conversion relationship from the real world coordinate system in which the checkerboard calibration plate 208 is located to the camera coordinate system in which the binocular camera 203 is located. The extrinsic parameters comprise six variables, namely three rotation parameters and three translation parameters, and may be represented by the conversion matrix

$$A = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix}$$

In the conversion matrix, $R$ denotes the $3 \times 3$ rotation parameter matrix. Each entry of $R$ is a direction cosine of the optical axis of the binocular camera 203 relative to the coordinate axes of the real world coordinate system, and $R$ is determined by three rotation parameter variables: $\theta$ denotes the rotation angle of the optical axis relative to the X axis of the real world coordinate system, $\varphi$ denotes the rotation angle relative to the Y axis, and $\omega$ denotes the rotation angle relative to the Z axis. $T$ denotes the translation parameter matrix

$$T = \begin{bmatrix} t_x \\ t_y \\ t_z \end{bmatrix}$$

wherein $t_x$ denotes the translation parameter in the X direction, $t_y$ the translation parameter in the Y direction, and $t_z$ the translation parameter in the Z direction.
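Assembling the conversion matrix $A$ from the three rotation angles and three translation parameters can be sketched as follows. The X–Y–Z composition order of the rotations is an assumption for illustration; the patent does not state the order:

```python
import numpy as np

def extrinsic_matrix(theta, phi, omega, tx, ty, tz):
    """Build the 4x4 conversion matrix A = [[R, T], [0, 1]] from rotation
    angles (radians) about the X, Y, Z axes and translation parameters."""
    cx, sx = np.cos(theta), np.sin(theta)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(omega), np.sin(omega)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx                      # assumed composition order
    A = np.eye(4)
    A[:3, :3] = R
    A[:3, 3] = [tx, ty, tz]
    return A

A = extrinsic_matrix(0.1, -0.2, 0.3, 5.0, -1.0, 2.0)
R = A[:3, :3]
# R is orthonormal (R @ R.T == I, det(R) == 1), as any rotation matrix must be.
```

Whatever the true composition order, the resulting $R$ is a proper rotation, which is the property the later coordinate conversions rely on.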

Furthermore, after the extrinsic parameters are acquired, the positional relationship between the binocular camera 203 and the patient bed 207 in the real world coordinate system will remain unchanged throughout the prostate particle implantation procedure.

Further, in step S404, the physician implants a first radioactive seed according to the preoperative planning scheme.

Further, when the implantation of the particle is started, the electromagnetic positioner 210 of the visualization system performs real-time tracking and positioning on the outer needle point of the particle implanter 205 in step S405, and the host 209 determines whether the outer needle point of the particle implanter 205 enters the detection range of the light spot projector 201.

If the determination result is "yes", in step S406, the host 209 extracts the height information of the outer needle tip of the particle implanter 205 in the real-world coordinate system, associates the height information with a virtual two-dimensional slice stored in the repository of the host 209, and extracts the virtual two-dimensional slice from the repository into the cache of the host 209. Wherein, the virtual two-dimensional section contains the implantation target point and the implantation path which are consistent with the implantation target point and the implantation path of the first radioactive seeds which are implanted by the doctor.

Meanwhile, in the particle implantation process, step S407 acquires an image of the surgical field scene using the binocular camera 203, and transmits the image to the host 209.

Further, in step S408, upon receiving the image, the host 209 sequentially completes the image processing steps of image binarization, connected component analysis, and marker recognition.
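The first two of these steps can be sketched as follows. The threshold value and the use of 4-connectivity are assumptions for illustration; the markers are dark on a lighter body surface, so pixels below the threshold are treated as foreground:

```python
from collections import deque

def binarize(gray, threshold=128):
    """Binarize a grayscale image (list of rows); dark marker pixels
    below the threshold become foreground (1)."""
    return [[1 if px < threshold else 0 for px in row] for row in gray]

def connected_components(binary):
    """Label 4-connected foreground regions; return a list of pixel lists."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    components = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] and not seen[y][x]:
                comp, queue = [], deque([(y, x)])
                seen[y][x] = True
                while queue:
                    cy, cx = queue.popleft()
                    comp.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and binary[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                components.append(comp)
    return components

gray = [[200, 30, 200, 200],
        [200, 30, 200, 40],
        [200, 200, 200, 40]]
comps = connected_components(binarize(gray))
print(len(comps))   # -> 2 marker blobs
```

Marker recognition would then filter the components by size and shape and take each component's centroid as that marker's pixel coordinates for step S502.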

Further, in step S409, the marker in the image and the marker stored in the virtual two-dimensional section of the buffer area are subjected to virtual-real registration to obtain a transformation matrix after the virtual-real registration.

Further, the host 209 controls the projector 212 to project the virtual two-dimensional section onto the body surface of the frontal surgical area of the patient according to the transformation matrix and the real world coordinates of the projector in step S410, thereby completing the augmented reality display of the coronal plane information on the surgical area of the patient.

Further, step S411 determines whether the projection display of the virtual two-dimensional slice exceeds one second, and if the projection display exceeds one second, the method returns to step S407 to re-acquire the image of the surgical field scene.

Specifically, the patient's respiration changes the position of the markers pasted on the body surface, which affects the accuracy of the virtual-real registration; therefore, re-acquiring the image of the intraoperative scene at intervals during the operation improves the registration accuracy and guarantees the visualization effect of the system.

Further, in step S412, as the doctor continues the prostate particle implantation operation, the needle point of the outer needle of the particle implanter 205 is inserted into the human body from the perineum of the patient, and the host 209 determines whether the needle point of the outer needle of the particle implanter 205 enters the working range of the spot projector 201 according to the position information received from the electromagnetic positioner 210.

If the determination result is "yes", in step S413, the host 209 controls the laser diode of the spot projector 201 to emit light according to the detected real world coordinates of the needle point of the outer needle of the particle implanter 205. Wherein the laser diode is positioned on the intersecting line of the cross section where the needle tip of the outer needle of the particle implanter 205 is positioned and the sagittal plane.

Specifically, while the outer needle point of the particle implanter 205 operates within the working range of the light spot projector 201, the host 209 determines from the position information transmitted by the electromagnetic positioner 210 whether the outer needle point is at that moment advancing or retreating, that is, entering or exiting the human body. When the outer needle point is advancing, the host 209 controls the corresponding laser diode to emit light according to the position information of the outer needle point, and when the outer needle point reaches the next advanced position, the previously lit laser diodes remain on. When the outer needle point is retreating, the host 209 detects its position in real time and turns off the laser diodes corresponding to the portion of the path behind the current position. The laser diodes that remain lit project their laser onto the frontal body surface of the operation area of the patient.
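This on/off logic — diodes stay lit along the advance path and are extinguished on retreat — can be sketched as a small state update. The diode indices along the insertion axis are hypothetical; the sketch only illustrates the rule described above:

```python
def update_lit_diodes(lit, prev_index, new_index):
    """Update the set of lit diode indices along the insertion axis.
    Advancing (index grows) lights every diode up to the tip; retreating
    turns off the diodes beyond the current tip position."""
    if new_index >= prev_index:                      # needle advancing
        lit.update(range(prev_index, new_index + 1))
    else:                                            # needle retreating
        for i in range(new_index + 1, prev_index + 1):
            lit.discard(i)
    return lit

lit = set()
pos = 0
for nxt in [0, 1, 2, 3, 2]:       # advance to diode 3, then retreat to 2
    lit = update_lit_diodes(lit, pos, nxt)
    pos = nxt
print(sorted(lit))                 # -> [0, 1, 2]
```

The lit set at any moment is exactly the visualized path from the entry point to the current needle-tip position.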

Further, the series of light points generated in step S413 forms a continuous light ray projected onto the body surface of the frontal operation area of the patient. In step S414, this light ray represents the implantation path of the outer needle tip of the particle implanter 205 on the coronal plane, and the end of the ray farthest from the guide plate 206 represents the current position of the outer needle tip on the coronal plane. Together with the implantation target point on the projected virtual two-dimensional tangent plane, this visualized position information assists the physician in judging whether the outer needle tip has reached the predetermined target position.

Further, in step S415, the visualization system will determine whether the implantation of the particles has been completed, and if the determination result is "yes", the prostate particle implantation path visualization system will complete the work; if the determination is "no," the prostate particle implantation visualization system returns to step S404, and the physician implants the next radioactive particle according to the preoperative plan.

Specifically, after the prostate particle implantation path visualization system returns to step S404 and the process again reaches step S406, the position information of the patient in the real world coordinate system is already known and is not determined anew; the system directly detects the change in the height of the needle point of the outer needle of the particle implanter 205 in the real world coordinate system and then extracts the corresponding virtual two-dimensional slice from the repository of the host 209.
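The association in step S406 — choosing the stored two-dimensional slice whose coronal height matches the detected needle-tip height — can be sketched as a nearest-height lookup. The repository contents and height values here are illustrative assumptions:

```python
def select_slice(slices, tip_height):
    """Pick the stored coronal slice whose height is nearest the current
    height of the outer-needle tip. `slices` maps height (mm) -> slice id."""
    return slices[min(slices, key=lambda h: abs(h - tip_height))]

# Hypothetical repository: coronal slice id keyed by height in mm.
repository = {100.0: "slice_a", 105.0: "slice_b", 110.0: "slice_c"}
print(select_slice(repository, 106.2))   # -> slice_b
```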

As shown in fig. 5, it is a flow chart of virtual-real registration in the present invention, which specifically includes:

step S501, extracting virtual coordinates of feature points in a virtual two-dimensional tangent plane;

step S502, calculating the real coordinate of the image characteristic point according to the pixel coordinate of the image characteristic point;

step S503, acquiring a conversion relation between the virtual coordinates and the real coordinates of the feature points;

Step S504, converting the two-dimensional section from a virtual coordinate system to a real coordinate system;

step S505, converting the two-dimensional section from a real coordinate system to a camera coordinate system;

step S506, converting the two-dimensional section from a camera coordinate system to an imaging coordinate system;

In the embodiment of the present invention, after step S406 is completed, the system extracts the virtual coordinates of the markers in the virtual two-dimensional section in step S501. Because the virtual two-dimensional section and the markers in the section are drawn and rendered before the operation, the doctor can set the origin of the virtual world coordinate system in the software operation interface, acquire the virtual coordinates of the markers in the virtual two-dimensional section by mouse clicks, and store the virtual coordinates together with the corresponding virtual two-dimensional section in the same area of the repository. Therefore, when the system extracts a virtual two-dimensional section during the operation, the virtual coordinates of its markers are known at the same time, and the virtual coordinates of each marker can be denoted $Q_i$.

Meanwhile, after step S408 is completed, the pixel coordinates of the marker in the real-time operating field scene image can be known, and then in step S502, the host 209 converts the pixel coordinates from the imaging coordinate system to the camera coordinate system according to the calculation program, and then converts the pixel coordinates from the camera coordinate system to the real coordinate system, so as to obtain the coordinates of the marker in the real coordinate system. Wherein the real coordinate system is a real world coordinate system.

Specifically, the conversion relationship from the camera coordinate system to the imaging coordinate system can be represented by the following conversion matrix:

$$B = \begin{bmatrix} a_x & \gamma & u_0 \\ 0 & a_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$

In the conversion matrix $B$: $a_x = f/dp_u$ and $a_y = f/dp_v$, where $f$ is the focal length of the binocular camera 203, $dp_u$ is the physical length of one pixel of the binocular camera 203 in the X direction, $dp_v$ is the physical length of one pixel in the Y direction, $u_0$ is the transverse coordinate of the optical center of the binocular camera 203 in pixel coordinates, $v_0$ is the longitudinal coordinate of the optical center in pixel coordinates, and $\gamma$ is the skew parameter between the image axes. These parameters were already obtained in step S301 of the preoperative preparation. Thus, converting the pixel coordinates from the imaging coordinate system to the camera coordinate system can be expressed by the following formula:

$$\begin{bmatrix} x'_i \\ y'_i \\ z'_i \end{bmatrix} = z'_i \, B^{-1} \begin{bmatrix} u_i \\ v_i \\ 1 \end{bmatrix}$$

wherein $x'_i$, $y'_i$ and $z'_i$ are respectively the X-axis, Y-axis and Z-axis coordinates of one marker in the camera coordinate system, $z'_i$ is the depth of that marker recovered by the binocular camera 203, $B^{-1}$ is the inverse of the conversion matrix between the camera coordinate system and the imaging coordinate system, $u_i$ and $v_i$ are respectively the transverse and longitudinal coordinates of the marker in the imaging coordinate system, $i$ ranges from 1 to $n$, and $n$ is the number of markers.

When the pixel coordinates of one marker have been converted from the imaging coordinate system to the camera coordinate system, the camera coordinates $(x'_i, y'_i, z'_i)$ of that marker in the camera coordinate system are obtained. The host 209 then converts the camera coordinates from the camera coordinate system to the real coordinate system according to the calculation program.

Specifically, the conversion relationship $A$ from the real coordinate system to the camera coordinate system was already obtained in step S403; therefore, the conversion of the camera coordinates from the camera coordinate system to the real coordinate system can be represented by the following formula:

$$\begin{bmatrix} x_i \\ y_i \\ z_i \\ 1 \end{bmatrix} = A^{-1} \begin{bmatrix} x'_i \\ y'_i \\ z'_i \\ 1 \end{bmatrix}$$

wherein $x_i$, $y_i$ and $z_i$ are respectively the X-axis, Y-axis and Z-axis coordinates of the marker in the real coordinate system, and $A^{-1}$ is the inverse of the conversion matrix from the real coordinate system to the camera coordinate system. Thus, the real world coordinates $P_i = (x_i, y_i, z_i)$ of each marker in the real-time surgical area scene image are obtained.
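Taken together, the two conversions above amount to two matrix inversions per marker. A minimal numpy sketch follows; all parameter values are illustrative, and the depth $z'_i$ is assumed to come from the stereo match of the binocular camera:

```python
import numpy as np

# Illustrative intrinsic matrix B (zero skew) and extrinsic matrix A.
B = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
A = np.eye(4)
A[:3, 3] = [50.0, -20.0, 500.0]        # hypothetical camera pose (pure translation)

def pixel_to_world(u, v, depth):
    """Map a marker's pixel coordinates plus stereo depth to real-world
    coordinates: imaging -> camera via B^-1, then camera -> world via A^-1."""
    cam = depth * (np.linalg.inv(B) @ np.array([u, v, 1.0]))
    world = np.linalg.inv(A) @ np.append(cam, 1.0)
    return world[:3]

p = pixel_to_world(336.0, 232.0, 1000.0)
```

Running this chain over all recognized markers yields the real coordinate point set used in the registration of step S503.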

Further, in step S503, for the virtual coordinate point set and the real coordinate point set of the marker that have been acquired, a quaternion method is used to determine a conversion relationship between the virtual coordinate and the real coordinate, wherein the correspondence relationship between each pair of the virtual coordinate point and the real coordinate point in the virtual coordinate point set and the real coordinate point set is required to be known.
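Before walking through the formulas, the whole quaternion solution can be sketched end-to-end in code: barycentering, covariance matrix, symmetric matrix, maximum eigenvector, quaternion-to-rotation. This is a generic implementation of the standard quaternion method under the known-correspondence assumption stated above, not code taken from the patent; the point values in the example are illustrative:

```python
import numpy as np

def register(Q, P):
    """Rigid registration P ~ r Q + t via the quaternion method.
    Q: (n,3) virtual marker coordinates; P: (n,3) real marker coordinates,
    with row i of Q corresponding to row i of P."""
    q_bar, p_bar = Q.mean(axis=0), P.mean(axis=0)    # barycentres
    Qc, Pc = Q - q_bar, P - p_bar                    # barycentred point sets
    H = Qc.T @ Pc        # covariance sum Q'_i P'_i^T (1/n factor is irrelevant)
    delta = np.array([H[1, 2] - H[2, 1],
                      H[2, 0] - H[0, 2],
                      H[0, 1] - H[1, 0]])
    W = np.zeros((4, 4))                             # symmetric 4x4 matrix
    W[0, 0] = np.trace(H)
    W[0, 1:] = delta
    W[1:, 0] = delta
    W[1:, 1:] = H + H.T - np.trace(H) * np.eye(3)
    vals, vecs = np.linalg.eigh(W)
    i0, i1, i2, i3 = vecs[:, np.argmax(vals)]        # unit quaternion I = [I0 I1 I2 I3]
    r = np.array([                                   # quaternion -> rotation matrix
        [1 - 2*(i2*i2 + i3*i3), 2*(i1*i2 - i0*i3),     2*(i1*i3 + i0*i2)],
        [2*(i1*i2 + i0*i3),     1 - 2*(i1*i1 + i3*i3), 2*(i2*i3 - i0*i1)],
        [2*(i1*i3 - i0*i2),     2*(i2*i3 + i0*i1),     1 - 2*(i1*i1 + i2*i2)]])
    t = p_bar - r @ q_bar                            # translation from barycentres
    return r, t

# Example: recover a known rigid motion (90 degrees about Z plus a translation).
Q = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
R_true = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
t_true = np.array([1., 2., 3.])
P = Q @ R_true.T + t_true
r, t = register(Q, P)
```

The formulas below derive exactly the quantities this sketch computes.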

Specifically, the virtual coordinate point set and the real coordinate point set may be denoted by $Q_n$ and $P_n$, respectively. First, the barycentric coordinates of the two coordinate point sets need to be obtained, which can be expressed by the following two formulas:

$\bar{q} = \frac{1}{n}\sum_{i=1}^{n} q_i \qquad (4)$

$\bar{p} = \frac{1}{n}\sum_{i=1}^{n} p_i \qquad (5)$

wherein $\bar{q}$ denotes the barycentric coordinates of the virtual coordinate point set and $\bar{p}$ denotes the barycentric coordinates of the real coordinate point set. Then, the virtual coordinate points and the real coordinate points are referred to their barycenters, which can be expressed by the following formula:

$q'_i = q_i - \bar{q}, \qquad p'_i = p_i - \bar{p} \qquad (6)$

The barycentered virtual coordinate point set $Q'_n$ and real coordinate point set $P'_n$ are thereby obtained. Furthermore, the following function is constructed according to the least-squares principle:

$f(r, t) = \sum_{i=1}^{n} \left\| p_i - (r q_i + t) \right\|^2$

where r is the rotation matrix from the virtual world coordinate system to the real world coordinate system and t is the translation matrix from the virtual world coordinate system to the real world coordinate system. Substituting equation (6) into the constructed function, it can be expanded and simplified to obtain the following equation:

$f(r) = \sum_{i=1}^{n} \left\| p'_i - r q'_i \right\|^2 \qquad (7)$

After equation (7) is obtained, the quaternion algorithm is applied for the solution. First, the covariance matrix of equation (7) is obtained, which can be expressed by the following formula:

$\Sigma_{qp} = \frac{1}{n}\sum_{i=1}^{n} q'_i \, {p'_i}^T \qquad (8)$

wherein $q'_i$ and $p'_i$ are the barycentered virtual and real coordinate points defined in equation (6).

Then, a symmetric matrix W is constructed according to the covariance matrix of equation (8), which can be represented by the following formula:

$W = \begin{bmatrix} \operatorname{tr}(\Sigma_{qp}) & \Delta^T \\ \Delta & \Sigma_{qp} + \Sigma_{qp}^T - \operatorname{tr}(\Sigma_{qp}) I_3 \end{bmatrix} \qquad (9)$

wherein $\Delta = \left[ (\Sigma_{qp})_{23} - (\Sigma_{qp})_{32}, \; (\Sigma_{qp})_{31} - (\Sigma_{qp})_{13}, \; (\Sigma_{qp})_{12} - (\Sigma_{qp})_{21} \right]^T$.

The eigenvector corresponding to the maximum eigenvalue of the symmetric matrix W is then obtained; since this eigenvector is the unit quaternion, the quaternion obtained can be expressed as $q = [q_0 \; q_1 \; q_2 \; q_3]$. Finally, according to the relationship between the quaternion and the rotation matrix, the rotation matrix r from the virtual world coordinate system to the real world coordinate system can be obtained as follows:

$r = \begin{bmatrix} q_0^2 + q_1^2 - q_2^2 - q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2 - q_1^2 + q_2^2 - q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2 - q_1^2 - q_2^2 + q_3^2 \end{bmatrix} \qquad (10)$

Further, substituting equation (10), equation (5) and equation (4) into the constructed function, the translation matrix t from the virtual world coordinate system to the real world coordinate system can be obtained:

$t = \bar{p} - r \bar{q}$

Thus, the conversion relationship between the virtual coordinates and the real coordinates of the markers is obtained.
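The quaternion registration described by equations (4) to (10) can be condensed into a short NumPy sketch. The function name and the small synthetic point sets below are hypothetical, but the steps mirror the text: barycenters, barycentering, covariance matrix, the symmetric 4x4 matrix W, its dominant eigenvector as the unit quaternion, and finally the rotation r and translation t.

```python
import numpy as np

def quaternion_register(Q, P):
    """Quaternion (Horn-style) registration sketch: find rotation r and
    translation t such that P_i ~= r @ Q_i + t for paired point sets."""
    q_bar, p_bar = Q.mean(axis=0), P.mean(axis=0)   # barycenters, eqs. (4), (5)
    Qc, Pc = Q - q_bar, P - p_bar                   # barycentered sets, eq. (6)
    S = Qc.T @ Pc / len(Q)                          # covariance matrix, eq. (8)
    D = S - S.T                                     # antisymmetric part of S
    delta = np.array([D[1, 2], D[2, 0], D[0, 1]])
    W = np.zeros((4, 4))                            # symmetric matrix W, eq. (9)
    W[0, 0] = np.trace(S)
    W[0, 1:] = delta
    W[1:, 0] = delta
    W[1:, 1:] = S + S.T - np.trace(S) * np.eye(3)
    eigvals, eigvecs = np.linalg.eigh(W)
    q0, q1, q2, q3 = eigvecs[:, np.argmax(eigvals)]  # unit quaternion
    r = np.array([                                   # rotation matrix, eq. (10)
        [q0**2 + q1**2 - q2**2 - q3**2, 2*(q1*q2 - q0*q3), 2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3), q0**2 - q1**2 + q2**2 - q3**2, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2), 2*(q2*q3 + q0*q1), q0**2 - q1**2 - q2**2 + q3**2]])
    t = p_bar - r @ q_bar                            # translation, from (4), (5), (10)
    return r, t

# Hypothetical check: recover a known 90-degree rotation about Z plus a shift.
Q = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
R_true = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
t_true = np.array([1.0, 2.0, 3.0])
P = Q @ R_true.T + t_true
r, t = quaternion_register(Q, P)
```

Because the rotation matrix is quadratic in the quaternion components, the sign ambiguity of the eigenvector has no effect on the recovered rotation.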

Further, in step S504, the virtual coordinates of the markers are multiplied by the conversion relationship (the rotation matrix r and the translation matrix t) from the virtual coordinate system to the real coordinate system; through this conversion calculation, the two-dimensional section is converted from the virtual coordinate system to the real coordinate system, and the real coordinates of the virtual markers are obtained.

Further, in step S505, the real coordinates of the virtual markers are multiplied by the conversion relationship A from the real coordinate system to the camera coordinate system; the two-dimensional section is thereby converted from the real coordinate system to the camera coordinate system, and the camera coordinates of the virtual markers are obtained through the conversion calculation.

Further, in step S506, the camera coordinates of the virtual markers are multiplied by the conversion relationship B from the camera coordinate system to the imaging coordinate system; the two-dimensional section is thereby converted from the camera coordinate system to the imaging coordinate system, and the pixel coordinates of the virtual markers are obtained through the conversion calculation. The virtual-real registration process based on the augmented reality technology is thus completed.
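Steps S504 to S506 chain three conversions into a single virtual-to-pixel pipeline. A minimal sketch follows, assuming r and t come from the quaternion registration, A is a 4x4 homogeneous extrinsic transform, and B is a 3x3 pinhole intrinsic matrix; the matrix shapes and values here are illustrative assumptions, not the system's actual calibration.

```python
import numpy as np

def virtual_to_pixel(p_virt, r, t, A, B):
    """Chain S504-S506: virtual -> real -> camera -> pixel coordinates."""
    p_real = r @ p_virt + t                    # S504: virtual -> real
    p_cam = (A @ np.append(p_real, 1.0))[:3]   # S505: real -> camera
    u, v, w = B @ p_cam                        # S506: camera -> imaging plane
    return np.array([u / w, v / w])            # pixel coordinates

# Illustrative identity pose and a simple pinhole intrinsic matrix.
r = np.eye(3)
t = np.zeros(3)
A = np.eye(4)
B = np.array([[100.0, 0.0, 50.0],
              [0.0, 100.0, 50.0],
              [0.0, 0.0, 1.0]])
uv = virtual_to_pixel(np.array([0.1, 0.2, 1.0]), r, t, A, B)
```

With the identity pose, the point (0.1, 0.2, 1.0) projects to pixel (60, 70) under the assumed intrinsics.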

In summary, according to the augmented reality-based prostate particle implantation path visualization method and system provided by the present invention, when performing a brachytherapy operation on the prostate, a doctor can use a projection device based on augmented reality technology to observe the organ tissues, the particle implantation path and other information on the coronal plane of the operated area of the patient, so that missing information is supplemented and hand-eye consistency during observation is ensured. Meanwhile, the precision of the virtual-real registration is guaranteed by the surgical-area scene images acquired in real time during the operation together with the quaternion algorithm, which improves the visualization effect of the organ tissues, the particle implantation path and other information on the coronal plane of the operated area. The method thereby avoids the drawback that the particle implantation process excessively depends on the personal experience and imagination of the doctor, reducing the workload of the doctor and improving the success rate of the operation.

The foregoing is a further detailed description of the invention in connection with specific embodiments thereof, and the specific implementation of the invention is not to be considered as limited to this description. For a person skilled in the art, several simple deductions or substitutions made without departing from the inventive concept should be considered as falling within the scope of protection determined by the claims as filed.
