Search results for: Diagnostic errors
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1987

1117 Velocity Logs Error Reduction for In-Service Calibration of Vessel Performance Indicators

Authors: Maria Tsompanoglou, Dimitris Armenis

Abstract:

Vessel behavior in different operational and weather conditions constitutes the main area of interest for the ship operator. Ship speed and fuel consumption are the most decisive parameters in this respect, as their correlation provides information about the economic and environmental efficiency of the vessel, becoming the basis of decision making in terms of maintenance and trading. In the analysis of the vessel operational profile for the evaluation of fuel consumption and the equivalent CO2 emissions footprint, the indications of Speed Through Water are widely used. The seasonal and regional variations in seawater characteristics, which are available nowadays, can provide the basis for accurate estimation of the errors in Speed Through Water indications at any time. Accuracy in the speed value on a route basis can enable the operator to identify the ship's fuel and propulsion efficiency and proceed with improvements. This paper discusses case studies in which the actual vessel speed was corrected by a post-processing algorithm. The effects of the speed correction on standard Key Performance Indicators, as well as operational findings not identified earlier, are also discussed.
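The correction algorithm itself is not given in the abstract; as a rough illustration of the kind of post-processing described, the sketch below applies a hypothetical region- and season-dependent factor to logged Speed Through Water values and recomputes a fuel-per-mile KPI. All column names and the multiplicative correction model are assumptions, not the authors' method.

```python
import pandas as pd

def correct_stw(log: pd.DataFrame, correction: pd.DataFrame) -> pd.DataFrame:
    """Apply a region/season-dependent multiplicative correction to logged
    Speed Through Water (STW). `correction` maps (region, month) to a factor
    derived from seawater characteristic statistics (hypothetical model)."""
    merged = log.merge(correction, on=["region", "month"], how="left")
    merged["factor"] = merged["factor"].fillna(1.0)  # no data -> leave speed unchanged
    merged["stw_corrected"] = merged["stw_logged"] * merged["factor"]
    # Example KPI: fuel consumed per nautical mile, before and after correction
    merged["kpi_raw"] = merged["fuel_t_per_day"] / (24 * merged["stw_logged"])
    merged["kpi_corrected"] = merged["fuel_t_per_day"] / (24 * merged["stw_corrected"])
    return merged
```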

Keywords: data analytics, MATLAB, vessel performance monitoring, speed through water

Procedia PDF Downloads 297
1116 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion

Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro

Abstract:

In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services like ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (a CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments, accommodate dynamic workloads, and increase efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues and reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, and allows teams to scale resources up or down as needed. It optimizes costs by paying only for the resources that are used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
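As a minimal sketch of the parallel test execution described above (500+ test cases as containerized shards on ECS Fargate), the snippet below launches one Fargate task per shard with the AWS SDK for Python; the cluster, task definition, subnet, and container names are placeholders, not the authors' configuration.

```python
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")

def run_test_shards(num_shards: int) -> list[str]:
    """Launch one Fargate task per test shard; each container reads its
    shard index from an environment variable (all resource names are hypothetical)."""
    arns = []
    for shard in range(num_shards):
        resp = ecs.run_task(
            cluster="ci-cluster",                    # placeholder cluster name
            taskDefinition="scalable-tests:1",       # placeholder task definition
            launchType="FARGATE",
            networkConfiguration={"awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],   # placeholder subnet
                "assignPublicIp": "ENABLED",
            }},
            overrides={"containerOverrides": [{
                "name": "test-runner",
                "environment": [{"name": "SHARD_INDEX", "value": str(shard)}],
            }]},
        )
        arns.extend(t["taskArn"] for t in resp["tasks"])
    return arns
```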

Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment

Procedia PDF Downloads 37
1115 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process

Authors: Hong-Ming Chen

Abstract:

This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are represented by a piecewise-constant integral function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach essentially generates yield functions with minimal fitting errors and small oscillation.
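The abstract does not write out the dynamics; a standard Hull-White short rate extended with a jump term, together with the deterministic reformulation sketched above, might read as follows. This is our reconstruction from the notation in the abstract, not the authors' exact formulation (μ(t) denotes the drift fitted to the yield curve, to avoid clashing with the paper's use of θ(t) for the jump term):

```latex
% a = mean-reversion speed, sigma = volatility, W(t) = Brownian motion,
% J(t) = jump process; on the right, dW is replaced by the piecewise-constant
% w(t) and the jump increment by the point function theta(t).
\[
  dr(t) = \bigl(\mu(t) - a\,r(t)\bigr)\,dt + \sigma\,dW(t) + dJ(t)
  \quad\longrightarrow\quad
  \dot r(t) = \mu(t) - a\,r(t) + \sigma\,w(t) + \theta(t)
\]
```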

Keywords: optimization, interest rate model, jump process, deterministic

Procedia PDF Downloads 159
1114 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks

Authors: Ather Saeed, Arif Khan, Jeffrey Gosper

Abstract:

Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real-time, hazardous, large-scale harsh environments and in medical emergencies. Therefore, the loss of data can be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions bounded by a closed, continuously differentiable curve to ensure network connectivity. Experimental results show that the proposed GTFD (Green's Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs indicating where urgent intervention is required to dynamically self-stabilize the network.
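For reference, Green's theorem, the result the GTFD protocol builds on, relates a line integral around a closed, continuously differentiable curve C to a double integral over the region D it encloses:

```latex
\[
  \oint_{C} \left( L\,dx + M\,dy \right)
  = \iint_{D} \left( \frac{\partial M}{\partial x} - \frac{\partial L}{\partial y} \right) dx\,dy
\]
```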

Keywords: Green’s Theorem, self-stabilization, fault-localization, RSSI, WSN, clustering

Procedia PDF Downloads 71
1113 Incomplete Existing Algebra to Support Mathematical Computations

Authors: Ranjit Biswas

Abstract:

The existing subject of Algebra is incomplete for supporting the mathematical computations done by scientists in all areas: Mathematics, Physics, Statistics, Chemistry, Space Science, Cosmology, etc., even starting from the era of the great Einstein. A huge hidden gap in the subject ‘Algebra’ is unearthed. All the scientists today, including mathematicians, physicists, chemists, statisticians, cosmologists, space scientists, and economists, are fortunate that they obtained results without facing any contradictions or computational errors. Most surprising is that the results of all scientists, including Nobel Prize winners, were also verified by experiments. But in this paper, it is rigorously justified that they were all simply fortunate. An algebraist can define an infinite number of new algebraic structures. The objective of the work in this paper is not just to define a distinct algebraic structure, but to recognize and identify a major gap in the subject ‘Algebra’ lying hidden so far in its existing vast literature, and to fix this unearthed gap. Consequently, a different algebraic structure called ‘Region’ has been introduced, and its properties are studied.

Keywords: region, ROR, RORR, region algebra

Procedia PDF Downloads 47
1112 Determination of Critical Organ Doses for Liver Scintigraphy Using Cr-51

Authors: O. Maranci, A. B. Tugrul

Abstract:

Scintigraphy is a diagnostic imaging method used in nuclear medicine, in which radiation emitted by radiopharmaceuticals is captured by gamma cameras to form two-dimensional images. Liver scintigraphy is widely used in nuclear medicine. The gamma radioisotopes Tc-99m and Cr-51 can be used for this purpose. Cr-51 is the more important for the patient's organ dose, as it has higher energy and a longer half-life compared to Tc-99m. The aim of this study was to determine the doses to the critical organs of the patient during liver scintigraphy with the Cr-51 gamma radioisotope. Since conducting experimental studies on patients is extremely difficult for the determination of critical organ doses, a torso phantom was utilized to simulate the liver scintigraphy, with 20 mini packages of Cr-51 placed on the organ. The radioisotope was produced by irradiation in the central thimble of the TRIGA MARK II reactor at 250 kW power. As the result of the study, critical organ doses were determined and evaluated for the different critical organs.

Keywords: critical organ doses, liver, scintigraphy, TRIGA Mark-II

Procedia PDF Downloads 551
1111 A Peg Board with Photo-Reflectors to Detect Peg Insertion and Pull-Out Moments

Authors: Hiroshi Kinoshita, Yasuto Nakanishi, Ryuhei Okuno, Toshio Higashi

Abstract:

Various kinds of pegboards have been developed and used widely in rehabilitation research and clinics for the evaluation and training of patients' hand function. A common measure with these pegboards is the total execution time, assessed with a tester's stopwatch. The introduction of electrical and automatic measurement technology to the apparatus, on the other hand, has been delayed. The present work introduces the development of a pegboard with electric sensors to detect the moments of each peg's insertion and removal. The work also gives fundamental data obtained from a group of healthy young individuals who performed peg transfer tasks using the pegboard developed. Through trial and error in pilot tests, two 10-hole pegboard boxes, with a small photo-reflector and a DC amplifier installed at the bottom of each hole, were designed and built by the present authors. The amplified electric analogue signals from the 20 reflectors were automatically digitized at 500 Hz per channel and stored in a PC. The boxes were set on a test table in parallel at different distances (25, 50, 75, and 125 mm) to examine the effect of hole-to-hole distance. Fifty healthy young volunteers (25 of each gender) performed 80 successive fast peg transfers at each distance using their dominant and non-dominant hands. The data gathered showed a clear-cut light interruption/continuation moment for each peg, allowing the pull-out and insertion times of each peg to be determined accurately (no tester's error involved) and precisely (on the order of milliseconds). This further permitted computation of the individual peg movement duration (PMD: from peg lift-off to insertion) apart from the hand reaching duration (HRD: from peg insertion to lift-off). An accidental drop of a peg led to an exceptionally long (> mean + 3 SD) PMD, which was readily detected from an examination of the data distribution. The PMD data were commonly right-skewed, suggesting that the median can be a better estimate of individual PMD than the mean. Repeated-measures ANOVA using the median values revealed significant hole-to-hole distance and hand dominance effects, suggesting that these need to be fixed in the accurate evaluation of PMD. The gender effect was non-significant. Performance consistency was also evaluated using quartile variation coefficient values, which revealed no gender, hole-to-hole distance, or hand dominance effects. Measurement reliability was further examined using intraclass correlation values obtained from 14 subjects who performed the 25 and 125 mm hole-distance tasks in two test sessions separated by 7-10 days. The intraclass correlation values between the two tests showed fair reliability for PMD (0.65-0.75) and for HRD (0.77-0.94). We concluded that the sensor pegboard developed in the present study can provide accurate (excluding tester's errors) and precise (at a millisecond rate) time information on peg movement, separated from that of hand movement. It can also easily detect and automatically exclude erroneous execution data from a participant's standard data. These features should lead to a better evaluation of hand dexterity function compared with the widely used conventional pegboards.
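A minimal sketch of the signal processing implied above: threshold each photo-reflector channel to find pull-out and insertion moments, compute PMD, and exclude accidental drops as PMD outliers. The 500 Hz sampling rate and the mean + 3 SD criterion come from the text; the threshold, array names, and the simple event pairing are assumptions.

```python
import numpy as np

FS = 500  # sampling rate per channel, Hz (from the paper)

def peg_events(signal: np.ndarray, threshold: float):
    """Return sample indices of peg pull-out (light restored) and insertion
    (light interrupted) for one photo-reflector channel."""
    occupied = signal < threshold            # peg blocks the reflector
    edges = np.diff(occupied.astype(int))
    pullouts = np.flatnonzero(edges == -1)   # occupied -> empty
    insertions = np.flatnonzero(edges == 1)  # empty -> occupied
    return pullouts, insertions

def movement_durations(pullout_idx: np.ndarray, insertion_idx: np.ndarray):
    """PMD in ms, pairing each lift-off with the next insertion (simplified:
    a real implementation would pair events across source and target holes)."""
    n = min(len(pullout_idx), len(insertion_idx))
    pmd_ms = (insertion_idx[:n] - pullout_idx[:n]) / FS * 1000.0
    # Exclude accidental drops: exceptionally long PMD (> mean + 3 SD)
    keep = pmd_ms <= pmd_ms.mean() + 3 * pmd_ms.std()
    return pmd_ms[keep]
```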

Keywords: hand, dexterity test, peg movement time, performance consistency

Procedia PDF Downloads 131
1110 Variations of the Modal Characteristics of the Feeding Stage with Different Preloaded Linear Guide

Authors: Jui-Pui Hung, Yong-Run Chen, Wei-Cheng Shih, Chun-Wei Lin

Abstract:

This study aimed to assess the variations in the modal characteristics of the feeding stage with different linear guide preloads. The dynamic characteristics of the feeding stage were characterized in terms of the modal stiffness, modal frequency and modal damping, which were assessed from vibration tests. According to the experimental measurements, the actual preload of the linear guide modules was found to deviate from the rated values set in the factory. This may be due to assembly errors of the guide modules. For the stage with linear guides, the dynamic stiffness was changed by the preload set on the rolling balls. The variations of the dynamic stiffness at the first and second modes are 20.8% and 10.5%, respectively, when the linear guide preload is adjusted from medium to high, while the modal damping ratio is reduced by 8.97% and 9.65%, respectively. For the high-frequency mode, the modal stiffness increases by 171.2% and the damping ratio is reduced by 34.4%. The current results demonstrate the importance of determining the preload amount of the linear guide modules in practical applications.

Keywords: contact stiffness, feeding stage, linear guides, modal characteristics, pre-load

Procedia PDF Downloads 423
1109 Quantum Dot Biosensing for Advancing Precision Cancer Detection

Authors: Sourav Sarkar, Manashjit Gogoi

Abstract:

In the evolving landscape of cancer diagnostics, optical biosensing has emerged as a promising tool due to its sensitivity and specificity. This study explores the potential of CdS/ZnS core-shell quantum dots (QDs) capped with 3-Mercaptopropionic acid (3-MPA), which aids in the linking chemistry of QDs to various cancer antibodies. The QDs, with their unique optical and electronic properties, have been integrated into the biosensor design. Their high quantum yield and size-dependent emission spectra have been exploited to improve the sensor’s detection capabilities. The study presents the design of this QD-enhanced optical biosensor. The use of these QDs can also aid multiplexed detection, enabling simultaneous monitoring of different cancer biomarkers. This innovative approach holds significant potential for advancing cancer diagnostics, contributing to timely and accurate detection. Future work will focus on optimizing the biosensor design for clinical applications and exploring the potential of QDs in other biosensing applications. This study underscores the potential of integrating nanotechnology and biosensing for cancer research, paving the way for next-generation diagnostic tools. It is a step forward in our quest for achieving precision oncology.

Keywords: quantum dots, biosensing, cancer, device

Procedia PDF Downloads 53
1108 Mineralized Nanoparticles as a Contrast Agent for Ultrasound and Magnetic Resonance Imaging

Authors: Jae Won Lee, Kyung Hyun Min, Hong Jae Lee, Sang Cheon Lee

Abstract:

To date, imaging techniques have attracted much attention in medicine because the detection of diseases at an early stage provides greater opportunities for successful treatment. Consequently, over the past few decades, diverse imaging modalities including magnetic resonance (MR), positron emission tomography, computed tomography, and ultrasound (US) have been developed and applied widely in the field of clinical diagnosis. However, each of the above-mentioned imaging modalities possesses unique strengths and intrinsic weaknesses, which limit their ability to provide accurate information. Therefore, multimodal imaging systems may be a solution that can provide improved diagnostic performance. Among the current medical imaging modalities, US is a widely available real-time imaging modality. It has many advantages, including safety, low cost and easy access for patients. However, its low spatial resolution precludes accurate discrimination of diseased regions such as cancer sites. In contrast, MR has no tissue-penetration limit and can provide images possessing exquisite soft-tissue contrast and high spatial resolution. However, it cannot offer real-time images and needs a comparatively long imaging time. The characteristics of these imaging modalities may be considered complementary, and the modalities have been frequently combined for the clinical diagnostic process. Biominerals such as calcium carbonate (CaCO3) and calcium phosphate (CaP) exhibit pH-dependent dissolution behavior. They demonstrate pH-controlled drug release due to the dissolution of minerals in acidic pH conditions. In particular, the application of this mineralization technique to a US contrast agent has been reported recently. The CaCO3 mineral reacts with acids and decomposes to generate carbon dioxide (CO2) gas in an acidic environment. These gas-generating mineralized nanoparticles generated CO2 bubbles in the acidic environment of the tumor, thereby allowing for strong echogenic US imaging of tumor tissues. On the basis of this previous work, it was hypothesized that the loading of MR contrast agents into the CaCO3 mineralized nanoparticles may be a novel strategy for designing a contrast agent for dual imaging. Herein, CaCO3 mineralized nanoparticles capable of generating CO2 bubbles to trigger the release of entrapped MR contrast agents in response to tumoral acidic pH were developed for US and MR dual-modality imaging of tumors. Gd2O3 nanoparticles were selected as the MR contrast agent. A key strategy employed in this study was to prepare Gd2O3 nanoparticle-loaded mineralized nanoparticles (Gd2O3-MNPs) using block copolymer-templated CaCO3 mineralization in the presence of calcium cations (Ca2+), carbonate anions (CO32-) and positively charged Gd2O3 nanoparticles. The CaCO3 core was considered suitable because it may effectively shield Gd2O3 nanoparticles from water molecules in the blood (pH 7.4) before decomposing to generate CO2 gas, triggering the release of Gd2O3 nanoparticles in tumor tissues (pH 6.4~7.4). The kinetics of CaCO3 dissolution and CO2 generation from the Gd2O3-MNPs were examined as a function of pH, together with the pH-dependent in vitro magnetic relaxation; additionally, the echogenic properties were estimated to demonstrate the potential of the particles for tumor-specific US and MR imaging.
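The gas-generating step is the standard acid decomposition of calcium carbonate:

```latex
\[
  \mathrm{CaCO_3 + 2\,H^+ \;\longrightarrow\; Ca^{2+} + H_2O + CO_2\!\uparrow}
\]
```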

Keywords: calcium carbonate, mineralization, ultrasound imaging, magnetic resonance imaging

Procedia PDF Downloads 233
1107 Enhance Security in XML Databases: XLog File for Severity-Aware Trust-Based Access Control

Authors: A. Asmawi, L. S. Affendey, N. I. Udzir, R. Mahmod

Abstract:

The topic of enhancing security in XML databases is important, as it involves protecting sensitive data and providing a secure environment for users. In order to improve security and provide dynamic access control for XML databases, we present the XLog file, which is used to calculate user trust values by recording users' bad transactions, errors and query severities. Severity-aware trust-based access control for XML databases manages the access policy depending on users' trust values and prevents unauthorized processes, malicious transactions and insider threats. Privileges are automatically modified and adjusted over time depending on user behaviour and query severity. Logging in databases is an important process used for recovery and security purposes. In this paper, the XLog file is presented as a dynamic and temporary log file for XML databases to enhance the level of security.
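The abstract does not specify the trust formula; as a plausible minimal sketch of severity-aware trust adjustment driven by XLog records, the snippet below penalizes trust per logged event and gates queries on a severity-dependent trust requirement. The weights and update rule are our assumptions, not the authors' scheme.

```python
# Hypothetical severity weights for logged events (not from the paper).
SEVERITY_WEIGHT = {"low": 0.02, "medium": 0.05, "high": 0.15}

def update_trust(trust: float, events: list) -> float:
    """Lower a user's trust value for each bad transaction, error, or severe
    query recorded in the XLog file; trust is clamped to [0, 1]."""
    for e in events:
        trust -= SEVERITY_WEIGHT.get(e["severity"], 0.05)
    return max(0.0, min(1.0, trust))

def allowed(trust: float, query_severity: str) -> bool:
    """Severity-aware access check: riskier queries demand higher trust."""
    required = {"low": 0.2, "medium": 0.5, "high": 0.8}[query_severity]
    return trust >= required

# Illustrative use: two logged violations lower trust below the 'high' bar.
t = update_trust(0.9, [{"severity": "high"}, {"severity": "medium"}])
print(t, allowed(t, "high"))
```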

Keywords: XML database, trust-based access control, severity-aware, trust values, log file

Procedia PDF Downloads 296
1106 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements

Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori

Abstract:

The apportionment method is used by many countries to calculate the distribution of seats in political bodies. For example, this method is used in the United States (U.S.) to distribute House seats proportionally based on the population of the electoral district. Famous apportionment methods include the divisor methods called the Adams Method, Dean Method, Hill Method, Jefferson Method and Webster Method. Sometimes the results from the implementation of these divisor methods are unfair and include errors. Therefore, it is important to examine the optimization of these methods using bias measurements to obtain precise and fair results. In this research, we investigate the bias of divisor methods in the U.S. House of Representatives toward large and small states by applying the Stolarsky Mean Method. We compare the bias of the apportionment methods by using two famous bias measurements: the Balinski and Young measurement and the Ernst measurement. Both measurements have a formula for large and small states. The third measurement, however, which was created by the researchers, does not factor the element of large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.
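As a concrete illustration of a divisor method, the sketch below implements Webster's method (standard rounding) by searching for a divisor that makes the rounded quotas sum to the house size; the Stolarsky-mean relaxation studied in the paper generalizes the rounding rule between quotas. The populations are toy values, not data from the paper.

```python
def webster_apportion(populations: list, seats: int) -> list:
    """Webster divisor method: find d so that sum(round(p/d)) == seats."""
    lo, hi = 1e-9, float(sum(populations))
    for _ in range(200):                                 # bisection on the divisor
        d = (lo + hi) / 2
        alloc = [int(p / d + 0.5) for p in populations]  # round half up
        total = sum(alloc)
        if total == seats:
            return alloc
        if total > seats:
            lo = d   # divisor too small -> too many seats -> search larger d
        else:
            hi = d
    return alloc     # ties can prevent an exact hit; return the last attempt

# Toy example (hypothetical state populations, 10 seats):
print(webster_apportion([4650, 3100, 1760, 490], 10))  # -> [5, 3, 2, 0]
```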

Keywords: apportionment, bias, divisor, fair, measurement

Procedia PDF Downloads 363
1105 Using Optimal Control Method to Investigate the Stability and Transparency of a Nonlinear Teleoperation System with Time Varying Delay

Authors: Abasali Amini, Alireza Mirbagheri, Amir Homayoun Jafari

Abstract:

In this paper, a new structure for teleoperation systems with time-varying delay has been modeled and proposed. A random time-varying delay of up to 150 msec is simulated in the teleoperation channel, both from master to slave and vice versa. The system stability and transparency have been investigated by comparing the results of a PID controller and an optimal controller on the master and slave subsystems separately. A controller has been designed in the slave subsystem to reduce position errors between master and slave, and another controller has been designed in the master subsystem to establish stability, transparency and force tracking. The results have been compared. They showed that the PID controller is appropriate for position tracking, but its force response oscillates in contact with the environment. We showed that the optimal controller establishes position tracking properly and also achieves appropriate force tracking.
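For reference, a minimal discrete-time PID position controller of the kind compared in the paper; the gains and sample time below are illustrative, not the authors' tuned values.

```python
class PID:
    """Discrete PID: u(k) = Kp*e + Ki*sum(e)*dt + Kd*(e - e_prev)/dt."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured              # master-slave position error
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative use: drive the slave toward the (delayed) master position.
pid = PID(kp=8.0, ki=0.5, kd=0.2, dt=0.001)
u = pid.step(setpoint=1.0, measured=0.93)
```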

Keywords: optimal control, time varying delay, teleoperation systems, stability and transparency

Procedia PDF Downloads 251
1104 A Hybrid Digital Watermarking Scheme

Authors: Nazish Saleem Abbas, Muhammad Haris Jamil, Hamid Sharif

Abstract:

Digital watermarking is a technique that allows an individual to add and hide secret information, a copyright notice, or another verification message inside digital audio, video, or images. Today, with the advancement of technology, modern healthcare systems in many countries manage patients' diagnostic information digitally. When transmitted between hospitals through the internet, the medical data become vulnerable to attacks and require security and confidentiality. Digital watermarking techniques are used in order to ensure the authenticity, security and management of medical images and related information. This paper proposes a watermarking technique that embeds a watermark in medical images imperceptibly and securely. In this work, digital watermarking of medical images is carried out using the Least Significant Bit (LSB) with the Discrete Cosine Transform (DCT). The proposed methods of embedding and extracting a watermark in a watermarked image are performed in the frequency domain using the LSB by an XOR operation. The quality of the watermarked medical image is measured by the peak signal-to-noise ratio (PSNR). It was observed that the watermarked medical image, obtained by performing the XOR operation between the DCT and the LSB, survived a compression attack with a PSNR of up to 38.98 dB.
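A minimal sketch of the embedding idea described above: take the DCT of the image, XOR the least significant bit of the rounded coefficients with the watermark bits, and invert the transform; image quality is then checked with PSNR. The single full-frame DCT, the coefficient selection, and the rounding are our assumptions, not the paper's exact scheme (a real scheme would typically work block-wise and quantize before inversion to make extraction robust).

```python
import numpy as np
from scipy.fft import dctn, idctn

def embed(cover: np.ndarray, bits: np.ndarray) -> np.ndarray:
    """Embed 0/1 watermark bits by XOR-ing them into the LSB of rounded
    DCT coefficients, then inverse-transforming (illustrative only)."""
    coeffs = np.round(dctn(cover.astype(float), norm="ortho")).astype(np.int64)
    flat = coeffs.ravel()
    idx = np.arange(1, 1 + bits.size)                # skip the DC coefficient
    flat[idx] = (flat[idx] & ~1) | ((flat[idx] & 1) ^ bits.astype(np.int64))
    stego = idctn(flat.reshape(coeffs.shape).astype(float), norm="ortho")
    return np.clip(np.round(stego), 0, 255).astype(np.uint8)

def psnr(a: np.ndarray, b: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
```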

Keywords: watermarking, image processing, DCT, LSB, PSNR

Procedia PDF Downloads 44
1103 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning

Authors: Yanwen Li, Shuguo Xie

Abstract:

In electromagnetic imaging, because the system is diffraction-limited, pixel values change slowly near the edges of image targets and also vary with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can therefore result in many errors. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. First, the preliminary segmentation results from the adaptive-bandwidth Mean-Shift algorithm are expanded, merged and extracted. Then the overlap rate of the extracted image blocks is detected before determining a segmentation region with a single complete target. Finally, the gradient edges of the extracted targets are recovered and reconstructed using a dictionary-learning algorithm, and the final segmentation results, which are very close to the gradient targets in the original image, are obtained. Both the experimental and the simulated results show that the segmentation results are very accurate, with Dice coefficients improved by 70% to 80% compared with the Mean-Shift-only method.
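A minimal sketch of the first stage, preliminary Mean-Shift segmentation over pixel feature vectors; here the bandwidth is estimated once with scikit-learn's helper rather than adaptively per region, and the dictionary-learning edge recovery stage is omitted.

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def meanshift_segments(image: np.ndarray) -> np.ndarray:
    """Cluster pixels on (row, col, intensity) features; return a label map."""
    rows, cols = image.shape
    rr, cc = np.mgrid[0:rows, 0:cols]
    feats = np.column_stack([rr.ravel(), cc.ravel(), image.ravel().astype(float)])
    bandwidth = estimate_bandwidth(feats, quantile=0.1, n_samples=500)
    labels = MeanShift(bandwidth=bandwidth, bin_seeding=True).fit_predict(feats)
    return labels.reshape(rows, cols)
```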

Keywords: gradient image, segmentation and extraction, mean-shift algorithm, dictionary learning

Procedia PDF Downloads 260
1102 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study

Authors: K. Adu Michael, K. Alese Boniface

Abstract:

Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The ability to develop a comprehensive problem statement, needed to execute a proper requirements engineering process, is often missing. The need to describe the ‘what’ of a system in one document, written in a natural language, is a major step in the overall process of Software Engineering. Requirements Engineering is a process used to discover, analyze and validate system requirements. This process is needed to reduce software errors at the early stages of software development. The importance of each of the steps in Requirements Engineering is clearly explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with the expectations from the new system. This paper identifies an inadequate Requirements Engineering process as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.

Keywords: client/customer, problem statement, requirements engineering, software developers

Procedia PDF Downloads 400
1101 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection

Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón

Abstract:

Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques; specifically, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. The direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of visible Red-Green-Blue (RGB) and thermal images in parallel. Hence, the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The mining/industrial facility representations obtained can be used for inspection activities.

Keywords: aerial thermography, data processing, drone, low-cost, point cloud

Procedia PDF Downloads 139
1100 Statistical Classification, Downscaling and Uncertainty Assessment for Global Climate Model Outputs

Authors: Queen Suraajini Rajendran, Sai Hung Cheung

Abstract:

Statistical downscaling models are required to connect global climate model outputs with local weather variables for climate change impact prediction. For reliable climate change impact studies, the uncertainty associated with the models, including natural variability, uncertainty in the climate model(s) and the downscaling model, model inadequacy, and uncertainty in the predicted results, should be quantified appropriately. In this work, a new approach is developed by the authors for statistical classification, statistical downscaling and uncertainty assessment, and is applied to Singapore rainfall. It is a robust Bayesian uncertainty analysis methodology, with tools based on coupling dependent modeling errors with the classification and statistical downscaling models in such a way that the dependency among modeling errors impacts the results of both classification and statistical downscaling model calibration and the uncertainty analysis for future prediction. Singapore data are considered here, and the uncertainty and prediction results are obtained. From the results obtained, directions of research for improvement are briefly presented.

Keywords: statistical downscaling, global climate model, climate change, uncertainty

Procedia PDF Downloads 365
1099 Assisted Video Colorization Using Texture Descriptors

Authors: Andre Peres Ramos, Franklin Cesar Flores

Abstract:

Colorization is the process of adding colors to a monochromatic image or video. Usually, the process involves segmenting the image into regions of interest and then applying colors to each one; for videos, this process is repeated for each frame, which makes it a tedious and time-consuming job. We propose a new assisted method for video colorization; the user only has to colorize one frame, and then the colors are propagated to the following frames. The user can intervene at any time to correct eventual errors in color assignment. The method consists of extracting intensity and texture descriptors from the frames and then performing feature matching to determine the best color for each segment. To reduce computation time and provide better spatial coherence, we narrow the search area and assign weights to each feature to emphasize the texture descriptors. To give a more natural result, we use an optimization algorithm for the color propagation. Experimental results on several image sequences, compared to other existing methods, demonstrate that the proposed method performs better colorization with less time and user intervention.
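A minimal sketch of the propagation step: describe each segment in the new frame by weighted intensity and texture features, then copy the color of the nearest-matching segment from the previously colorized frame. The descriptor layout and weights are illustrative; the paper's optimization-based propagation is not reproduced here.

```python
import numpy as np

def propagate_colors(prev_feats: np.ndarray, prev_colors: np.ndarray,
                     curr_feats: np.ndarray, weights) -> np.ndarray:
    """For each segment of the current frame, assign the color of the
    best-matching segment from the previous (colorized) frame.
    prev_feats/curr_feats: (n, d) descriptor arrays; weights: (d,) emphasis,
    e.g. larger on the texture descriptors, as in the paper."""
    w = np.asarray(weights, dtype=float)
    colors = []
    for f in curr_feats:
        dists = np.sum(w * (prev_feats - f) ** 2, axis=1)   # weighted L2
        colors.append(prev_colors[int(np.argmin(dists))])
    return np.array(colors)
```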

Keywords: colorization, feature matching, texture descriptors, video segmentation

Procedia PDF Downloads 159
1098 Fault Diagnosis of Nonlinear Systems Using Dynamic Neural Networks

Authors: E. Sobhani-Tehrani, K. Khorasani, N. Meskin

Abstract:

This paper presents a novel integrated hybrid approach for fault diagnosis (FD) of nonlinear systems. Unlike most FD techniques, the proposed solution simultaneously accomplishes fault detection, isolation, and identification (FDII) within a unified diagnostic module. At the core of this solution is a bank of adaptive neural parameter estimators (NPE) associated with a set of single-parameter fault models. The NPEs continuously estimate unknown fault parameters (FP) that are indicators of faults in the system. Two NPE structures including series-parallel and parallel are developed with their exclusive set of desirable attributes. The parallel scheme is extremely robust to measurement noise and possesses a simpler, yet more solid, fault isolation logic. On the contrary, the series-parallel scheme displays short FD delays and is robust to closed-loop system transients due to changes in control commands. Finally, a fault tolerant observer (FTO) is designed to extend the capability of the NPEs to systems with partial-state measurement.
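A minimal sketch of the two estimator structures contrasted above, for a scalar system x(k+1) = f(x, u; θ): the series-parallel form predicts from the measured state, while the parallel form propagates its own estimate. The gradient update below is a toy stand-in for the paper's neural parameter estimators.

```python
def estimate(f, dfdtheta, xs, us, theta0: float, lr: float = 0.01,
             parallel: bool = False) -> float:
    """One-parameter fault estimator. xs: measured states (len(us)+1 values),
    us: inputs, theta0: initial fault-parameter guess."""
    theta = theta0
    x_hat = xs[0]
    for k in range(len(us)):
        state = x_hat if parallel else xs[k]   # the only structural difference
        pred = f(state, us[k], theta)
        err = xs[k + 1] - pred                 # one-step prediction error
        theta += lr * err * dfdtheta(state, us[k], theta)  # gradient step
        x_hat = pred
    return theta
```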

Keywords: hybrid fault diagnosis, dynamic neural networks, nonlinear systems, fault tolerant observer

Procedia PDF Downloads 389
1097 Diffraction-Based Immunosensor for Dengue NS1 Virus

Authors: Harriet Jane R. Caleja, Joel I. Ballesteros, Florian R. Del Mundo

Abstract:

Dengue fever is among the world's major causes of death, especially in tropical areas. In the Philippines, the number of dengue cases during the first half of 2015 amounted to more than 50,000. In 2012, the total number of cases of dengue infection reached 132,046, of which 701 patients died. Dengue Nonstructural 1 (NS1) is a recently discovered biomarker for the early detection of dengue virus; it is present in the serum of dengue-infected patients even during the earliest stages, prior to the formation of dengue virus antibodies. A biosensor for dengue detection using NS1 was developed as a faster and more accurate diagnostic tool. Biotinylated anti-dengue virus NS1 was used as the receptor for dengue virus NS1. Using the Diffractive Optics Technology (dotTM) technique, real-time binding of NS1 to the biotinylated anti-NS1 antibody is observed. The dot®-Avidin sensor recognizes the biotinylated anti-NS1, which serves as the capture molecule for the analyte, NS1. An increase in the diffractive intensity signal signifies binding between the capture molecule and the analyte. The LOD was found to be 3.87 ng/mL, while the LOQ is 12.9 ng/mL. The developed biosensor was also found to be specific for NS1.
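The abstract does not state how the detection limits were computed, but the reported pair (LOQ/LOD ≈ 3.3) is consistent with the usual calibration-based definitions, where s is the standard deviation of the blank response and m the calibration slope:

```latex
\[
  \mathrm{LOD} = \frac{3.3\,s}{m}, \qquad \mathrm{LOQ} = \frac{10\,s}{m}
\]
```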

Keywords: avidin-biotin, diffractive optics technology, immunosensor, NS1

Procedia PDF Downloads 322
1096 Optimization of Surface Coating on Magnetic Nanoparticles for Biomedical Applications

Authors: Xiao-Li Liu, Ling-Yun Zhao, Xing-Jie Liang, Hai-Ming Fan

Abstract:

Owing to their unique properties, magnetic nanoparticles have been used as diagnostic and therapeutic agents for biomedical applications. Highly monodispersed magnetic nanoparticles with controlled particle size and surface coating have been successfully synthesized as a model system to investigate the effects of surface coating on the T2 relaxivity and on the specific absorption rate (SAR) under an alternating magnetic field. Among the coatings, using mPEG-g-PEI to solubilize oleic-acid-capped 6 nm magnetic nanoparticles increased the T2 relaxivity by up to 4-fold compared to PEG-coated nanoparticles. Moreover, it largely enhances cell uptake, with a T2 relaxivity of 92.6 mM-1s-1 for in vitro cell MRI. As for the hyperthermia agent, the SAR value increases as the thickness of the PEG surface coating decreases. By elaborate optimization of the surface coating and particle size, a significant increase in SAR (up to 74%) could be achieved with minimal variation in the saturation magnetization (<5%). The 19 nm magnetic nanoparticles with 2000 Da PEG exhibited the highest SAR of 930 W•g-1 among the samples, which can be maintained in various simulated physiological conditions. This systematic work provides a general strategy for the optimization of the surface coating of magnetic cores for high-performance MRI contrast agents and hyperthermia agents.

Keywords: magnetic nanoparticles, magnetic hyperthermia, magnetic resonance imaging, surface modification

Procedia PDF Downloads 506
1095 Improvement of Brain Tumors Detection Using Markers and Boundaries Transform

Authors: Yousif Mohamed Y. Abdallah, Mommen A. Alkhir, Amel S. Algaddal

Abstract:

This was an experimental study of the segmentation of the brain in MRI images using edge detection and morphology filters. Each brain MRI film was scanned using a digitizer scanner and then processed in an image processing program (MATLAB), in which the segmentation was studied. The scanned image was saved in TIFF file format to preserve the quality of the image. Brain tissue can be easily detected in an MRI image if the object has sufficient contrast from the background. We used edge detection and basic morphology tools to detect the brain. The segmentation steps using detection and morphology filters were: reading the image, detecting the entire brain, dilating the image, filling interior gaps inside the image, removing connected objects on the borders, and smoothing the object (the brain). The study showed that an alternative method for displaying the segmented object would be to place an outline around the segmented brain. These filter approaches can help remove unwanted background information and increase the diagnostic information of brain MRI.
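The steps listed above follow a classic edge-plus-morphology recipe; a compact sketch using scikit-image equivalents is given below (the paper used MATLAB; the structuring-element sizes here are illustrative).

```python
import numpy as np
from scipy import ndimage as ndi
from skimage import feature, morphology, segmentation

def segment_brain(image: np.ndarray) -> np.ndarray:
    """Edge detection + morphology segmentation, mirroring the listed steps."""
    edges = feature.canny(image.astype(float))                        # detect brain edges
    dilated = morphology.binary_dilation(edges, morphology.disk(2))   # dilate the image
    filled = ndi.binary_fill_holes(dilated)                           # fill interior gaps
    cleared = segmentation.clear_border(filled)                       # remove border objects
    smoothed = morphology.binary_erosion(cleared, morphology.disk(2)) # smooth the object
    return smoothed
```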

Keywords: improvement, brain, MATLAB, markers, boundaries

Procedia PDF Downloads 511
1094 Based on MR Spectroscopy, Metabolite Ratio Analysis of MRI Images for Metastatic Lesion

Authors: Hossain A, Hossain S.

Abstract:

Introduction: In a small cohort, we sought to assess the ability of magnetic resonance spectroscopy (MRS) to predict the presence of metastatic lesions. Method: Patients with neuroepithelial tumors were enrolled at Popular Diagnostic Centre Limited. 1H CSI MRS of the brain allows us to detect changes in the concentration of specific metabolites caused by metastatic lesions. Among these metabolites are N-acetyl-aspartate (NAA), creatine (Cr), and choline (Cho). For Cho, NAA, Cr, and Cr₂, the metabolite ratios were calculated using the division method. Results: The NAA values were 0.63 and 5.65 for the tumor cells, 1.84 and 10.6 for normal cells 1, and 1.86 and 5.66 for normal cells 2. Cho levels were as low as 0.8 and 10.53 in the tumor cells, compared to 1.12 and 2.7 in normal cells 1 and 1.24 and 6.36 in normal cells 2. Cho/Cr₂ barely distinguished itself from the other ratios in terms of significance. For the tumor cells, the ratios of Cho/NAA, Cho/Cr₂, NAA/Cho, and NAA/Cr₂ were significant. Normal cells 1 had significant Cho/NAA, Cho/Cr, NAA/Cho, and NAA/Cr ratios. Conclusion: The clinical outcome can be improved by using 1H-MRSI to guide the extent of resection for metastatic lesions. MRS is non-invasive, presents no difficulties during the procedure, and has been shown to predict the detection of metastatic lesions.

Keywords: metabolite ratio, MRI images, metastatic lesion, MR spectroscopy, N-acetyl-aspartate

Procedia PDF Downloads 91
1093 Longitudinal Analysis of Internet Speed Data in the Gulf Cooperation Council Region

Authors: Musab Isah

Abstract:

This paper presents a longitudinal analysis of Internet speed data in the Gulf Cooperation Council (GCC) region, focusing on the most populous cities of each of the six countries – Riyadh, Saudi Arabia; Dubai, UAE; Kuwait City, Kuwait; Doha, Qatar; Manama, Bahrain; and Muscat, Oman. The study utilizes data collected from the Measurement Lab (M-Lab) infrastructure over a five-year period from January 1, 2019, to December 31, 2023. The analysis includes downstream and upstream throughput data for the cities, covering significant events such as the launch of 5G networks in 2019, COVID-19-induced lockdowns in 2020 and 2021, and the subsequent recovery period and return to normalcy. The results showcase substantial increases in Internet speeds across the cities, highlighting improvements in both download and upload throughput over the years. All the GCC countries have achieved above-average Internet speeds that can conveniently support various online activities and applications with excellent user experience.
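A minimal sketch of the longitudinal aggregation used in analyses of this kind: monthly median download/upload throughput per city computed from per-test records. The CSV file and column names below are assumptions about a pre-exported extract, not the M-Lab schema; medians are used because per-test speed distributions are heavy-tailed.

```python
import pandas as pd

# Hypothetical export of per-test NDT measurements: one row per speed test.
tests = pd.read_csv("gcc_ndt_tests.csv", parse_dates=["date"])
# Assumed columns: date, city, download_mbps, upload_mbps

monthly = (
    tests
    .assign(month=tests["date"].dt.to_period("M"))
    .groupby(["city", "month"])[["download_mbps", "upload_mbps"]]
    .median()                     # robust monthly summary per city
    .reset_index()
)
print(monthly.head())
```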

Keywords: internet data science, internet performance measurement, throughput analysis, internet speed, measurement lab, network diagnostic tool

Procedia PDF Downloads 55
1092 Microarrays: Wide Clinical Utilities and Advances in Healthcare

Authors: Salma M. Wakil

Abstract:

Advances in the field of genetics have enabled the detection of a large number of inherited disorders at the molecular level and have directed the development of innovative technologies. These innovations have led to gene sequencing, prenatal mutation detection, pre-implantation genetic diagnosis, population-based carrier screening, and genome-wide analyses using microarrays. Microarrays are widely used in clinical and diagnostic settings for genetic anomalies on a massive scale, with the advent of CytoScan molecular karyotyping as a clinical utility for detecting chromosomal aberrations with high coverage across the entire human genome. Unlike a regular karyotype, which relies on the microscopic inspection of chromosomes, molecular karyotyping with CytoScan constructs virtual chromosomes based on copy number analysis of DNA, which improves the resolution by 100-fold. We have been investigating a large number of patients with developmental delay and intellectual disability on this platform to establish microdeletion syndromes, and we have detected a number of novel CNVs of clinical relevance in the Arabian population.

Keywords: microarrays, molecular karyotyping, developmental delay, genetics

Procedia PDF Downloads 451
1091 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting

Authors: Divya Haridas, C. B. Sobhan

Abstract:

In the investigation presented here, laser interferometric methods have been utilized for the measurement of natural convection heat transfer from a heated vertical flat plate. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of the Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability in heat transfer measurements. The interference fringe patterns are analyzed digitally using the MATLAB 7 and MOTIC Plus software packages, which ensure improved efficiency in fringe analysis, hence reducing the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used.

Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, vertical flat plate

Procedia PDF Downloads 362
1090 Presenting a Model for Predicting the State of Being Accident-Prone of Passages According to Neural Network and Spatial Data Analysis

Authors: Hamd Rezaeifar, Hamid Reza Sahriari

Abstract:

Accidents are considered to be one of the challenges of modern life. Since the number of victims of this problem and the volume of internal transportation in Iran are increasing day by day, studying the factors affecting accidents and identifying suitable models and parameters for this issue are absolutely essential. The main purpose of this research has been to study the factors and spatial data affecting accidents in Mashhad during 2007-2008. In the first step, the existing accident information banks were completed by matching spatial layers on each other, relating them to the location of each accident, adding landmarks of the accident, and adding special fields on the existence or non-existence of phenomena affecting accidents. In the next step, by means of data mining tools and analysis with a neural network, the relationships among these data were evaluated and a logical model was designed for predicting accident-prone spots with minimum error. The model of this article predicts very accurately in low-accident spots, yet it has more errors in accident-prone regions due to a lack of primary data.
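A minimal sketch of the modeling stage: train a small neural network on attribute vectors extracted from the overlaid GIS layers to classify locations as accident-prone. The feature names and synthetic data are hypothetical; the paper's network architecture is not specified.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical feature matrix from overlaid GIS layers, one row per road segment:
# e.g. [traffic_volume, speed_limit, junction_density, lighting, curvature]
rng = np.random.default_rng(0)
X = rng.random((1000, 5))
y = (X[:, 0] + X[:, 2] > 1.2).astype(int)       # synthetic "accident-prone" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```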

Keywords: accident, data mining, neural network, GIS

Procedia PDF Downloads 44
1089 Evaluation of the Diagnostic Potential of IL-2 as Biomarker for the Discrimination of Active and Latent Tuberculosis

Authors: Shima Mahmoudi, Setareh Mamishi, Babak Pourakbari, Majid Marjani

Abstract:

In recent years, the potential role of distinct T-cell subsets as biomarkers of active tuberculosis (TB) and/or latent tuberculosis infection (LTBI) has been studied. The aim of this study was to investigate the potential role of interleukin-2 (IL-2) in whole blood stimulated with M. tuberculosis-specific antigens in the QuantiFERON-TB Gold In Tube (QFT-G-IT) assay for the discrimination of active and latent tuberculosis. After 72 h of stimulation with antigens from the QFT-G-IT assay, IL-2 secretion was quantitated in supernatants using ELISA (Mabtech AB, Sweden). Observing the level of IL-2 released after 72 h of incubation, we found that IL-2 levels were significantly higher in the LTBI group than in patients with active TB or the control group (P value = 0.019, Kruskal-Wallis test). The discrimination performance (assessed by the area under the ROC curve) between LTBI and patients with active TB was 0.816 (95% CI: 0.72-0.97). Maximum discrimination was reached at an IL-2 cut-off of 13.9 pg/mL following stimulation, with 82% sensitivity and 86% specificity. In conclusion, although cytokine analysis has greatly contributed to the understanding of TB pathogenesis, data on cytokine profiles that might distinguish progression from latency of TB infection are scarce and even controversial. Our data indicate that the concomitant evaluation of IFN-γ and IL-2 could be instrumental in discriminating between active and latent TB infection.
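A minimal sketch of the discrimination analysis reported above: compute the ROC AUC for IL-2 levels and the sensitivity/specificity at the 13.9 pg/mL cut-off. The concentration arrays below are placeholders, not the study data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Placeholder IL-2 concentrations (pg/mL); 1 = LTBI, 0 = active TB.
il2 = np.array([25.1, 18.4, 3.2, 40.0, 14.8, 12.2, 2.1, 30.5])
group = np.array([1, 1, 0, 1, 0, 1, 0, 1])

print("AUC:", roc_auc_score(group, il2))

cutoff = 13.9                                    # cut-off from the paper
pred = il2 >= cutoff
sensitivity = (pred & (group == 1)).sum() / (group == 1).sum()
specificity = (~pred & (group == 0)).sum() / (group == 0).sum()
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```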

Keywords: interleukin-2, discrimination, active TB, latent TB

Procedia PDF Downloads 405
1088 The Use of Modern Technologies and Computers in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar MehrAfarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, Map source, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology. Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. This data was then input into various software programs for analysis, including GIS, Map source, and Excel. The research employed both descriptive and analytical methods to present the findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, statistics generation, graphic designs, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.

Keywords: Iran, Sistan, archaeological surveys, computer use, modern technologies

Procedia PDF Downloads 76