Search results for: collocational errors
507 Fluid Structure Interaction of Flow and Heat Transfer around a Microcantilever
Authors: Khalil Khanafer
Abstract:
This study analyzes the effect of flow conditions and of geometric variations of the microcantilever's bluff body on the microcantilever's detection capabilities within a fluidic device, using a finite element fluid-structure interaction model. Such parameters include inlet velocity, flow direction, and the height of the microcantilever's supporting system within the fluidic cell. The transport equations are solved using a finite element formulation based on the Galerkin method of weighted residuals. For a flexible microcantilever, a fully coupled fluid-structure interaction (FSI) analysis is utilized, and the fluid domain is described by an Arbitrary Lagrangian-Eulerian (ALE) formulation that is fully coupled to the structure domain. The results of this study showed a profound effect of the magnitude and direction of the inlet velocity and of the height of the bluff body on the deflection of the microcantilever. The vibration characteristics were also investigated in this study. This work paves the way for researchers to design efficient microcantilevers that exhibit the least measurement error.
Keywords: fluidic cell, FSI, microcantilever, flow direction
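For context, the ALE description referred to above is commonly written as the incompressible Navier-Stokes momentum equation with the convective velocity replaced by the velocity relative to the moving mesh; this is the standard textbook form and is offered as background, not an equation quoted from the paper:

```latex
\rho\left(\left.\frac{\partial \mathbf{u}}{\partial t}\right|_{\chi}
  + \big((\mathbf{u}-\mathbf{w})\cdot\nabla\big)\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u},
\qquad \nabla\cdot\mathbf{u} = 0
```

where w is the mesh velocity: w = 0 recovers the Eulerian description and w = u the Lagrangian one.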
Procedia PDF Downloads 374
506 Mitigation of Electromagnetic Interference Generated by GPIB Control-Network in AC-DC Transfer Measurement System
Authors: M. M. Hlakola, E. Golovins, D. V. Nicolae
Abstract:
The field of instrumentation electronics is undergoing explosive growth due to its wide range of applications. Electrical devices in close working proximity can negatively influence each other's performance, and this degradation is due to electromagnetic interference (EMI). This paper investigates the negative effects of electromagnetic interference originating in the General Purpose Interface Bus (GPIB) control network of an ac-dc transfer measurement system. Remedial measures for reducing measurement errors and failures in a range of industrial devices due to EMI are explored. The ac-dc transfer measurement system was analyzed for common-mode (CM) EMI effects. The coupling path is further investigated, and the noise propagation mechanism is identified more accurately. To prevent the occurrence of common-mode ground loops, which were identified between the GPIB system control circuit and the measurement circuit, a microcontroller-driven GPIB switching isolator device was designed, prototyped, programmed and validated. This mitigation technique has been explored to reduce EMI effectively.
Keywords: CM, EMI, GPIB, ground loops
Procedia PDF Downloads 289
505 Examining How Employee Training and Development Contribute to the Favourable Results of a Business Entity: A Conceptual Analysis
Authors: Paul Saah, Charles Mbohwa, Nelson Sizwe Madonsela
Abstract:
Organisations that want a competitive edge over their rivals are becoming more and more aware of the value of staff training and development programs. This conceptual study's primary goal is to determine how staff development and training affect an organization's ability to succeed. A non-empirical methodological approach was chosen because this was a conceptual study, and a thorough literature analysis was conducted to determine the contribution of staff training and development to the performance of a commercial organization. Twenty of the 100 publications about employee training and development obtained from Google Scholar, regarded as the most pertinent, were examined for this study. The impact of employee training and development in an organization was identified and documented during the analyses. According to the study's findings, some of the major advantages of staff development and training include greater productivity, the discovery of employee potential, job satisfaction, the development of skills, less supervision, a decrease in turnover and absenteeism, and a reduction of errors and accidents. The findings show that organisations that make significant investments in the training and development of their personnel are more likely to succeed than those that do not.
Keywords: impact, employment, training and development, success, business, organization
Procedia PDF Downloads 71
504 Spatial Integrity of Seismic Data for Oil and Gas Exploration
Authors: Afiq Juazer Rizal, Siti Zaleha Misnan, M. Zairi M. Yusof
Abstract:
Seismic data is the fundamental tool utilized by exploration companies to identify potential hydrocarbon accumulations. However, the importance of seismic trace data will be undermined unless the geo-spatial component of the data is understood. Deriving a proposed well from data that has positional ambiguity will jeopardize business decisions and the millions of dollars of investment that every oil and gas company would like to protect. A spatial integrity QC workflow has been introduced in PETRONAS to ensure positional errors within seismic data are recognized throughout the exploration lifecycle, from acquisition through processing to seismic interpretation. This includes, amongst other tests, verifying that the data is referenced to the appropriate coordinate reference system, validating the survey configuration, and verifying the geometry loading. The direct outcome of the workflow implementation is improved reliability and integrity of the sub-surface geological models produced by geoscientists, and it provides important input to potential hazard assessments where positional accuracy is crucial. This workflow's development initiative is part of a bigger geospatial integrity management effort, whereby nearly eighty percent of oil and gas data are location-dependent.
Keywords: oil and gas exploration, PETRONAS, seismic data, spatial integrity QC workflow
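As a minimal illustration of the kind of coordinate-reference-system check such a workflow performs, the sketch below reprojects a header coordinate and compares it with a surveyed benchmark. The EPSG codes, tolerance and function names are hypothetical; this is not PETRONAS's actual implementation.

```python
import math
from pyproj import Transformer

DECLARED_CRS = "EPSG:4326"   # CRS declared in the navigation header (assumed)
PROJECT_CRS = "EPSG:32647"   # CRS the interpretation project expects (assumed)
TOLERANCE_M = 1.0            # acceptable positional error in metres (assumed)

def check_benchmark(lon, lat, true_easting, true_northing):
    """Reproject a header coordinate and compare it with the surveyed truth."""
    transformer = Transformer.from_crs(DECLARED_CRS, PROJECT_CRS, always_xy=True)
    e, n = transformer.transform(lon, lat)
    error = math.hypot(e - true_easting, n - true_northing)
    return error <= TOLERANCE_M, error

# Usage: ok, err = check_benchmark(lon, lat, surveyed_easting, surveyed_northing)
```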
Procedia PDF Downloads 227
503 The Gap between Elite Catholic Education and Inclusive Education
Authors: Viktorija Voidogaitė
Abstract:
Catholic education is based on the belief that every human being is created in the image and likeness of God. It is also influenced by the idea that the Kingdom of Heaven belongs to the humble and vulnerable. These principles emphasize the importance of serving the most vulnerable members of the Church community and promoting inclusivity without discrimination, protecting the weakest members with compassion. However, realizing such an ideal in practice proves challenging, as the shortcomings and errors prevalent in any society often stem from the actions of Christians within that society. The evolution of these connections can be observed throughout the historical development of Catholic education. In some European countries, Catholic education has become elitist, with limited room for inclusivity. This creates a conspicuous gap between the principles of the Evangelical community and elite Catholic schools and gymnasiums. Some schools appear most inclined to educate only those students who best align with their profile, leaving those needing assistance on the margins. As we advance into the third decade of the 21st century, a fundamental question emerges: whether education is forming individuals who can assist the underprivileged and the infirm. It remains open whether such individuals will also possess the willingness and capability to construct a community or society that is inclusive and accessible to all.
Keywords: inclusion, Catholic education, inclusive education, becoming
Procedia PDF Downloads 65
502 Numerical Method of Heat Transfer in Fin Profiles
Authors: Beghdadi Lotfi, Belkacem Abdellah
Abstract:
In this work, a numerical method is proposed to solve thermal performance problems of heat transfer from fin surfaces. The two-dimensional temperature distribution on the longitudinal section of the fin is calculated by resorting to the finite volume method. The heat flux dissipated by a generic fin profile is compared with the heat flux removed by a rectangular-profile fin of the same length and volume. In this study, a finite volume method for quadrilateral unstructured meshes is developed to predict the two-dimensional steady-state solutions of the conduction equation, in order to determine the sinusoidal parameter values that optimize fin effectiveness. In this scheme, based on integration around the polygonal control volume, the derivatives in the conduction equation must be converted into closed line integrals using a formulation analogous to the Stokes theorem. The numerical results show good agreement with analytical results. To demonstrate the accuracy of the method, the absolute and root-mean-square errors versus the grid size are examined quantitatively.
Keywords: Stokes theorem, unstructured grid, heat transfer, complex geometry
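The conversion referred to is the standard divergence/Green identity; written for steady two-dimensional conduction with conductivity k over a polygonal control volume Ω (notation assumed here, not quoted from the paper):

```latex
\iint_{\Omega} \nabla\cdot\!\left(k\,\nabla T\right)\,\mathrm{d}A
  = \oint_{\partial\Omega} k\,\nabla T\cdot\mathbf{n}\,\mathrm{d}s = 0
```

so the flux balance of each cell is evaluated as a sum of line integrals over its edges.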
Procedia PDF Downloads 406
501 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification
Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang
Abstract:
This paper focuses on the classification of breast ultrasound images and investigates the reliability measurement of classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned and doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the breast ultrasound clinical dataset YBUS, and its robustness has been verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI
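The expected calibration error (ECE) mentioned above is a standard metric; a minimal sketch of its usual binned form follows (the binning scheme and names are generic, not the paper's):

```python
import numpy as np

def expected_calibration_error(confidences, predictions, labels, n_bins=10):
    """Binned ECE: weighted mean gap between confidence and accuracy per bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = (np.asarray(predictions) == np.asarray(labels)).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap  # bin weight = fraction of samples in bin
    return ece

# Example: well-calibrated predictions yield a small ECE.
print(expected_calibration_error([0.9, 0.8, 0.6, 0.95], [1, 0, 1, 1], [1, 0, 0, 1]))
```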
Procedia PDF Downloads 103
500 Velocity Logs Error Reduction for In-Service Calibration of Vessel Performance Indicators
Authors: Maria Tsompanoglou, Dimitris Armenis
Abstract:
Vessel behavior in different operational and weather conditions constitutes the main area of interest for the ship operator. Ship speed and fuel consumption are the most decisive parameters in this respect, as their correlation provides information about the economic and environmental efficiency of the vessel, forming the basis of decision making for maintenance and trading. In analyses of a vessel's operational profile for the evaluation of fuel consumption and the equivalent CO2 emissions footprint, Speed Through Water indications are widely used. The seasonal and regional variations in seawater characteristics, for which data are available nowadays, can provide the basis for accurate estimation of the errors in Speed Through Water indications at any time. Accurate speed values on a route basis enable the operator to identify the ship's fuel and propulsion efficiency and proceed with improvements. This paper discusses case studies in which the actual vessel speed was corrected by a post-processing algorithm. The effects of the speed correction on standard Key Performance Indicators, as well as operational findings not identified earlier, are also discussed.
Keywords: data analytics, MATLAB, vessel performance monitoring, speed through water
Procedia PDF Downloads 302
499 Scalable CI/CD and Scalable Automation: Assisting in Optimizing Productivity and Fostering Delivery Expansion
Authors: Solanki Ravirajsinh, Kudo Kuniaki, Sharma Ankit, Devi Sherine, Kuboshima Misaki, Tachi Shuntaro
Abstract:
In software development life cycles, the absence of scalable CI/CD significantly impacts organizations, leading to increased overall maintenance costs, prolonged release delivery times, heightened manual effort, and difficulties in meeting tight deadlines. Implementing CI/CD with standard serverless technologies using cloud services overcomes all the above-mentioned issues and helps organizations improve efficiency and deliver faster without the need to manage server maintenance and capacity. By integrating scalable CI/CD with scalable automation testing, productivity, quality, and agility are enhanced while reducing repetitive work and manual effort. Implementing scalable CI/CD for development using cloud services such as ECS (Elastic Container Service), AWS Fargate, ECR (to store Docker images with all dependencies), serverless computing (serverless virtual machines), cloud logging (for monitoring errors and logs), security groups (for inside/outside access to the application), Docker containerization (Docker-based images and container techniques), Jenkins (CI/CD build management tool), and code management tools (GitHub, Bitbucket, AWS CodeCommit) can efficiently handle the demands of diverse development environments, accommodate dynamic workloads, and increase efficiency for faster delivery with good quality. CI/CD pipelines encourage collaboration among development, operations, and quality assurance teams by providing a centralized platform for automated testing, deployment, and monitoring. Scalable CI/CD streamlines the development process by automatically fetching the latest code from the repository every time the process starts, building the application based on the branches, testing the application using a scalable automation testing framework, and deploying the builds. Developers can focus more on writing code and less on managing infrastructure, as it scales based on need. Serverless CI/CD eliminates the need to manage and maintain traditional CI/CD infrastructure, such as servers and build agents, reducing operational overhead and allowing teams to allocate resources more efficiently. Scalable CI/CD adjusts the application's scale according to usage, thereby alleviating concerns about scalability, maintenance costs, and resource needs. Creating scalable automation testing using cloud services (ECR, ECS Fargate, Docker, EFS, serverless computing) helps organizations run more than 500 test cases in parallel, aiding in the detection of race conditions and performance issues while reducing execution time. Scalable CI/CD offers flexibility, dynamically adjusting to varying workloads and demands, allowing teams to scale resources up or down as needed. It optimizes costs by paying only for the resources that are used, and it increases reliability. Scalable CI/CD pipelines employ automated testing and validation processes to detect and prevent errors early in the development cycle.
Keywords: achieve parallel execution, cloud services, scalable automation testing, scalable continuous integration and deployment
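As an illustration of the fan-out pattern described here, launching test shards as Fargate tasks, the following is a minimal sketch using boto3. The cluster, task definition, subnet, and container names are placeholders, not taken from the paper:

```python
import boto3

ecs = boto3.client("ecs")  # credentials and region come from the environment

def launch_test_shards(n_shards):
    """Start one Fargate task per test shard; each container reads SHARD to pick its tests."""
    arns = []
    for shard in range(n_shards):
        resp = ecs.run_task(
            cluster="test-cluster",             # placeholder cluster name
            taskDefinition="ui-test-suite:1",   # placeholder task definition
            launchType="FARGATE",
            count=1,
            networkConfiguration={
                "awsvpcConfiguration": {
                    "subnets": ["subnet-0123456789abcdef0"],  # placeholder subnet
                    "assignPublicIp": "ENABLED",
                }
            },
            overrides={
                "containerOverrides": [
                    {"name": "tests",  # placeholder container name
                     "environment": [{"name": "SHARD", "value": str(shard)}]}
                ]
            },
        )
        arns.extend(task["taskArn"] for task in resp["tasks"])
    return arns
```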
Procedia PDF Downloads 48
498 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on time t. The Brownian motion and jump uncertainties are represented by a piecewise constant integral function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that the approach generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
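For reference, the stochastic model being approximated is usually written as the Hull-White short-rate SDE extended with a jump term; the form and notation below are the standard ones (the paper replaces the stochastic drivers with the deterministic functions w(t) and θ(t)):

```latex
\mathrm{d}r(t) = \big(\eta(t) - a\,r(t)\big)\,\mathrm{d}t
  + \sigma\,\mathrm{d}W(t) + \mathrm{d}J(t)
```

where a is the mean-reversion speed, η(t) the time-dependent drift level, σ the volatility, W a Brownian motion, and J a compound Poisson jump process.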
Procedia PDF Downloads 161
497 Fault-Detection and Self-Stabilization Protocol for Wireless Sensor Networks
Authors: Ather Saeed, Arif Khan, Jeffrey Gosper
Abstract:
Sensor devices are prone to errors and sudden node failures, which are difficult to detect in a timely manner when deployed in real time in hazardous, large-scale, harsh environments and in medical emergencies. The loss of data can therefore be life-threatening when the sensed phenomenon is not disseminated due to sudden node failure, battery depletion or temporary malfunctioning. We introduce a set of partial differential equations for localizing faults, similar to Green's and Maxwell's equations used in electrostatics and electromagnetism, together with a node organization and clustering scheme for self-stabilizing sensor networks. Green's theorem is applied to regions whose boundary curve is closed and continuously differentiable to ensure network connectivity. Experimental results show that the proposed GTFD (Green's Theorem fault-detection and self-stabilization) protocol not only detects faulty nodes but also accurately generates network stability graphs where urgent intervention is required for dynamically self-stabilizing the network.
Keywords: Green's theorem, self-stabilization, fault-localization, RSSI, WSN, clustering
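For context, Green's theorem relates a circulation around a closed, piecewise-smooth boundary to an area integral over the enclosed region; this is the standard statement (how the protocol instantiates L and M is not detailed in the abstract):

```latex
\oint_{\partial D} \left(L\,\mathrm{d}x + M\,\mathrm{d}y\right)
  = \iint_{D} \left(\frac{\partial M}{\partial x}
  - \frac{\partial L}{\partial y}\right)\mathrm{d}A
```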
Procedia PDF Downloads 77
496 Incomplete Existing Algebra to Support Mathematical Computations
Authors: Ranjit Biswas
Abstract:
The existing subject of algebra is incomplete for supporting the mathematical computations done by scientists of all areas: mathematics, physics, statistics, chemistry, space science, cosmology, etc., even starting from the era of the great Einstein. A huge hidden gap in the subject 'Algebra' is unearthed. All scientists today, including mathematicians, physicists, chemists, statisticians, cosmologists, space scientists, and economists, have been lucky to obtain results without facing contradictions or computational errors. Most surprising is that the results of all scientists, including Nobel Prize winners, were also confirmed by experiments. But in this paper, it is rigorously argued that they all were lucky. An algebraist can define an infinite number of new algebraic structures. The objective of this work is not to define a distinct algebraic structure for its own sake, but to recognize and identify a major gap of the subject 'Algebra' lying hidden so far in its existing vast literature, and to fix that gap. Consequently, a different algebraic structure called 'Region' is introduced, and its properties are studied.
Keywords: region, ROR, RORR, region algebra
Procedia PDF Downloads 54
495 A Peg Board with Photo-Reflectors to Detect Peg Insertion and Pull-Out Moments
Authors: Hiroshi Kinoshita, Yasuto Nakanishi, Ryuhei Okuno, Toshio Higashi
Abstract:
Various kinds of pegboards have been developed and used widely in rehabilitation research and clinics for the evaluation and training of patients' hand function. A common measure on these pegboards is the total execution time assessed with a tester's stopwatch; the introduction of electrical and automatic measurement technology to the apparatus, on the other hand, has been delayed. The present work introduces the development of a pegboard with electric sensors to detect the moments of each peg's insertion and removal, and reports fundamental data obtained from a group of healthy young individuals who performed peg transfer tasks using the developed pegboard. Through trial and error in pilot tests, two 10-hole pegboard boxes were designed and built by the present authors, with a small photo-reflector and a DC amplifier installed at the bottom of each hole. The amplified analogue signals from the 20 reflectors were automatically digitized at 500 Hz per channel and stored on a PC. The boxes were set on a test table at different distances (25, 50, 75, and 125 mm) in parallel to examine the effect of hole-to-hole distance. Fifty healthy young volunteers (25 of each gender) performed 80 successive fast peg transfers at each distance using their dominant and non-dominant hands. The data gathered showed clear-cut light interruption/continuation moments caused by the pegs, allowing the pull-out and insertion times of each peg to be determined accurately (no tester error involved) and precisely (to the order of milliseconds). This further permitted computation of individual peg movement duration (PMD: from peg lift-off to insertion) separately from hand reaching duration (HRD: from peg insertion to lift-off). An accidental drop of a peg led to an exceptionally long (> mean + 3 SD) PMD, which was readily detected from an examination of the data distribution. The PMD data were commonly right-skewed, suggesting that the median can be a better estimate of individual PMD than the mean. Repeated-measures ANOVA using the median values revealed significant hole-to-hole distance and hand dominance effects, suggesting that these need to be fixed for an accurate evaluation of PMD; the gender effect was non-significant. Performance consistency was also evaluated using quartile variation coefficient values, which revealed no gender, hole-to-hole distance, or hand dominance effects. Measurement reliability was further examined using intraclass correlation obtained from 14 subjects who performed the 25 and 125 mm hole-distance tasks in two test sessions 7-10 days apart. Intraclass correlation values between the two tests showed fair reliability for PMD (0.65-0.75) and for HRD (0.77-0.94). We concluded that the sensor pegboard developed in the present study can provide accurate (excluding tester errors) and precise (millisecond-order) timing information on peg movement, separated from that of hand movement, and can easily detect and automatically exclude erroneous execution data from a subject's standard data. This should lead to a better evaluation of hand dexterity function compared to the widely used conventional pegboards.
Keywords: hand, dexterity test, peg movement time, performance consistency
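A minimal sketch of how insertion/pull-out moments and PMD statistics could be extracted from such digitized photo-reflector signals (the threshold, signal polarity and array layout are assumptions for illustration, not the authors' actual code):

```python
import numpy as np

FS = 500.0  # sampling rate per channel (Hz), as stated in the abstract

def pmd_statistics(signals, thresh=0.5):
    """signals: (n_samples, 20) array; assumes a high level while a peg is present.

    Returns (median PMD, quartile variation coefficient) in seconds.
    """
    events = []  # (time_s, +1 insertion / -1 pull-out)
    for ch in range(signals.shape[1]):
        binary = (signals[:, ch] > thresh).astype(int)
        for i in np.flatnonzero(np.diff(binary)):
            events.append((i / FS, +1 if binary[i + 1] == 1 else -1))
    events.sort()
    # PMD = pull-out of one hole followed by insertion into another
    pmds = np.array([t1 - t0 for (t0, k0), (t1, k1) in zip(events, events[1:])
                     if k0 == -1 and k1 == +1])
    pmds = pmds[pmds < pmds.mean() + 3 * pmds.std()]  # drop accidental peg drops
    q1, q3 = np.percentile(pmds, [25, 75])
    return np.median(pmds), (q3 - q1) / (q3 + q1)
```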
Procedia PDF Downloads 134
494 Variations of the Modal Characteristics of the Feeding Stage with Different Preloaded Linear Guide
Authors: Jui-Pui Hung, Yong-Run Chen, Wei-Cheng Shih, Chun-Wei Lin
Abstract:
This study aimed to assess the variations in the modal characteristics of a feeding stage with different linear guide modules. The dynamic characteristics of the feeding stage were characterized in terms of modal stiffness, modal frequency and modal damping, as assessed from vibration tests. According to the experimental measurements, the actual preload of the linear guide modules was found to deviate from the rated values set in the factory, possibly due to assembly errors of the guide modules. For a stage with linear guides, the dynamic stiffness changes with the preload applied to the rolling balls. The variations in dynamic stiffness at the first and second modes are 20.8% and 10.5%, respectively, when the linear guide preload is adjusted from medium to high, while the modal damping ratio is reduced by 8.97% and 9.65%, respectively. For the high-frequency mode, the modal stiffness increases by 171.2% and the damping ratio is reduced by 34.4%. The current results demonstrate the importance of determining the preload amount of linear guide modules in practical applications.
Keywords: contact stiffness, feeding stage, linear guides, modal characteristics, pre-load
Procedia PDF Downloads 430
493 Enhance Security in XML Databases: XLog File for Severity-Aware Trust-Based Access Control
Authors: A. Asmawi, L. S. Affendey, N. I. Udzir, R. Mahmod
Abstract:
Enhancing security in XML databases is important, as it includes protecting sensitive data and providing a secure environment to users. In order to improve security and provide dynamic access control for XML databases, we present the XLog file, which is used to calculate user trust values by recording users' bad transactions, errors and query severities. Severity-aware trust-based access control for XML databases manages the access policy depending on users' trust values and prevents unauthorized processes, malicious transactions and insider threats. Privileges are automatically modified and adjusted over time depending on user behaviour and query severity. Logging in databases is an important process used for recovery and security purposes. In this paper, the XLog file is presented as a dynamic and temporary log file for XML databases to enhance the level of security.
Keywords: XML database, trust-based access control, severity-aware, trust values, log file
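A minimal sketch of the kind of severity-weighted trust update such a scheme implies; the penalty values, recovery rate and threshold below are invented for illustration, since the paper's actual formulas are not given in the abstract:

```python
# Hypothetical severity penalties; the real weighting is not specified in the abstract.
PENALTY = {"low": 0.02, "medium": 0.08, "high": 0.20}
THRESHOLD = 0.5   # minimum trust required to run sensitive queries (assumed)
RECOVERY = 0.01   # small credit per clean transaction (assumed)

class TrustLedger:
    def __init__(self):
        self.trust = {}  # user -> trust value in [0, 1]

    def record(self, user, ok, severity="low"):
        """Log one transaction and adjust the user's trust value."""
        t = self.trust.get(user, 1.0)
        t = min(1.0, t + RECOVERY) if ok else max(0.0, t - PENALTY[severity])
        self.trust[user] = t

    def allowed(self, user):
        return self.trust.get(user, 1.0) >= THRESHOLD

ledger = TrustLedger()
ledger.record("alice", ok=False, severity="high")  # one severe bad transaction
print(ledger.trust["alice"], ledger.allowed("alice"))
```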
Procedia PDF Downloads 300
492 A Comparison of Bias Among Relaxed Divisor Methods Using 3 Bias Measurements
Authors: Sumachaya Harnsukworapanich, Tetsuo Ichimori
Abstract:
Apportionment methods are used by many countries to calculate the distribution of seats in political bodies. For example, the United States (U.S.) uses such a method to distribute House seats proportionally based on the population of each electoral district. Famous apportionment methods include the divisor methods: the Adams, Dean, Hill, Jefferson and Webster methods. Sometimes the results produced by these divisor methods are unfair or contain errors, so it is important to examine their optimization by using bias measurements to obtain precise and fair results. In this research, we investigate the bias of divisor methods for the U.S. House of Representatives toward large and small states by applying the Stolarsky mean method. We compare the bias of the apportionment methods using two famous bias measurements, the Balinski and Young measurement and the Ernst measurement, both of which have a formula distinguishing large and small states. A third measurement, created by the researchers, does not factor large and small states into the formula. All three measurements are compared, and the results show that our measurement produces results similar to the other two famous measurements.
Keywords: apportionment, bias, divisor, fair, measurement
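For readers unfamiliar with divisor (highest-averages) methods, a minimal sketch: each method differs only in its divisor function d(s), and seats are assigned one at a time to the state with the highest priority population/d(seats so far). The populations below are made up:

```python
import heapq

# Divisor functions for three classic methods: priority = population / d(seats).
D = {
    "adams":     lambda s: s,        # rounds up (epsilon below handles d(0) = 0)
    "webster":   lambda s: s + 0.5,  # rounds to nearest
    "jefferson": lambda s: s + 1,    # rounds down
}

def apportion(populations, house_size, method="webster"):
    d = D[method]
    seats = {state: 0 for state in populations}
    heap = [(-pop / max(d(0), 1e-9), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(house_size):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        heapq.heappush(heap, (-populations[state] / d(seats[state]), state))
    return seats

pops = {"A": 5_300_000, "B": 2_100_000, "C": 630_000}  # made-up populations
print(apportion(pops, 10, "webster"))
```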
Procedia PDF Downloads 366
491 Using Optimal Control Method to Investigate the Stability and Transparency of a Nonlinear Teleoperation System with Time Varying Delay
Authors: Abasali Amini, Alireza Mirbagheri, Amir Homayoun Jafari
Abstract:
In this paper, a new structure for teleoperation systems with time-varying delay is modeled and proposed. A random time-varying delay of up to 150 ms is simulated in the teleoperation channel, both from master to slave and vice versa. The system's stability and transparency are investigated by comparing the results of a PID controller and an optimal controller on the master and slave subsystems separately. One controller is designed in the slave subsystem to reduce position errors between master and slave, and another controller is designed in the master subsystem to establish stability, transparency and force tracking. The results are compared, and they show that the PID controller is appropriate for position tracking, but its force response oscillates in contact with the environment. We show that the optimal controller establishes position tracking properly, and force tracking is also achieved appropriately with this controller.
Keywords: optimal control, time varying delay, teleoperation systems, stability and transparency
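For reference, the discrete PID law used as the baseline comparison has the standard form below; the gains and sampling time are illustrative, not the paper's tuned values:

```python
class PID:
    """Discrete PID: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Illustrative use: drive the slave position toward the delayed master position.
pid = PID(kp=8.0, ki=0.5, kd=0.2, dt=0.001)  # made-up gains
u = pid.update(error=0.05)                    # position error in metres
```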
Procedia PDF Downloads 257
490 An Image Segmentation Algorithm for Gradient Target Based on Mean-Shift and Dictionary Learning
Authors: Yanwen Li, Shuguo Xie
Abstract:
In electromagnetic imaging, because the system is diffraction-limited, pixel values change slowly near the edges of image targets and also vary with location within the same target. Using traditional digital image segmentation methods to segment electromagnetic gradient images can therefore produce many errors. To address this issue, this paper proposes a novel image segmentation and extraction algorithm based on Mean-Shift and dictionary learning. First, the preliminary segmentation results from an adaptive-bandwidth Mean-Shift algorithm are expanded, merged and extracted. Then the overlap rate of each extracted image block is checked before determining a segmentation region containing a single complete target. Finally, the gradient edges of the extracted targets are recovered and reconstructed using a dictionary-learning algorithm, yielding final segmentation results that are very close to the gradient target in the original image. Both the experimental and the simulated results show that the segmentation results are very accurate: the Dice coefficients are improved by 70% to 80% compared with the Mean-Shift-only method.
Keywords: gradient image, segmentation and extraction, mean-shift algorithm, dictionary learning
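A minimal sketch of two standard ingredients named here, Mean-Shift clustering of joint position/intensity features and the Dice overlap score (the feature scaling and parameters are assumptions, not the authors' settings):

```python
import numpy as np
from sklearn.cluster import MeanShift, estimate_bandwidth

def mean_shift_segment(image, spatial_weight=0.1):
    """Cluster (x, y, intensity) features of a 2-D grayscale image."""
    ys, xs = np.indices(image.shape)
    X = np.column_stack([xs.ravel() * spatial_weight,
                         ys.ravel() * spatial_weight,
                         image.ravel()])
    bw = estimate_bandwidth(X, quantile=0.1, n_samples=500)  # adaptive bandwidth
    labels = MeanShift(bandwidth=bw, bin_seeding=True).fit(X).labels_
    return labels.reshape(image.shape)

def dice(seg, truth):
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    seg, truth = seg.astype(bool), truth.astype(bool)
    return 2.0 * np.logical_and(seg, truth).sum() / (seg.sum() + truth.sum())
```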
Procedia PDF Downloads 267
489 Inadequate Requirements Engineering Process: A Key Factor for Poor Software Development in Developing Nations: A Case Study
Authors: K. Adu Michael, K. Alese Boniface
Abstract:
Developing reliable and sustainable software products is today a big challenge among up-and-coming software developers in Nigeria. The comprehensive problem statement needed to execute a proper requirements engineering process is often missing. Describing the 'what' of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze and validate system requirements, and it is needed to reduce software errors at the early stages of software development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations from the new system. This paper identifies inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary-education institution in Nigeria.
Keywords: client/customer, problem statement, requirements engineering, software developers
Procedia PDF Downloads 408
488 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection
Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón
Abstract:
Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques; in particular, UAV photogrammetry allows time and cost savings, and the development of this technology has permitted the use of low-cost thermal sensors on UAVs. The representation of 3D thermal models with this type of equipment is in continuous evolution. Direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves processing visible Red-Green-Blue (RGB) and thermal images in parallel: the RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The resulting representations of mining/industrial facilities can be used for inspection activities.
Keywords: aerial thermography, data processing, drone, low-cost, point cloud
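A minimal sketch of the projection step, mapping per-point temperatures onto the RGB-derived geometry by nearest-neighbour lookup between co-registered point clouds (the matching radius and the assumption that both clouds share one coordinate frame are mine, not the paper's):

```python
import numpy as np
from scipy.spatial import cKDTree

def project_temperatures(rgb_xyz, thermal_xyz, thermal_temp, max_dist=0.05):
    """Assign each RGB-model point the temperature of its nearest thermal point.

    rgb_xyz: (N, 3) geometry from RGB photogrammetry
    thermal_xyz: (M, 3) thermal point cloud, same coordinate frame
    thermal_temp: (M,) temperatures (e.g. degrees C)
    """
    dist, idx = cKDTree(thermal_xyz).query(rgb_xyz, k=1)
    temps = thermal_temp[idx].astype(float)
    temps[dist > max_dist] = np.nan  # no thermal sample close enough
    return temps
```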
Procedia PDF Downloads 145
487 Statistical Classification, Downscaling and Uncertainty Assessment for Global Climate Model Outputs
Authors: Queen Suraajini Rajendran, Sai Hung Cheung
Abstract:
Statistical downscaling models are required to connect global climate model outputs with local weather variables for climate change impact prediction. For reliable climate change impact studies, the uncertainty associated with the model, including natural variability, uncertainty in the climate model(s) and the downscaling model, model inadequacy, and uncertainty in the predicted results, should be quantified appropriately. In this work, a new approach is developed by the authors for statistical classification, statistical downscaling and uncertainty assessment, and is applied to Singapore rainfall. It is a robust Bayesian uncertainty analysis methodology, with tools based on coupling dependent modeling errors with the classification and statistical downscaling models, such that the dependency among modeling errors affects both the calibration of the classification and statistical downscaling models and the uncertainty analysis for future prediction. Singapore data are considered here, and the uncertainty and prediction results are obtained. From the results obtained, directions of research for improvement are briefly presented.
Keywords: statistical downscaling, global climate model, climate change, uncertainty
Procedia PDF Downloads 371
486 Assisted Video Colorization Using Texture Descriptors
Authors: Andre Peres Ramos, Franklin Cesar Flores
Abstract:
Colorization is the process of adding color to a monochromatic image or video. Usually, the process involves segmenting the image into regions of interest and then applying colors to each one; for videos, this process is repeated for each frame, which makes it a tedious and time-consuming job. We propose a new assisted method for video colorization in which the user only has to colorize one frame, and the colors are then propagated to the following frames. The user can intervene at any time to correct eventual errors in color assignment. The method consists of extracting intensity and texture descriptors from the frames and then performing feature matching to determine the best color for each segment. To reduce computation time and give better spatial coherence, we narrow the search area and assign weights to each feature to emphasize the texture descriptors. To give a more natural result, we use an optimization algorithm for the color propagation. Experimental results on several image sequences, compared to other existing methods, demonstrate that the proposed method performs better colorization with less time and user interference.
Keywords: colorization, feature matching, texture descriptors, video segmentation
Procedia PDF Downloads 162
485 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine
Authors: Adriana Haulica
Abstract:
Powered by machine learning, precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of machine learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in molecular biology, bioinformatics, computational biology, and precise medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of artificial intelligence in precise medicine. Current machine learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules, and the loss of information arising from these classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept for information processing in precise medicine that delivers diagnoses and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, as well as technical databases, natural language processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new and tailored for omics and clinical data. First, the intrinsic biological intuition differs from the well-known 'needle in a haystack' approach usually taken when machine learning algorithms process differential genomic or molecular data to find biomarkers. Also, even though the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. Instead, it deciphers the biological meaning of input data down to the metabolic and physiologic mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until bio-logical operations can be performed on the basis of the 'common denominator' rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical 'proofs'. The major impact of this architecture is its high diagnostic accuracy. Expressed as a multiple-conditions diagnosis, constituted by some main diseases along with unhealthy biological states, this format is highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture would be highly beneficial for the healthcare industry; the expectation is to generate a strategic trend in precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures, and it will also contribute to better design of clinical trials and speed them up.
Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics
Procedia PDF Downloads 70
484 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting
Authors: Divya Haridas, C. B. Sobhan
Abstract:
In the investigation presented here, laser interferometric methods are utilized to measure natural convection heat transfer from a heated vertical flat plate. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of the Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability for heat transfer measurements. The interference fringe patterns are analyzed digitally using MATLAB 7 and MOTIC Plus software, which ensures improved efficiency in fringe analysis and reduces the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used.
Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, vertical flat plate
Procedia PDF Downloads 365
483 Presenting a Model for Predicting the State of Being Accident-Prone of Passages According to Neural Network and Spatial Data Analysis
Authors: Hamd Rezaeifar, Hamid Reza Sahriari
Abstract:
Accidents are considered to be one of the challenges of modern life. Since the number of victims and the volume of internal transportation are increasing day by day in Iran, studying the effective factors of accidents and identifying suitable models and parameters for this issue are absolutely essential. The main purpose of this research has been to study the factors and spatial data affecting accidents in Mashhad during 2007-2008. By matching spatial layers onto each other and relating them to accident locations, the existing accident information banks were first completed by adding accident landmarks and special fields recording the existence or non-existence of phenomena affecting accidents. In the next step, data mining tools and neural network analysis were used to evaluate the relationships among these data and to design a logical model for predicting accident-prone spots with minimum error. The model of this article gives very accurate predictions for low-accident spots, yet it shows larger errors in accident-prone regions due to the lack of primary data.
Keywords: accident, data mining, neural network, GIS
Procedia PDF Downloads 48
482 The Use of Modern Technologies and Computers in the Archaeological Surveys of Sistan in Eastern Iran
Authors: Mahyar MehrAfarin
Abstract:
The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period.
Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation.
Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, MapSource, and Excel, were utilized to collect information and analyze data.
Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology.
Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then input into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively.
Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility.
Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, statistics generation, graphic designs, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province.
Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.
Keywords: Iran, Sistan, archaeological surveys, computer use, modern technologies
Procedia PDF Downloads 82
481 Fall Avoidance Control of Wheeled Inverted Pendulum Type Robotic Wheelchair While Climbing Stairs
Authors: Nan Ding, Motoki Shino, Nobuyasu Tomokuni, Genki Murata
Abstract:
The wheelchair is the major means of transport for physically disabled people, yet it cannot overcome architectural barriers such as curbs and stairs. In this paper, the authors propose a method to prevent a wheeled inverted-pendulum-type robotic wheelchair from falling while climbing stairs. The problem with this system is that the feedback gain of the wheels cannot be set high, due to modeling errors and gear backlash, which results in unwanted wheel movement; the wheels then slide down the stairs or collide with the side of the stairs, and finally the wheelchair falls down. To avoid falling, the authors propose a slider control strategy based on a skyhook model in order to decrease the movement of the wheels, and a rotary link control strategy based on the staircase dimensions in order to avoid collision or sliding down. The effectiveness of the proposed fall avoidance control strategy was validated by ODE simulations and the prototype wheelchair.
Keywords: EPW, fall avoidance control, skyhook, wheeled inverted pendulum
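For context, the classical skyhook damping law referenced here commands a force proportional to the absolute (inertial) velocity of the suspended body, as if a damper were attached to a fixed point in the sky; this is the standard textbook form, as the paper's actual gains and state variables are not given in the abstract:

```latex
F_{\mathrm{sky}} = -\,c_{\mathrm{sky}}\,\dot{x}_{\mathrm{body}}
```

where c_sky is the skyhook damping coefficient and ẋ_body the absolute velocity of the controlled body.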
Procedia PDF Downloads 334
480 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach
Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann
Abstract:
Generating automatic image descriptions through natural language is a challenging task. Image captioning is the task of describing an image by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: convolutional neural networks (CNNs) extract the characteristics of the images, and recurrent neural networks (RNNs) generate the descriptive sentences of the images. However, cutting-edge approaches still suffer from generating incorrect captions and accumulating errors in the decoders. To solve this problem, we propose a model based on the encoder-decoder structure that introduces a module generating weights according to each word's importance in forming the sentence, using its part-of-speech (PoS) tag. The results demonstrate that our model surpasses state-of-the-art models.
Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech
Procedia PDF Downloads 103
479 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing, that requires the replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework for designing a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives, considering the feedback received from experts and program users.
Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
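A minimal sketch of the two techniques as they are typically combined, AHP weights from a pairwise comparison matrix followed by a TOPSIS ranking; the matrices below are toy data, not the study's expert judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criterion weights = normalized principal eigenvector of the pairwise matrix."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def topsis(decision, weights, benefit):
    """Rank alternatives by closeness to the ideal solution.

    decision: (alternatives x criteria) matrix; benefit[j] is True if higher is better.
    """
    norm = decision / np.linalg.norm(decision, axis=0)
    v = norm * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)  # higher closeness = better alternative

w = ahp_weights([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])  # toy pairwise judgments
scores = topsis(np.array([[7.0, 4.0, 30.0], [6.0, 8.0, 25.0]]), w,
                benefit=np.array([True, True, False]))    # third criterion is a cost
print(w, scores)
```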
Procedia PDF Downloads 113
478 Comparison of Different Intraocular Lens Power Calculation Formulas in People With Very High Myopia
Authors: Xia Chen, Yulan Wang
Abstract:
Purpose: To compare the accuracy of the Haigis, SRK/T, T2, Holladay 1, Hoffer Q, Barrett Universal II, Emmetropia Verifying Optical (EVO) and Kane formulas for intraocular lens power calculation in patients with axial length (AL) ≥ 28 mm. Methods: In this retrospective single-center study, 50 eyes of 41 patients with AL ≥ 28 mm that underwent uneventful cataract surgery were enrolled. The actual postoperative refractive results were compared to the predicted refractions calculated with the different formulas (Haigis, SRK/T, T2, Holladay 1, Hoffer Q, Barrett Universal II, EVO and Kane), and the mean absolute prediction errors (MAE) at 1 month postoperatively were compared. Results: The MAEs of the formulas were as follows: Haigis (0.509), SRK/T (0.705), T2 (0.999), Holladay 1 (0.714), Hoffer Q (0.583), Barrett Universal II (0.552), EVO (0.463) and Kane (0.441). No significant difference was found among the formulas (P = .122). The Kane and EVO formulas achieved the lowest mean prediction error (PE) and median absolute error (MedAE) (p < 0.05). Conclusion: The Kane and EVO formulas had a better success rate than the others in predicting IOL power in highly myopic eyes with AL longer than 28 mm in this study.
Keywords: cataract, power calculation formulas, intraocular lens, long axial length
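For clarity, the error metrics compared above are computed from the difference between the actual and formula-predicted postoperative refraction; a minimal sketch (variable names and the toy data are mine):

```python
import numpy as np

def refraction_errors(actual_se, predicted_se):
    """PE, MAE and MedAE from actual vs predicted spherical equivalent (dioptres)."""
    pe = np.asarray(actual_se) - np.asarray(predicted_se)  # signed prediction error
    return {"PE": pe.mean(),
            "MAE": np.abs(pe).mean(),
            "MedAE": np.median(np.abs(pe))}

# Toy example with made-up refractions for one formula:
print(refraction_errors([-1.25, -0.50, -2.00], [-1.00, -0.75, -1.50]))
```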
Procedia PDF Downloads 87