Search results for: geometric search algorithm

1601 Compliance of Systematic Reviews in Ophthalmology with the PRISMA Statement

Authors: Seon-Young Lee, Harkiran Sagoo, Reem Farwana, Katharine Whitehurst, Alex Fowler, Riaz Agha

Abstract:

Background/Aims: Systematic reviews and meta-analyses are becoming an increasingly important way of summarizing research evidence. Research in ophthalmology may pose further challenges due to the potential complexity of study designs. The aim of our study was to determine the reporting quality of systematic reviews and meta-analyses in ophthalmology against the PRISMA statement, by assessing articles published between 2010 and 2015 in five major journals with the highest impact factor. Methods: MEDLINE and EMBASE were used to search for systematic reviews published between January 2010 and December 2015 in 5 major ophthalmology journals: Progress in Retinal and Eye Research, Ophthalmology, Archives of Ophthalmology, American Journal of Ophthalmology, and Journal of the American Optometric Association. Screening, identification, and scoring of articles were performed independently by two teams, followed by statistical analysis including the median, range, and 95% CIs. Results: 115 articles were included. The median PRISMA score was 15 of 27 items (56%), with a range of 5-26 (19-96%) and 95% CI 13.9-16.1 (51-60%). Compliance was highest for items related to the description of rationale (item 3, 100%) and inclusion of a structured summary in the abstract (item 2, 90%), and poorest for indication of a review protocol and registration (item 5, 9%), specification of risk of bias affecting the cumulative evidence (item 15, 24%) and description of clear objectives in the introduction (item 4, 26%). Conclusion: The reporting quality of systematic reviews and meta-analyses in ophthalmology needs significant improvement. While the use of the PRISMA criteria as a guideline before journal submission is recommended, additional research identifying potential barriers may be required to improve compliance with the PRISMA guidelines.

Keywords: systematic reviews, meta-analysis, research methodology, reporting quality, PRISMA, ophthalmology

Procedia PDF Downloads 260
1600 Non-Local Simultaneous Sparse Unmixing for Hyperspectral Data

Authors: Fanqiang Kong, Chending Bian

Abstract:

Sparse unmixing is a promising semisupervised approach that assumes the observed pixels of a hyperspectral image can be expressed as a linear combination of only a few pure spectral signatures (endmembers) from an available spectral library. However, finding the optimal subset of endmembers for the observed data in a large standard spectral library remains a great challenge, especially when spatial information is ignored. Under such circumstances, a sparse unmixing algorithm termed non-local simultaneous sparse unmixing (NLSSU) is presented. In NLSSU, a non-local simultaneous sparse representation method for endmember selection is used to find the optimal subset of endmembers for each set of similar image patches in the hyperspectral image. The non-local means method is then used as a regularizer for abundance estimation to exploit the non-local self-similarity of the abundance images. Experimental results on both simulated and real data demonstrate that NLSSU outperforms the other algorithms, with better spectral unmixing accuracy.
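
A minimal sketch of the two ingredients described above, not the authors' NLSSU code: per-pixel Lasso regression stands in for the simultaneous sparse representation step, and scikit-image's non-local means filter stands in for the non-local abundance regularizer. The library, image, and regularization weights are synthetic assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso
from skimage.restoration import denoise_nl_means

rng = np.random.default_rng(0)
n_bands, n_lib, H, W = 100, 40, 20, 20
A = np.abs(rng.normal(size=(n_bands, n_lib)))   # spectral library (endmember candidates)
X = np.abs(rng.normal(size=(H, W, n_bands)))    # toy hyperspectral cube

# Step 1: sparse regression per pixel (only a few library signatures should be active)
abund = np.zeros((H, W, n_lib))
lasso = Lasso(alpha=0.01, positive=True, max_iter=5000)
for i in range(H):
    for j in range(W):
        lasso.fit(A, X[i, j])
        abund[i, j] = lasso.coef_

# Step 2: non-local means regularization exploiting self-similarity of each abundance map
for k in range(n_lib):
    abund[:, :, k] = denoise_nl_means(abund[:, :, k], patch_size=3,
                                      patch_distance=5, h=0.05)

print("active endmembers per pixel (mean):", (abund > 1e-3).sum(axis=2).mean())
```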

Keywords: hyperspectral unmixing, simultaneous sparse representation, sparse regression, non-local means

Procedia PDF Downloads 239
1599 Life Cycle Analysis of the Antibacterial Gel Product Using ISO 14040 and the ReCiPe 2016 Method

Authors: Pablo Andres Flores Siguenza, Noe Rodrigo Guaman Guachichullca

Abstract:

Sustainable practices have received increasing attention from academics and companies in recent decades due to, among many factors, the market advantages they generate, global commitments, and policies aimed at reducing greenhouse gas emissions, addressing resource scarcity, and rethinking waste management. The search for ways to promote sustainability leads industries to abandon classical methods and resort to innovative strategies, which in turn rely on quantitative analysis methods and tools such as life cycle analysis (LCA). LCA is the basis for sustainable production and consumption, since it analyzes objectively, methodically, systematically, and scientifically the environmental impact caused by a process or product during its entire life cycle. The objective of this study is to develop an LCA of the antibacterial gel product throughout its entire supply chain (SC) under the ISO 14044 methodology with the help of GaBi software and the ReCiPe 2016 method. The case study product was selected based on its relevance in the current context of the COVID-19 pandemic and the exponential increase in its production. For the development of the LCA, data from a Mexican company are used, and 3 scenarios are defined to obtain the midpoint and endpoint environmental impacts both by phase and globally. Among the results, the most prominent environmental impact categories are climate change, fossil fuel depletion, and terrestrial ecotoxicity, and the stage that generates the most pollution in the entire SC is the extraction of raw materials. The study serves as a basis for the development of different sustainability strategies, demonstrates the usefulness of an LCA, and agrees with different authors on the role and importance of this methodology in sustainable development.

Keywords: sustainability, sustainable development, life cycle analysis, environmental impact, antibacterial gel

Procedia PDF Downloads 42
1598 Optimization of Multiplier Extraction Digital Filter on FPGA

Authors: Shiksha Jain, Ramesh Mishra

Abstract:

Filtering is one of the most widely used and complex signal processing operations. FIR digital filters are widely used in DSP to alter a signal's spectrum according to given specifications. Power consumption and area complexity in Finite Impulse Response (FIR) filters are mainly caused by multipliers, so we present a multiplier-less technique based on Distributed Arithmetic (DA). In this technique, precomputed inner-product values are stored in a look-up table (LUT) and are then added and shifted over a number of iterations equal to the bit precision of the input samples. However, in this basic structure the LUT grows exponentially with the order of the FIR filter, which makes it prohibitive for many applications. This paper presents a significant area and power reduction over the traditional DA structure by slicing the LUT to the desired length. An architecture of a 16-tap FIR filter is presented with different LUT slice lengths. Implementation results obtained with the Xilinx ISE synthesis tool (XST) on a Virtex-4 FPGA show that the proposed method increases the maximum frequency, reduces resource usage (area) as the number of slices increases, and reduces dynamic power.
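
The following is a small Python model of the basic LUT-based DA idea described above (precomputed partial sums, then bit-serial shift-and-add). Inputs are treated as unsigned B-bit integers to keep the sketch short; the two's-complement sign handling and the LUT slicing used for area reduction on the FPGA are omitted, and the coefficients and samples are illustrative.

```python
h = [1, 2, 3, 4]          # filter taps
N = len(h)
B = 8                     # input bit precision

# Precompute the 2^N-entry LUT: each address selects which taps contribute.
lut = [sum(h[k] for k in range(N) if (addr >> k) & 1) for addr in range(2 ** N)]

def da_fir_output(samples):
    """One output sample: shift-accumulate LUT values over the B input bits."""
    acc = 0
    for b in range(B):                                 # one iteration per bit of precision
        addr = sum(((samples[k] >> b) & 1) << k for k in range(N))
        acc += lut[addr] << b                          # add the shifted LUT entry
    return acc

x = [10, 20, 30, 40]                                   # current delay-line contents
assert da_fir_output(x) == sum(hk * xk for hk, xk in zip(h, x))  # matches direct inner product
print(da_fir_output(x))
```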

Keywords: multiplier less technique, linear phase symmetric FIR filter, FPGA tool, look up table

Procedia PDF Downloads 385
1597 The Impact of Cognitive Load on Deceit Detection and Memory Recall in Children’s Interviews: A Meta-Analysis

Authors: Sevilay Çankaya

Abstract:

The detection of deception in children's interviews is essential for establishing statement veracity. A widely used method for deception detection is inducing cognitive load, which is the logic behind the cognitive interview (CI), and its effectiveness with adults is well established. This meta-analysis delves into the effectiveness of inducing cognitive load as a means of enhancing veracity detection during interviews with children. Additionally, the effectiveness of cognitive load on the total number of events children recall is assessed as a second part of the analysis. The current meta-analysis includes ten effect sizes identified through database searches. Effect sizes were calculated as Hedges' g with a random-effects model using CMA version 2. A heterogeneity analysis was conducted to detect potential moderators. The overall result indicated that cognitive load had no significant effect on veracity outcomes (g = 0.052, 95% CI [-0.006, 1.25]). However, a high level of heterogeneity was found (I² = 92%). Age, participants' characteristics, interview setting, and characteristics of the interviewer were coded as possible moderators to explain this variance. Age was a significant moderator (β = .021, p = .03, R² = 75%), but the analysis did not reveal statistically significant effects for the other potential moderators: participants' characteristics (Q = 0.106, df = 1, p = .744), interview setting (Q = 2.04, df = 1, p = .154), and characteristics of the interviewer (Q = 2.96, df = 1, p = .086). For the second outcome, the total number of events recalled, the overall effect was significant (g = 4.121, 95% CI [2.256, 5.985]); cognitive load was effective for total recalled events when interviewing children. All in all, while age plays a crucial role in determining the impact of cognitive load on veracity, the surrounding context, interviewer attributes, and inherent participant traits may not significantly alter the relationship. These findings highlight the need for more focused, age-specific methods when using cognitive load measures. More studies in this field may make it possible to improve the precision and dependability of deceit detection in children's interviews.
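
For readers unfamiliar with the effect-size machinery mentioned above, the sketch below computes Hedges' g for a single hypothetical study and pools several effect sizes with a DerSimonian-Laird random-effects model; all numbers are invented and this is not the CMA version 2 analysis used in the paper.

```python
import numpy as np

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Bias-corrected standardized mean difference and its approximate variance."""
    df = n1 + n2 - 2
    s_pooled = np.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / df)
    d = (m1 - m2) / s_pooled
    J = 1 - 3 / (4 * df - 1)                  # small-sample correction factor
    g = J * d
    var = J**2 * ((n1 + n2) / (n1 * n2) + d**2 / (2 * df))
    return g, var

def random_effects_pool(g, v):
    """DerSimonian-Laird pooling; returns pooled g, 95% CI, and I^2 (%)."""
    g, v = np.asarray(g), np.asarray(v)
    w = 1 / v
    q = np.sum(w * (g - np.sum(w * g) / np.sum(w)) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(g) - 1)) / c)   # between-study variance estimate
    w_star = 1 / (v + tau2)
    pooled = np.sum(w_star * g) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    i2 = max(0.0, (q - (len(g) - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), i2

g1, v1 = hedges_g(12.0, 4.0, 30, 10.5, 4.2, 32)             # hypothetical study
pooled, ci, i2 = random_effects_pool([g1, 0.10, -0.05], [v1, 0.04, 0.06])
print(pooled, ci, i2)
```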

Keywords: deceit detection, cognitive load, memory recall, children interviews, meta-analysis

Procedia PDF Downloads 51
1596 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As the trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG results in only a minor size reduction. Since the images to be compressed are consecutive camera frames, there will not be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the WebGL implementation limits the precision of floating point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which will eventually add up. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction, thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference. The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can cause a hit on bandwidth and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device depending on the power of the device and the network conditions. The protocol is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or number of frames). The whole protocol is designed so that it can be client-agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally. This is achieved by isolating client connections into different processes.
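
A bare-bones sketch of the weighted-average change detection and pixel reuse described in part 1 above. The 3x3 kernel weights and the threshold are illustrative assumptions rather than the protocol's actual values, and the sketch runs offline in NumPy rather than in WebGL.

```python
import numpy as np
from scipy.ndimage import convolve

kernel = np.array([[1, 2, 1],
                   [2, 4, 2],
                   [1, 2, 1]], dtype=float)
kernel /= kernel.sum()                 # weights for the local difference average

def changed_mask(prev, curr, threshold=4.0):
    """True where the weighted-average difference exceeds the threshold."""
    diff = np.abs(curr.astype(float) - prev.astype(float))
    return convolve(diff, kernel, mode="nearest") > threshold

def encode_frame(prev, curr):
    """Return only the changed pixel positions and values (a crude 'delta')."""
    ys, xs = np.nonzero(changed_mask(prev, curr))
    return ys, xs, curr[ys, xs]

def decode_frame(prev, delta):
    ys, xs, vals = delta
    out = prev.copy()
    out[ys, xs] = vals                 # unchanged pixels are reused as-is
    return out

prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[20:30, 20:30] = 200               # a small moving object
delta = encode_frame(prev, curr)
assert np.array_equal(decode_frame(prev, delta), curr)
print("pixels re-sent:", delta[0].size, "of", curr.size)
```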

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 68
1595 Application of Additive Manufacturing for Production of Optimum Topologies

Authors: Mahdi Mottahedi, Peter Zahn, Armin Lechler, Alexander Verl

Abstract:

The optimal topology of a component leads to the maximum stiffness with the minimum material use. To generate these topologies, algorithms are normally employed that account for manufacturing limitations at the cost of the optimality of the result. The global optimum result with penalty factor one, however, cannot be fabricated with conventional methods. In this article, an additive manufacturing method is introduced in order to enable the production of global topology optimization results. As a benchmark, topology optimization with higher and lower penalty factors is performed. Different algorithms are employed to interpret the results of topology optimization with lower penalty factors as many microstructure layers. These layers are then joined to form the final geometry. The algorithms' benefits are then compared experimentally and numerically to find the best interpretation. The findings demonstrate that, with the selected algorithm, the stiffness of the components produced with this method is higher than what could have been produced by conventional techniques.
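
A short sketch of the role of the penalty factor mentioned above, assuming the common SIMP interpolation E(x) = E_min + x^p (E0 - E_min): with p = 1 intermediate densities are "cheap", so the optimum keeps graded, lattice-like material that conventional manufacturing cannot produce, whereas with p = 3 intermediate densities are penalized and the design is pushed towards solid/void. The values are illustrative only.

```python
import numpy as np

E0, Emin = 1.0, 1e-9          # solid and void Young's moduli (normalized)

def simp_stiffness(x, p):
    """SIMP material interpolation for element density x and penalty factor p."""
    return Emin + x**p * (E0 - Emin)

densities = np.linspace(0, 1, 6)
for p in (1, 3):
    print(f"p = {p}:", np.round(simp_stiffness(densities, p), 3))
# p = 1: stiffness proportional to material used (global optimum, hard to fabricate conventionally)
# p = 3: half-dense material is much less efficient, favoring manufacturable solid/void layouts
```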

Keywords: topology optimization, additive manufacturing, 3D-printer, laminated object manufacturing

Procedia PDF Downloads 334
1594 Application of Artificial Neural Network for Prediction of Load-Haul-Dump Machine Performance Characteristics

Authors: J. Balaraju, M. Govinda Raj, C. S. N. Murthy

Abstract:

Every industry is constantly looking to enhance its day-to-day production and productivity. This is possible only by maintaining personnel and machinery at an adequate level. Prediction of performance characteristics plays an important role in the performance evaluation of equipment. Analytical and statistical approaches take more time to solve complex problems such as performance estimation compared with software-based approaches. Keeping this in view, the present study deals with Artificial Neural Network (ANN) modelling of a Load-Haul-Dump (LHD) machine to predict performance characteristics such as reliability, availability and preventive maintenance (PM). A feed-forward back-propagation ANN trained with the Levenberg-Marquardt (LM) algorithm has been used to build the model. The performance characteristics were computed using Isograph Reliability Workbench 13.0 software. These computed values were validated against the predicted output responses of the ANN models. Further, recommendations are given to the industry, based on the performed analysis, for improvement of equipment performance.
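
An illustrative sketch only: a small feed-forward network predicting an LHD performance indicator from operating features. The data, feature names, and target are made up, and since scikit-learn does not provide Levenberg-Marquardt training, the 'lbfgs' solver is used here as a stand-in for the LM algorithm named in the abstract.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))                 # e.g. utilisation, machine age, maintenance hours
y = 0.6 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.02, 200)  # e.g. availability

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs", max_iter=2000,
                     random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print("correlation coefficient on test set:", np.corrcoef(y_te, pred)[0, 1])
```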

Keywords: load-haul-dump, LHD, artificial neural network, ANN, performance, reliability, availability, preventive maintenance

Procedia PDF Downloads 141
1593 New Estimation in Autoregressive Models with Exponential White Noise by Using Reversible Jump MCMC Algorithm

Authors: Suparman Suparman

Abstract:

The white noise in an autoregressive (AR) model is often assumed to be normally distributed. In applications, however, the white noise frequently does not follow a normal distribution. This paper aims to estimate the parameters of an AR model with exponential white noise. A Bayesian method is adopted: a prior distribution for the parameters of the AR model is selected and combined with the likelihood function of the data to obtain a posterior distribution. Based on this posterior distribution, a Bayesian estimator for the parameters of the AR model is derived. Because the order of the AR model is itself treated as a parameter, this Bayesian estimator cannot be calculated explicitly. To resolve this problem, a reversible jump Markov Chain Monte Carlo (MCMC) method is adopted. As a result, the parameters of the AR model, including its order, can be estimated simultaneously.
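
A toy sketch of the likelihood and sampling idea above for a fixed order: Bayesian estimation of an AR(1) model with exponential white noise using a plain Metropolis sampler for (phi, lambda). The reversible-jump moves that let the sampler also visit different AR orders, which are the key contribution of the paper, are omitted; priors and proposal widths are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate AR(1) data with Exp(lam) innovations: x_t = phi * x_{t-1} + e_t, e_t >= 0
phi_true, lam_true, n = 0.6, 2.0, 400
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi_true * x[t - 1] + rng.exponential(1 / lam_true)

def log_post(phi, lam, x):
    if not (-1 < phi < 1) or lam <= 0:
        return -np.inf
    resid = x[1:] - phi * x[:-1]
    if np.any(resid < 0):                      # exponential noise is non-negative
        return -np.inf
    return len(resid) * np.log(lam) - lam * resid.sum()   # flat priors on the admissible region

phi, lam = 0.0, 1.0
samples = []
for it in range(20000):
    phi_p = phi + rng.normal(0, 0.02)
    lam_p = lam * np.exp(rng.normal(0, 0.05))  # positive log-random-walk proposal for lambda
    log_acc = (log_post(phi_p, lam_p, x) - log_post(phi, lam, x)
               + np.log(lam_p / lam))          # Hastings correction for the asymmetric proposal
    if np.log(rng.uniform()) < log_acc:
        phi, lam = phi_p, lam_p
    samples.append((phi, lam))

post = np.array(samples[5000:])
print("posterior means (phi, lambda):", post.mean(axis=0))   # should be near (0.6, 2.0)
```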

Keywords: autoregressive (AR) model, exponential white noise, Bayesian, reversible jump Markov Chain Monte Carlo (MCMC)

Procedia PDF Downloads 351
1592 The Impact of Rising Architectural Façades on Improving the Physical Urban Ambience in the Free Space of the Urban Fabric - the Street - Case Study: The City of Biskra

Authors: Rami Qaoud, Alkama Djamal

Abstract:

This research asks how rising architectural façades improve the physical urban ambience inside the free space of the urban fabric, an improvement regarded as bringing life, cultural values and civilization back to these cities; this is the theme of this study. We studied the relationship between the empty and the built-up parts of the urban fabric in terms of construction density and the height of the architectural façades relative to the street. Within this framework, the methodology of this research relies on field measurements, carried out for three types of street geometry (H ≥ 2W, H = W, H ≤ 0.5W). The physical ambience values were recorded along three main axes. The first axis is the thermal ambience, for which air temperature, relative humidity, wind speed, and surface temperatures (outer wall and ground) were collected. The second axis is the visual ambience, for which natural lighting levels were recorded during the daytime. The third axis is the acoustic ambience, for which sound levels were recorded throughout the day. The campaign lasted three consecutive days and used six measuring stations, one station for each type of street geometry in two different street orientations. Comparing the obtained values reveals clear differences between the three types of street geometry: an air temperature difference of 4 °C, a difference of six hours in the duration of direct natural lighting, and a difference of up to 15 dB in sound levels. These differences indicate the impact of rising architectural façades on improving the physical urban ambience within the free space of the urban fabric, the street.

Keywords: street, physical urban ambience, rising architectural façade, urban fabric

Procedia PDF Downloads 284
1591 Phytochemical Composition and Characterization of Bioactive Compounds of the Green Seaweed Ulva lactuca: A Phytotherapeutic Approach

Authors: Mariame Taibi, Marouane Aouiji, Rachid Bengueddour

Abstract:

The Moroccan coastline is particularly rich in algae and constitutes a reserve of species with considerable economic, social and ecological potential. This work focuses on the research and characterization of algae bioactive compounds that can be used in pharmacology or phytopathology. The biochemical composition of the green alga Ulva lactuca (Ulvophyceae) was studied by determining the content of moisture, ash, phenols, flavonoids, total tannins, and chlorophyll. Seven solvents: distilled water, methanol, ethyl acetate, chloroform, benzene, petroleum ether, and hexane, were tested for their effectiveness in recovering chemical compounds. The identification of functional groupings, as well as the bioactive chemical compounds, was determined by FT-IR and GC-MS. The moisture content of the alga was 77%, while the ash content was 15%. Phenol content differed from one solvent studied to another, while chlorophyll a, b, and total chlorophyll were determined at 14%, 9.52%, and 25%, respectively. Carotenoid was present in a considerable amount (8.17%). The experimental results show that methanol is the most effective solvent for recovering bioactive compounds, followed by water. Moreover, the green alga Ulva lactuca is characterized by a high level of total polyphenols (45±3.24 mg GAE/gDM), average levels of total tannins and flavonoids (22.52±8.23 mg CE/gDM, 15.49±0.064 mg QE/gDM) respectively. The results of Fourier transform infrared spectroscopy (FT-IR) confirmed the presence of alcohol/phenol and amide functions in Ulva lactuca. The GC-MS analysis gave precisely the compounds contained in the various extracts, such as phenolic compounds, fatty acids, terpenoids, alcohols, alkanes, hydrocarbons, and steroids. All these results represent only a first step in the search for biologically active natural substances from seaweed. Additional tests are envisaged to confirm the bioactivity of seaweed.

Keywords: algae, Ulva lactuca, phenolic compounds, FTIR, GC-MS

Procedia PDF Downloads 98
1590 Design of a Surveillance Drone with Computer Aided Durability

Authors: Maram Shahad Dana Anfal

Abstract:

This research paper presents the design of a surveillance drone with computer-aided durability and modal analyses, providing a cost-effective and efficient solution for various applications. The quadcopter's design is based on a lightweight and strong structure made of materials such as aluminum and titanium, which give the quadcopter a durable structure. The structure of this product and the computer-aided durability system are both designed to reduce the need for frequent repairs or replacements, which will save time and money in the long run. Moreover, the study discusses the drone's ability to track, investigate, and deliver objects more quickly than traditional methods, which makes it a highly efficient and cost-effective technology. In this paper, a comprehensive analysis of the quadcopter's operational dynamics and limitations is presented. In both simulation and experimental data, the computer-aided durability system and the drone's design demonstrate their effectiveness, highlighting the potential for a variety of applications, such as search and rescue missions, infrastructure monitoring, and agricultural operations. The findings also provide insights into possible areas for improvement in the design and operation of the drone. Ultimately, this paper presents a reliable and cost-effective solution for surveillance applications by designing a drone with computer-aided durability and modeling. With its potential to save time and money, increase reliability, and enhance safety, it is a promising technology for the future of surveillance drones. Operational dynamic equations have been evaluated successfully for different flight conditions of the quadcopter. CAE modeling techniques have also been applied for modal risk assessment at operating conditions, and stress analysis has been performed under the loadings of the worst-case combined-motion flight conditions.

Keywords: drone, material, solidwork, hypermesh

Procedia PDF Downloads 128
1589 Clinical Outcomes After Radiological Management of Varicoceles

Authors: Eric Lai, Sarah Lorger, David Eisinger, Richard Waugh

Abstract:

Introduction: Percutaneous embolization of varicoceles has shown outcomes similar to surgery. However, radiological intervention has advantages: patients are not exposed to general anaesthesia, experience a quicker recovery, and face a lower risk of major complications. Radiological intervention is also preferable after a failed surgical approach. We evaluate the clinical outcomes of percutaneous embolization at a tertiary hospital in Sydney, Australia. Methods: Retrospective case series without a control group from a single site (Royal Prince Alfred Hospital, Sydney, Australia). A data search was performed on the interventional radiology database with the word “varicocele” between February 2017 and March 2022. 62 patients were identified. Each patient file was reviewed and included in the study if it met the inclusion criteria. Results: A total of 56 patients were included. 6 patients were excluded as they did not receive intervention after the initial diagnostic venography. Technical success was 100%. Complications were seen in 3 patients (5.3%); they included post-procedural pain and fever, venous perforation with no clinical adverse outcome, and a mild allergic reaction to contrast. Recurrence occurred in 3 patients (5.6%), all of whom received a successful second procedure. Discussion: This study demonstrates rates of technical success, complications and recurrence comparable to other studies in the literature. When compared to surgical outcomes, the results were also similar. The main limitation is that multiple patients lack long-term follow-up beyond 1 year, resulting in potential underestimation of the recurrence rate. Conclusion: Percutaneous embolization of varicocele is a safe alternative to surgical intervention.

Keywords: varicocele, interventional radiology, urology, radiology

Procedia PDF Downloads 58
1588 The Application of Artificial Neural Networks for the Performance Prediction of Evacuated Tube Solar Air Collector with Phase Change Material

Authors: Sukhbir Singh

Abstract:

This paper describes the modeling of a novel solar air collector (NSAC) system using an artificial neural network (ANN) model. The objective of the study is to demonstrate the application of the ANN model to predict the performance of the NSAC with acetamide as a phase change material (PCM) storage. The input data set consists of time, solar intensity, and ambient temperature, whereas the outlet air temperature of the NSAC was considered as the output. Experiments were conducted between 9.00 and 24.00 h in June and July 2014 under the prevailing atmospheric conditions of Kurukshetra (a city in India). The experimental results were then used to train a back propagation neural network (BPNN) to predict the outlet air temperature of the NSAC. The results of the proposed algorithm show that the BPNN is an effective tool for the prediction of responses. The BPNN-predicted results are in 99% agreement with the experimental results.

Keywords: evacuated tube solar air collector, artificial neural network, phase change material, solar air collector

Procedia PDF Downloads 115
1587 Research on Health Emergency Management Based on Bibliometrics

Authors: Meng-Na Dai, Bao-Fang Wen, Gao-Pei Zhu, Chen-Xi Zhang, Jing Sun, Chang-Hai Tang, Zhi-Qiang Feng, Wen-Qiang Yin

Abstract:

Based on an analysis of the literature on health emergency management in China over the last 10 years, this paper discusses the current research hotspots, development trends and shortcomings in this field in China, and provides references for scholars conducting follow-up research. CNKI (China National Knowledge Infrastructure), Weipu, and Wanfang were the databases searched. The keywords used in the database search were health, emergency, and management, covering the period from 2009 to 2018. Duplicate, non-academic, and unrelated documents were excluded, and 901 articles were included in the literature review database. The main indicators extracted were the number of articles published every year, authors, institutions, periodicals, etc. The analysis of the literature yielded several findings. Overall, the number of publications on health emergency management in China has shown a fluctuating downward trend over the last 10 years. Specifically, there is a lack of close cooperation between authors, and core research teams have not yet formed. Meanwhile, the number of high-level periodicals and quality publications in this field is scarce. In addition, there are many research hotspots, such as emergency management systems, mechanism research, capacity evaluation index system research, and plans and capacity-building research. In the future, scientific research funding for health emergency management should be increased, collaborative innovation among authors in multi-disciplinary fields should be encouraged, and high-quality, high-impact journals should be created in this field. The state should encourage scholars in this field to carry out more academic cooperation and communication worldwide and improve the breadth and depth of the research. Generally speaking, research on health emergency management in China is still insufficient and needs to be improved.

Keywords: health emergency management, research situation, bibliometrics, literature

Procedia PDF Downloads 129
1586 Calculation of Organ Dose for Adult and Pediatric Patients Undergoing Computed Tomography Examinations: A Software Comparison

Authors: Aya Al Masri, Naima Oubenali, Safoin Aktaou, Thibault Julien, Malorie Martin, Fouad Maaloul

Abstract:

Introduction: The increased number of performed 'Computed Tomography (CT)' examinations raises public concerns regarding the associated stochastic risk to patients. In its Publication 102, the 'International Commission on Radiological Protection (ICRP)' emphasized the importance of managing patient dose, particularly from repeated or multiple examinations. We developed a Dose Archiving and Communication System that gives multiple dose indexes (organ dose, effective dose, and skin-dose mapping) for patients undergoing radiological imaging exams. The aim of this study is to compare the organ dose values given by our software for patients undergoing CT exams with those of another software package named "VirtualDose". Materials and methods: Our software uses Monte Carlo simulations to calculate organ doses for patients undergoing computed tomography examinations. The general calculation principle consists of simulating: (1) the scanner machine with all its technical specifications and associated irradiation cases (kVp, field collimation, mAs, pitch ...) (2) detailed geometric and compositional information of dozens of well-identified organs of computational hybrid phantoms that contain the necessary anatomical data. The mass as well as the elemental composition of the tissues and organs that constitute our phantoms correspond to the recommendations of the international organizations (namely the ICRP and the ICRU). Their body dimensions correspond to reference data developed in the United States. The simulated data were verified by clinical measurement. To perform the comparison, 270 adult patients and 150 pediatric patients were used, whose data correspond to exams carried out in French hospital centers. The comparison dataset of adult patients includes adult males and females for three different scanner machines and three different acquisition protocols (Head, Chest, and Chest-Abdomen-Pelvis). The comparison sample of pediatric patients includes the exams of thirty patients for each of the following age groups: newborn, 1-2 years, 3-7 years, 8-12 years, and 13-16 years. The comparison for pediatric patients was performed on the "Head" protocol. The percentage dose difference was calculated for organs receiving a significant dose according to the acquisition protocol (80% of the maximal dose). Results: Adult patients: for organs that are completely covered by the scan range, the maximum percentage dose difference between the two software packages is 27%. However, three organs situated at the edges of the scan range show a slightly higher dose difference. Pediatric patients: the percentage dose difference between the two software packages does not exceed 30%. These dose differences may be due to the use of two different generations of hybrid phantoms by the two software packages. Conclusion: This study shows that our software provides reliable dosimetric information for patients undergoing Computed Tomography exams.

Keywords: adult and pediatric patients, computed tomography, organ dose calculation, software comparison

Procedia PDF Downloads 150
1585 A Neural Network Modelling Approach for Predicting Permeability from Well Logs Data

Authors: Chico Horacio Jose Sambo

Abstract:

Recently, neural networks have gained popularity for solving complex nonlinear problems. Permeability is one of the fundamental reservoir characteristics and is distributed in an anisotropic and non-linear manner. For this reason, permeability prediction from well log data is well suited to neural networks and other computer-based techniques. The main goal of this paper is to predict reservoir permeability from well log data using a neural network approach. A multi-layered perceptron trained by a back propagation algorithm was used to build the predictive model. The performance of the model was measured by the correlation coefficient, evaluated on the training, testing, validation and complete data sets. The results show that the neural network was capable of reproducing permeability accurately in all cases; the calculated correlation coefficients for training, testing and validation were 0.96273, 0.89991 and 0.87858, respectively. The generalization of the results to other fields can be assessed after examining new data, and a regional study might make it possible to characterize reservoir properties with cheap and very quickly constructed models.

Keywords: neural network, permeability, multilayer perceptron, well log

Procedia PDF Downloads 396
1584 Dripper Scaling Inhibition in Localized Irrigation Systems by Green Inhibitors Based on Plant Extracts

Authors: Driouiche Ali, Karmal Ilham

Abstract:

The Agadir region is characterized by a dry climate, ranging from arid attenuated by oceanic influences to hyper-arid. The water mobilized in the agricultural sector of greater Agadir is 95% of underground origin and comes from the water table of Chtouka. The rest represents the surface waters of the Youssef Ben Tachfine dam. These waters are intended for the irrigation of 26880 hectares of modern agriculture. More than 120 boreholes and wells are currently exploited. Their depth varies between 10 m and 200 m and the unit flow rates of the boreholes are 5 to 50 l/s. A drop in the level of the water table of about 1.5 m/year, on average, has been observed during the last five years. Farmers are thus called upon to improve irrigation methods. Thus, localized or drip irrigation is adopted to allow rational use of water. The importance of this irrigation system is due to the fact that water is applied directly to the root zone and its compatibility with fertilization. However, this irrigation system faces a thorny problem which is the clogging of pipes and drippers. This leads to a lack of uniformity of irrigation over time. This so-called scaling phenomenon, the consequences of which are harmful (cleaning or replacement of pipes), leads to considerable unproductive expenditure. The objective set by this work is the search for green inhibitors likely to prevent this phenomenon of scaling. This study requires a better knowledge of these waters, their physico-chemical characteristics and their scaling power. Thus, using the "LCGE" controlled degassing technique, we initially evaluated, on pure calco-carbonic water at 30°F, the scaling-inhibiting power of some available plant extracts in our region of Souss-Massa. We then carried out a comparative study of the efficacy of these green inhibitors. The action of the most effective green inhibitor on real agricultural waters was then studied.

Keywords: green inhibitors, localized irrigation, plant extracts, scaling inhibition

Procedia PDF Downloads 76
1583 Multiclass Support Vector Machines with Simultaneous Multi-Factors Optimization for Corporate Credit Ratings

Authors: Hyunchul Ahn, William X. S. Wong

Abstract:

Corporate credit rating prediction is one of the most important topics studied by researchers in the last decade. Over this period, researchers have pushed the limits to enhance the accuracy of corporate credit rating prediction models by applying several data-driven tools, including statistical and artificial intelligence methods. Among them, the multiclass support vector machine (MSVM) has been widely applied due to its good predictability. However, heuristically chosen design factors, for example, the parameters of the kernel function and the appropriate feature and instance subsets, have become the main target of criticism of MSVM, as they dictate the MSVM architectural variables. This study presents a hybrid MSVM model that is intended to optimize all of these factors, namely feature selection, instance selection, and kernel parameters. Our model adopts a genetic algorithm (GA) to simultaneously optimize multiple heterogeneous design factors of the MSVM.
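
A simplified sketch of the idea above, not the authors' exact model: a small genetic algorithm simultaneously evolves the RBF-kernel parameters (C, gamma) and a binary feature-selection mask for a multiclass SVM. The dataset, population size and GA operators are illustrative choices, and instance selection is omitted for brevity.

```python
import numpy as np
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_wine(return_X_y=True)          # stand-in for a credit-rating dataset
n_feat = X.shape[1]

def fitness(ind):
    mask = ind["mask"]
    if mask.sum() == 0:
        return 0.0
    clf = SVC(C=ind["C"], gamma=ind["gamma"])          # one-vs-one multiclass SVM
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

def random_ind():
    return {"C": 10 ** rng.uniform(-1, 3), "gamma": 10 ** rng.uniform(-4, 0),
            "mask": rng.random(n_feat) < 0.5}

def mutate(ind):
    child = {"C": ind["C"] * 10 ** rng.normal(0, 0.2),
             "gamma": ind["gamma"] * 10 ** rng.normal(0, 0.2),
             "mask": ind["mask"].copy()}
    flip = rng.random(n_feat) < 0.1
    child["mask"][flip] = ~child["mask"][flip]          # flip a few feature bits
    return child

pop = [random_ind() for _ in range(12)]
for gen in range(10):
    parents = sorted(pop, key=fitness, reverse=True)[:4]    # truncation selection
    pop = parents + [mutate(parents[rng.integers(4)]) for _ in range(8)]

best = max(pop, key=fitness)
print("best CV accuracy:", round(fitness(best), 3), "features kept:", int(best["mask"].sum()))
```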

Keywords: corporate credit rating prediction, feature selection, genetic algorithms, instance selection, multiclass support vector machines

Procedia PDF Downloads 289
1582 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components in aerospace, marine and vehicular applications. In order not to lose a component's function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, is a critical task. Given the growing need for vibration isolation system design, this paper presents two software tools capable of performing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no study developing a software-based tool capable of carrying out all of these analysis, simulation and optimization studies in one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. The optimization design variables are defined, and different types of optimization scenarios are listed in detail. Being aware of the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) were prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 556
1581 Dual-Channel Reliable Breast Ultrasound Image Classification Based on Explainable Attribution and Uncertainty Quantification

Authors: Haonan Hu, Shuge Lei, Dasheng Sun, Huabin Zhang, Kehong Yuan, Jian Dai, Jijun Tang

Abstract:

This paper focuses on the classification of breast ultrasound images and conducts research on measuring the reliability of the classification results. A dual-channel evaluation framework was developed based on the proposed inference reliability and predictive reliability scores. For the inference reliability evaluation, human-aligned, doctor-agreed inference rationales based on the improved feature attribution algorithm SP-RISA are applied. Uncertainty quantification is used to evaluate the predictive reliability via test-time enhancement. The effectiveness of this reliability evaluation framework has been verified on the breast ultrasound clinical dataset YBUS, and its robustness is verified on the public dataset BUSI. The expected calibration errors on both datasets are significantly lower than those of traditional evaluation methods, which demonstrates the effectiveness of the proposed reliability measurement.
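
A small sketch of the expected calibration error (ECE) metric referred to above: predictions are binned by confidence, and the gap between accuracy and mean confidence in each bin is averaged, weighted by bin size. The predictions below are synthetic stand-ins for a breast-ultrasound classifier's outputs.

```python
import numpy as np

def expected_calibration_error(conf, correct, n_bins=10):
    conf, correct = np.asarray(conf), np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (conf > lo) & (conf <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - conf[in_bin].mean())
            ece += in_bin.mean() * gap          # weight by fraction of samples in the bin
    return ece

rng = np.random.default_rng(0)
confidence = rng.uniform(0.5, 1.0, 1000)                  # predicted confidence
is_correct = rng.uniform(size=1000) < confidence ** 2     # an over-confident toy model
print("ECE:", round(expected_calibration_error(confidence, is_correct), 3))
```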

Keywords: medical imaging, ultrasound imaging, XAI, uncertainty measurement, trustworthy AI

Procedia PDF Downloads 84
1580 Benefits of Occupational Therapy for Children with Intellectual Disabilities in the Aspects of Vocational Activities and Instrumental Activities of Daily Life

Authors: Shakhawath Hossain, Tazkia Tahsin

Abstract:

Introduction/Background: Intellectual disability is a disability characterized by significant limitations both in intellectual functioning and in adaptive behavior, which covers many everyday social and practical skills. Vocational education is a multi-professional approach provided to individuals of working age with health-related impairments, limitations, or restrictions on work functioning, whose primary aim is to optimize work participation. Instrumental Activities of Daily Living are activities that support daily life within the home and community, such as community mobility, financial management, meal preparation and clean-up, and shopping. Material and Method: Electronic searches of the Medline, PubMed, Google Scholar, and OTseeker literature were conducted using the key terms intellectual disability, vocational rehabilitation, instrumental activities of daily living, and Occupational Therapy, together with a thorough manual search for relevant literature. Results: Thirteen articles, both qualitative and quantitative, were included in this review; all studies were mixed-methods in design. After receiving Occupational Therapy services, children showed significant improvement in various areas such as sensory issues, cognitive abilities, perceptual skills, visual skills, motor planning, and group therapy participation. After vocational and instrumental-activities-of-daily-living training, children with intellectual disabilities are able to participate in their daily activities and work as employees in different companies or organizations. Conclusion: Persons with intellectual disability are an integral part of our society and deserve social support and opportunities like other human beings. The results of the reviewed papers show significant benefits of Occupational Therapy services in the areas of vocational activities and instrumental activities of daily living.

Keywords: occupational therapy, daily living activities, intellectual disabilities, instrumental ADL

Procedia PDF Downloads 126
1579 Design of Nano-Reinforced Carbon Fiber Reinforced Plastic Wheel for Lightweight Vehicles with Integrated Electrical Hub Motor

Authors: Davide Cocchi, Andrea Zucchelli, Luca Raimondi, Maria Brugo Tommaso

Abstract:

The increasing attention given to the issues of environmental pollution and climate change is strongly stimulating the development of electrically propelled vehicles powered by renewable energy, in particular solar energy. Given the small amount of solar energy that can be stored and subsequently transformed into propulsive energy, it is necessary to develop vehicles with high mechanical, electrical and aerodynamic efficiencies along with reduced masses. The reduction of mass is of fundamental relevance especially for the unsprung masses, that is, the assembly of those elements that do not undergo a variation of their distance from the ground (wheel, suspension system, hub, upright, braking system). Therefore, the reduction of unsprung masses is fundamental in decreasing the rolling inertia and improving the drivability, comfort, and performance of the vehicle. This principle applies even more to solar-propelled vehicles, equipped with an electric motor that is connected directly to the wheel hub. In this solution, the electric motor is integrated inside the wheel. Since the electric motor is part of the unsprung masses, the development of compact and lightweight solutions is of fundamental importance. The purpose of this research is the design, development and optimization of a CFRP 16 wheel hub motor for solar propulsion vehicles that can carry up to four people. In addition to trying to maximize aspects of primary importance such as mass, strength, and stiffness, other innovative constructive aspects were explored. One of the main objectives has been to achieve high geometric packing in order to ensure a reduced lateral dimension, without reducing the power exerted by the electric motor. In the final solution, it was possible to realize a wheel hub motor assembly completely contained within the rim width, for a total lateral overall dimension of less than 100 mm. This result was achieved by developing an innovative connection system between the wheel and the rotor with a double purpose: centering and transmission of the driving torque. This solution, with appropriate interlocking noses, allows the transfer of high torques and at the same time guarantees both the centering and the necessary stiffness of the transmission system. Moreover, to avoid delamination in critical areas, evaluated by means of FEM analysis using 3D Hashin damage criteria, electrospun nanofibrous mats have been interleaved between critical CFRP layers. In order to reduce rolling resistance, the rim has been designed to withstand high inflation pressure. Laboratory tests have been performed on the rim using the Digital Image Correlation (DIC) technique. The wheel has been tested for bending fatigue according to E/ECE/324 R124e.

Keywords: composite laminate, delamination, DIC, lightweight vehicle, motor hub wheel, nanofiber

Procedia PDF Downloads 207
1578 Nectariferous Plant Genetic Resources for Apicultural Entrepreneurship in Nigeria: Prerequisite for Conservation, Sustainable Management and Policy

Authors: C. V. Nnamani, O. L. Adedeji

Abstract:

The contemporary global economic meltdown has had a devastating effect on the Nigerian economy, and the frantic search for alternative sources of national revenue aside from oil and gas has become imperative for the economic emancipation of Nigerians. Apicultural entrepreneurship could provide a source of livelihood if basic knowledge of the plant genetic resources needed by bees is made available. A palynological evaluation of the palynotaxa which honey bees forage for pollen and nectar was carried out using the standard acetolysis method. Results showed that the honey samples were highly diversified and rich in honey plants. A total of 9544.3 honey pollen grains were counted, consisting of 39 honey plants belonging to 21 plant families and distributed within 38 genera, excluding 238 unidentified pollen grains. The analysis also revealed that Elaeis guineensis Jacq, Anacardium occidentale L, Diospyros mespiliformis Hochist xe ADC, Alchornea cordifolia Muell, Arg, Daniella oliveri (Rolfe) Hutch & Dalz, Irvingia wombolu Okafor ex Baill, Treculia africana Decne, Nauclea latifolia Smith and Crossopteryx febrifuga Afzil ex Benth were the predominant honey plants. This provides a guide to the optimal utilization of floral resources by honeybees in these regions, showing the opportunity and amazing potential for apicultural entrepreneurship based on these palynotaxa. Most of these plants are rare, threatened or endangered, which calls for urgent conservation measures and steps by all players. Critical awareness creation is needed to ensure farmers' knowledge and proper understanding of these palynotaxa, and the attendant boost to their economic empowerment.

Keywords: palynotaxa, acetolysis, enterprise, livelihood, Nigeria

Procedia PDF Downloads 286
1577 Velocity Logs Error Reduction for In-Service Calibration of Vessel Performance Indicators

Authors: Maria Tsompanoglou, Dimitris Armenis

Abstract:

Vessel behavior in different operational and weather conditions constitutes the main area of interest for the ship operator. Ship speed and fuel consumption are the most decisive parameters in this respect, as their correlation provides information about the economic and environmental efficiency of the vessel, becoming the basis of decision making in terms of maintenance and trading. In the analysis of a vessel's operational profile for the evaluation of fuel consumption and the equivalent CO2 emissions footprint, Speed Through Water indications are widely used. The seasonal and regional variations in seawater characteristics, which are available nowadays, can provide the basis for accurate estimation of the errors in Speed Through Water indications at any time. Accurate speed values on a route basis can enable the operator to identify the ship's fuel and propulsion efficiency and proceed with improvements. This paper discusses case studies where the actual vessel speed was corrected by a post-processing algorithm. The effects of the speed correction on standard Key Performance Indicators, as well as operational findings not identified earlier, are also discussed.

Keywords: data analytics, MATLAB, vessel performance monitoring, speed through water

Procedia PDF Downloads 293
1576 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization

Authors: Yihao Kuang, Bowen Ding

Abstract:

With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. Therefore, in recent years, it has become a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve the inference effect and interpretability. The Proximal Policy Optimization (PPO) algorithm uses a proximal policy optimization approach: it allows relatively large updates of the policy parameters while constraining the extent of each update to maintain training stability. This characteristic enables PPO to converge to improved policies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO has the advantage of offline learning, effectively utilizing historical experience data for training and enhancing sample utilization. This means that even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge graph inference technology.
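
A sketch of the PPO clipped surrogate objective that implements the "constrained update" described above: the probability ratio between the new and old policies is clipped so that a single update cannot move the policy too far. The tensors below are random placeholders standing in for path-finding actions over a knowledge graph, not the paper's actual model.

```python
import torch

def ppo_policy_loss(logp_new, logp_old, advantages, clip_eps=0.2):
    ratio = torch.exp(logp_new - logp_old)                 # pi_new(a|s) / pi_old(a|s)
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1 - clip_eps, 1 + clip_eps) * advantages
    return -torch.min(unclipped, clipped).mean()           # negated to maximize the surrogate

logp_old = torch.randn(64)
logp_new = (logp_old + 0.1 * torch.randn(64)).requires_grad_()
advantages = torch.randn(64)
loss = ppo_policy_loss(logp_new, logp_old, advantages)
loss.backward()                                            # gradients flow to the new policy only
print("PPO clipped loss:", float(loss))
```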

Keywords: reinforcement learning, PPO, knowledge inference, supervised learning

Procedia PDF Downloads 55
1575 Decision Making in Medicine and Treatment Strategies

Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi

Abstract:

Three reasons justify the use of decision theory in medicine: 1. The growth of medical knowledge and its complexity make it difficult to use treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying opportunities for treatment from databases of large size. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. These differences are generally attributed to differences in the estimated probabilities of success of the treatments involved, and to differing assessments of the consequences of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by "best option", or what criteria guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand the differences in medical practices and facilitates the search for consensus. In this respect, there are three types of situations: certain situations, risky situations, and uncertain situations. 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, each decision can have several consequences, and the probability of each of these consequences is known. 3. In uncertain situations, each decision can have several consequences, and the probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data considered in the problem, and secondly, by setting out a few basic principles that should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assist the patient and doctor in their choices.
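
To make the "risky situation" case concrete, here is a tiny illustrative example in which the best option is defined as the one maximizing expected utility, one common criterion proposed by decision theory. All treatment names, probabilities, and utilities are invented for illustration only.

```python
options = {
    "surgery":       [(0.85, 0.9), (0.10, 0.4), (0.05, 0.0)],   # (probability, utility) pairs
    "medication":    [(0.60, 0.8), (0.35, 0.6), (0.05, 0.2)],
    "watchful wait": [(0.50, 0.7), (0.50, 0.5)],
}

def expected_utility(outcomes):
    return sum(p * u for p, u in outcomes)

for name, outcomes in options.items():
    print(f"{name}: expected utility = {expected_utility(outcomes):.3f}")

best = max(options, key=lambda k: expected_utility(options[k]))
print("best option under the expected-utility criterion:", best)
```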

Keywords: decision making, medicine, treatment strategies, patient

Procedia PDF Downloads 574
1574 Robust Optimisation Model and Simulation-Particle Swarm Optimisation Approach for Vehicle Routing Problem with Stochastic Demands

Authors: Mohanad Al-Behadili, Djamila Ouelhadj

Abstract:

In this paper, a specific type of vehicle routing problem under stochastic demand (SVRP) is considered. This problem is of great importance because it models many real-world vehicle routing applications. This paper uses a robust optimisation model to solve the problem, along with a novel Simulation-Particle Swarm Optimisation (Sim-PSO) approach. The proposed Sim-PSO approach is based on the hybridization of the Monte Carlo simulation technique with the PSO algorithm. A comparative study of the proposed model and the Sim-PSO approach against other solution methods in the literature is given in this paper. This comparison includes an Analysis of Variance (ANOVA) to show the ability of the model and solution method to solve the complicated SVRP. The experimental results show that the proposed model and Sim-PSO approach have a significant impact on the obtained solution, providing better quality solutions compared with well-known algorithms in the literature.
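
A bare-bones sketch of the Sim-PSO hybridization idea above on a toy problem: each particle encodes a decision vector (here, planned capacity per route), its cost under random demand is estimated by Monte Carlo simulation, and a standard PSO update searches for the best plan. The routing details of the SVRP itself are abstracted away, and all constants are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_routes, n_particles, n_scenarios = 5, 20, 200
mean_demand = np.array([10., 14., 8., 12., 9.])

def mc_cost(plan):
    """Monte Carlo estimate: capacity cost plus penalty for unmet stochastic demand."""
    demand = rng.poisson(mean_demand, size=(n_scenarios, n_routes))
    shortage = np.maximum(demand - plan, 0).sum(axis=1)
    return plan.sum() + 3.0 * shortage.mean()

pos = rng.uniform(5, 20, size=(n_particles, n_routes))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([mc_cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for it in range(60):
    r1, r2 = rng.uniform(size=pos.shape), rng.uniform(size=pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 40)
    vals = np.array([mc_cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("best expected cost:", round(pbest_val.min(), 2), "plan:", np.round(gbest, 1))
```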

Keywords: stochastic vehicle routing problem, robust optimisation model, Monte Carlo simulation, particle swarm optimisation

Procedia PDF Downloads 269
1573 Theoretical and Experimental Investigations of Binary Systems for Hydrogen Storage

Authors: Gauthier Lefevre, Holger Kohlmann, Sebastien Saitzek, Rachel Desfeux, Adlane Sayede

Abstract:

Hydrogen is a promising energy carrier, compatible with the sustainable energy concept. In this context, solid-state hydrogen storage is the key challenge in developing a hydrogen economy. The capability of absorbing large quantities of hydrogen makes intermetallic systems of particular interest. In this study, efforts have been devoted to the theoretical investigation of binary systems with constraints taken into consideration. On the one hand, besides considering hydrogen storage, a reinvestigation of the crystal structures of the palladium-arsenic system shows, with experimental validation, that binary systems can still present new or unknown relevant structures. On the other hand, various binary Mg-based systems were theoretically scrutinized in order to find new interesting alloys for hydrogen storage. Taking the effect of pressure into account reveals a wide range of alternative structures, radically changing the stable compounds of the studied binary systems. Similar constraints, induced by Pulsed Laser Deposition, have been applied to binary systems, and results are presented.

Keywords: binary systems, evolutionary algorithm, first principles study, pulsed laser deposition

Procedia PDF Downloads 263
1572 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process

Authors: Hong-Ming Chen

Abstract:

This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are represented by a piecewise constant function w(t) and a point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated to 3-month U.S. Treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach generates yield functions with minimal fitting errors and small oscillation.
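
For orientation only, the sketch below is a plain Euler simulation of a Hull-White-type short rate with an added compound Poisson jump term, dr = (θ(t) - a·r) dt + σ dW + J dN. This is the stochastic model whose random terms the paper replaces with deterministic perturbations and a semi-infinite program; the parameters are illustrative, not the calibrated values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 0.1, 0.01                 # mean-reversion speed and volatility
jump_rate, jump_scale = 0.5, 0.004   # Poisson intensity (per year) and mean jump size
T, steps = 5.0, 5 * 252
dt = T / steps
theta = lambda t: 0.03 * a           # keeps the long-run level near 3% in this toy setup

r = np.empty(steps + 1)
r[0] = 0.02
for i in range(steps):
    t = i * dt
    jump = rng.exponential(jump_scale) if rng.uniform() < jump_rate * dt else 0.0
    r[i + 1] = (r[i] + (theta(t) - a * r[i]) * dt
                + sigma * np.sqrt(dt) * rng.normal() + jump)

# A crude proxy for a yield quantity: the average simulated short rate over the horizon
print("mean short rate over the 5-year path:", round(r.mean(), 4))
```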

Keywords: optimization, interest rate model, jump process, deterministic

Procedia PDF Downloads 158