Search results for: elliptic curve digital signature algorithm
2068 Application of Artificial Neural Network for Prediction of High Tensile Steel Strands in Post-Tensioned Slabs
Authors: Gaurav Sancheti
Abstract:
This study presents a promising application of Artificial Neural Networks (ANNs) for determining the quantity of High Tensile Steel (HTS) strands required in post-tensioned (PT) slabs. Various PT slab configurations were generated by varying the span and depth of the slab, and for each configuration the quantity of required HTS strands was recorded. ANNs with a backpropagation algorithm and varying architectures were developed, and their performance was evaluated in terms of Mean Square Error (MSE). The recorded data on the quantity of HTS strands served as the database for training the developed ANNs. The networks were validated using various validation techniques. The results show that the proposed ANNs have great potential, with good prediction and generalization capability.
Keywords: artificial neural networks, back propagation, conceptual design, high tensile steel strands, post tensioned slabs, validation techniques
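The approach the abstract describes, backpropagation networks of varying architecture scored by MSE, can be sketched as follows. This is a minimal illustration, not the study's model: the slab inputs and the strand-quantity relation below are invented stand-ins for the real data.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical PT-slab configurations: span (m) and depth (m) as inputs,
# HTS strand quantity as target (the relation below is a made-up stand-in).
span = rng.uniform(6.0, 14.0, (300, 1))
depth = rng.uniform(0.18, 0.30, (300, 1))
strands = 1.8 * span - 40.0 * depth + 0.5 * span * depth
X = np.hstack([span, depth])
X = (X - X.mean(0)) / X.std(0)                   # standardize inputs
y = (strands - strands.mean()) / strands.std()   # standardize target

def train_mlp(X, y, hidden, epochs=2000, lr=0.05):
    """One-hidden-layer MLP trained by full-batch backpropagation; returns final MSE."""
    r = np.random.default_rng(1)
    W1 = r.normal(0, 0.5, (X.shape[1], hidden)); b1 = np.zeros(hidden)
    W2 = r.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)        # forward pass
        out = H @ W2 + b2
        err = out - y
        g_out = 2.0 * err / len(X)      # gradient of MSE w.r.t. output
        g_W2 = H.T @ g_out; g_b2 = g_out.sum(0)
        g_H = (g_out @ W2.T) * (1.0 - H ** 2)   # backprop through tanh
        g_W1 = X.T @ g_H; g_b1 = g_H.sum(0)
        W2 -= lr * g_W2; b2 -= lr * g_b2
        W1 -= lr * g_W1; b1 -= lr * g_b1
    return float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))

# Compare candidate architectures by MSE, as the abstract describes.
mses = {h: train_mlp(X, y, h) for h in (2, 4, 8)}
```

Selecting the architecture with the lowest MSE on held-out data would complete the validation step the abstract mentions.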
Procedia PDF Downloads 221
2067 A Study on the Effect of Design Factors of Slim Keyboard’s Tactile Feedback
Authors: Kai-Chieh Lin, Chih-Fu Wu, Hsiang Ling Hsu, Yung-Hsiang Tu, Chia-Chen Wu
Abstract:
With the rapid development of computer technology, the design of computers and keyboards moves towards a trend of slimness. The change in mobile input devices directly influences users’ behavior. Although multi-touch applications allow entering text through a virtual keyboard, the performance, feedback, and comfort of the technology are inferior to a traditional keyboard, and while manufacturers have launched mobile touch keyboards and projection keyboards, their performance has not been satisfying. Therefore, this study examined the design factors of slim pressure-sensitive keyboards. The factors were evaluated with an objective evaluation (accuracy and speed) and a subjective evaluation (operability, recognition, feedback, and difficulty) depending on the shape (circle, rectangle, and L-shaped), thickness (flat, 3 mm, and 6 mm), and actuation force (35±10 g, 60±10 g, and 85±10 g) of the keys. Moreover, MANOVA and the Taguchi method (using signal-to-noise ratios) were applied to find the optimal level of each design factor. The research participants were divided into two groups by typing speed (threshold: 30 words/minute). Considering the multitude of variables and levels, the experiments were implemented using a fractional factorial design, and a representative model of the research samples was established for input task testing. The findings showed that participants with low typing speed relied primarily on vision to recognize the keys, whereas those with high typing speed relied on tactile feedback, which was affected by the thickness and force of the keys. In the objective and subjective evaluations, the combination of design factors yielding the highest performance and satisfaction was identified as L-shaped, 3 mm, and 60±10 g. The learning curve was analyzed against a traditional standard keyboard to investigate the influence of user experience on keyboard operation.
The results indicated that even the optimal combination provided input performance inferior to that of a standard keyboard. The results could serve as a reference for the development of related products in industry and can be applied broadly to touch devices and input interfaces that people interact with.
Keywords: input performance, mobile device, slim keyboard, tactile feedback
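The Taguchi signal-to-noise analysis mentioned above can be illustrated with a short sketch. For a larger-the-better response such as typing accuracy, the S/N ratio is eta = -10*log10((1/n) * sum(1/y_i^2)), and the factor level with the highest mean S/N is taken as optimal. The trial values below are invented, not the study's data.

```python
import numpy as np

def sn_larger_the_better(y):
    """Taguchi S/N ratio for a larger-the-better response (e.g., accuracy, speed)."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# Hypothetical accuracy scores (%) for the three key-thickness levels.
trials = {"flat": [82.0, 85.0], "3mm": [91.0, 93.0], "6mm": [88.0, 86.0]}
sn = {level: sn_larger_the_better(y) for level, y in trials.items()}
best_level = max(sn, key=sn.get)  # level with the highest S/N ratio
```

Repeating this per design factor, then combining the winning levels, reproduces the "optimal combination" logic of the abstract.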
Procedia PDF Downloads 299
2066 Framework for Socio-Technical Issues in Requirements Engineering for Developing Resilient Machine Vision Systems Using Levels of Automation through the Lifecycle
Authors: Ryan Messina, Mehedi Hasan
Abstract:
This research examines the impact of using data to generate performance requirements for automated visual inspection using machine vision. The results are intended to inform design and to show how projects can smooth the transfer of tacit knowledge into an algorithm. We propose a framework for specifying machine vision systems that uses varying levels of automation as contingency planning to reduce data processing complexity. Using data assists in extracting tacit knowledge from those who can perform the manual tasks, which helps in designing the system; this means that real data from the system is always referenced, minimizing errors between participating parties. We propose three indicators for knowing whether a project is at high risk of failing to meet requirements related to accuracy and reliability. All systems tested achieved better integration into operations after the framework was applied.
Keywords: automation, contingency planning, continuous engineering, control theory, machine vision, system requirements, system thinking
Procedia PDF Downloads 204
2065 Alternator Fault Detection Using Wigner-Ville Distribution
Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi
Abstract:
This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure distinguishes three machine conditions: a shortened brush, a high-impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate feature classifier. In this work, an ANN (Artificial Neural Network) and an SVM (support vector machine) were compared, with performance evaluated by the mean-squared-error criterion, to determine the more suitable classifier. The modules work together to detect possible faulty conditions while the machine is running. To test the method's performance, a signal database was prepared by imposing the different conditions on a laboratory setup. The experimental results indicate that the method achieves satisfactory performance.
Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution
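As a sketch of the feature-extraction step, one time-slice of the discrete pseudo Wigner-Ville distribution of a signal x at time index n can be computed by Fourier-transforming the instantaneous autocorrelation r(tau) = x(n+tau) * conj(x(n-tau)) over a finite lag window. The code below is a minimal illustration on a synthetic tone, not the authors' implementation; note the characteristic frequency-axis doubling of the WVD lag kernel.

```python
import numpy as np

def pwvd_slice(x, n, L=64):
    """One time-slice of the discrete pseudo Wigner-Ville distribution of x at index n."""
    taus = np.arange(-L // 2, L // 2)
    r = np.zeros(L, dtype=complex)
    for i, tau in enumerate(taus):
        if 0 <= n + tau < len(x) and 0 <= n - tau < len(x):
            r[i] = x[n + tau] * np.conj(x[n - tau])  # instantaneous autocorrelation
    # FFT over the lag axis gives the energy distribution in frequency.
    return np.abs(np.fft.fft(np.fft.ifftshift(r)))

# Synthetic analytic tone at normalized frequency 0.125; for a pure tone the
# slice concentrates at bin 2*f0*L because the lag kernel oscillates at 2*f0.
f0, N, L = 0.125, 256, 64
x = np.exp(2j * np.pi * f0 * np.arange(N))
w = pwvd_slice(x, n=N // 2, L=L)
peak_bin = int(np.argmax(w))
```

Stacking such slices over time, and summarizing them (e.g., peak location and spread), would yield the feature vectors fed to the ANN/SVM classifiers.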
Procedia PDF Downloads 374
2064 Deformation Characteristics of Fire Damaged and Rehabilitated Normal Strength Concrete Beams
Authors: Yeo Kyeong Lee, Hae Won Min, Ji Yeon Kang, Hee Sun Kim, Yeong Soo Shin
Abstract:
Fire incidents have steadily increased over recent years according to the National Emergency Management Agency of South Korea. Even though most fire incidents with property damage occur in buildings, rehabilitation is often not properly done with consideration of structural safety. Therefore, this study aims at evaluating rehabilitation effects on fire-damaged normal strength concrete beams through experiments and finite element analyses. For the experiments, reinforced concrete beams were fabricated with a design concrete strength of 21 MPa. Two different cover thicknesses, 40 mm and 50 mm, were used. After curing, the fabricated beams were heated for 1 hour or 2 hours according to the ISO-834 standard time-temperature curve. Rehabilitation was done by removing the damaged part of the cover thickness and filling polymeric mortar into the removed part. Both fire-damaged beams and rehabilitated beams were tested with a four-point loading system to observe structural behaviors and the rehabilitation effect. To verify the experiments, finite element (FE) models for structural analysis were generated using the commercial software ABAQUS 6.10-3. For the rehabilitated beam models, integrated temperature-structural analyses were performed in advance to obtain the geometries of the fire-damaged beams, and the rehabilitated part was then added with the material properties of polymeric mortar. Three-dimensional continuum brick elements were used for both the temperature and structural analyses. The same loading and boundary conditions as in the experiments were applied to the rehabilitated beam models, and nonlinear geometrical analyses were performed. Test results showed that maximum loads of the rehabilitated beams were 8-10% higher than those of the non-rehabilitated beams and even 1-6% higher than those of the non-fire-damaged beam.
The stiffness of the rehabilitated beams was also larger than that of the non-rehabilitated beams but smaller than that of the non-fire-damaged beams. In addition, the structural behaviors predicted by the analyses showed a good rehabilitation effect, and the predicted load-deflection curves were similar to the experimental results. Both the experiments and the analytical results thus demonstrated a good rehabilitation effect on fire-damaged normal strength concrete beams. Furthermore, the proposed analytical method can be used to predict the structural behaviors of rehabilitated and fire-damaged concrete beams accurately without time- and cost-consuming experiments.
Keywords: fire, normal strength concrete, rehabilitation, reinforced concrete beam
Procedia PDF Downloads 508
2063 Advanced Technologies for Detector Readout in Particle Physics
Authors: Y. Venturini, C. Tintori
Abstract:
Given the continuous demand for improved readout performance in particle and dark matter physics, CAEN SpA is pushing forward the development of advanced technologies for detector readout. We present Digitizers 2.0, which builds on the success of the previous generation of Digitizers with expanded capabilities and a renewed user experience, introducing an open FPGA. The first product of the family is the VX2740 (64 ch, 125 MS/s, 16 bit) for advanced waveform recording and Digital Pulse Processing, fitting the special requirements of dark matter and neutrino experiments. In parallel, CAEN is developing the FERS-5200 platform, a Front-End Readout System designed to read out large multi-detector arrays, such as SiPMs, multi-anode PMTs, silicon strip detectors, wire chambers, GEMs, gas tubes, and others. This is a highly scalable distributed platform, based on small front-end cards synchronized and read out by a concentrator board, allowing extremely large experimental setups to be built. We plan to develop a complete family of cost-effective front-end cards tailored to specific detectors and applications. The first one available is the A5202, a 64-channel unit for SiPM readout based on the CITIROC ASIC by Weeroc.
Keywords: dark matter, digitizers, front-end electronics, open FPGA, SiPM
Procedia PDF Downloads 128
2062 Design and Motion Control of a Two-Wheel Inverted Pendulum Robot
Authors: Shiuh-Jer Huang, Su-Shean Chen, Sheam-Chyun Lin
Abstract:
A two-wheel inverted pendulum robot (TWIPR) is designed with two hub DC motors for human riding and motion control evaluation. Accelerometer and gyroscope sensors are chosen to measure the tilt angle and angular velocity of the inverted pendulum robot. The mobile robot’s position and velocity are estimated based on the DC motors’ built-in Hall sensors. The control kernel of this electric mobile robot is an embedded Arduino Nano board. A handlebar was designed to serve as the steering mechanism. An intelligent, model-free fuzzy sliding mode controller (FSMC) was employed as the main control algorithm for the robot's motion, with adjustments for different control purposes. Intelligent controllers were designed for the balance control and moving speed control of this robot under different operating conditions, and the control performance was evaluated based on experimental results.
Keywords: balance control, speed control, intelligent controller, two wheel inverted pendulum
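A common way to fuse the accelerometer and gyroscope readings mentioned above into a tilt estimate is a complementary filter: integrate the gyro rate for short-term accuracy and blend in the accelerometer-derived angle to cancel drift. This is a generic sketch with invented sensor characteristics, not the robot's actual firmware.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01            # 100 Hz control loop (hypothetical)
alpha = 0.98         # weight on the integrated gyro path
true_tilt = 30.0     # degrees; robot held at a constant lean
gyro_bias = 1.0      # deg/s of gyro drift (true angular rate is zero)

theta = 0.0          # fused tilt estimate, starts uninitialized at zero
for _ in range(1000):
    gyro_rate = 0.0 + gyro_bias                    # biased rate-gyro reading
    accel_tilt = true_tilt + rng.normal(0.0, 0.5)  # noisy accelerometer tilt
    # Complementary filter: trust the gyro over short horizons,
    # let the accelerometer slowly pull out the accumulated drift.
    theta = alpha * (theta + gyro_rate * dt) + (1.0 - alpha) * accel_tilt
```

The fused angle converges near the true tilt despite gyro bias and accelerometer noise; on the real robot this estimate would feed the balance controller.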
Procedia PDF Downloads 224
2061 Research on Development and Accuracy Improvement of an Explosion Proof Combustible Gas Leak Detector Using an IR Sensor
Authors: Gyoutae Park, Seungho Han, Byungduk Kim, Youngdo Jo, Yongsop Shim, Yeonjae Lee, Sangguk Ahn, Hiesik Kim, Jungil Park
Abstract:
In this paper, we present not only the development of an explosion-proof, portable combustible gas leak detector but also an algorithm to improve the accuracy of measured gas concentrations. The presented techniques apply a flame-proof enclosure and intrinsically safe explosion protection to an infrared gas leak detector, for the first time in Korea, and improve accuracy using a linearizing recursion equation and Lagrange interpolation polynomials. We also characterized the sensor and calibrated it over suitable input gases and output voltages. We then advanced the performance of combustible gas detectors by reflecting the demands of the gas safety management field. To compare the performance of two companies' detectors, we carried out measurement tests with eight standard gases prepared by the Korea Gas Safety Corporation. The experimental results demonstrated that our instrument achieves better detection accuracy than the other detectors.
Keywords: accuracy improvement, IR gas sensor, gas leak, detector
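The Lagrange interpolation step can be sketched as follows; the calibration points mapping sensor output voltage to gas concentration are invented stand-ins, not the detector's actual calibration table.

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolation polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)  # Lagrange basis factor
        total += term
    return total

# Hypothetical calibration points: sensor output voltage (V) vs. concentration (%LEL).
volts = [0.40, 0.85, 1.30, 1.80]
conc = [0.0, 25.0, 50.0, 100.0]
estimate = lagrange_interp(volts, conc, 1.05)  # concentration at an unseen voltage
```

The polynomial passes exactly through every calibration point, so the linearization error appears only between points; in practice the number and spacing of calibration gases bound that error.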
Procedia PDF Downloads 391
2060 A Review on Water Models of Surface Water Environment
Authors: Shahbaz G. Hassan
Abstract:
Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models include those based on a mechanistic approach and those that simulate water quality without considering a mechanism. Mechanistic models can be widely applied and are capable of long-term simulation, but with high complexity; therefore, more space is given to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes and estuaries, which limits their application range, and this paper introduces the principles and applications of water quality models for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.
Keywords: empirical models, mathematical, statistical, water quality
Procedia PDF Downloads 264
2059 Identifying Risk Factors for Readmission Using Decision Tree Analysis
Authors: Sıdıka Kaya, Gülay Sain Güven, Seda Karsavuran, Onur Toka
Abstract:
This study is part of an ongoing research project supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 114K404, and participation in this conference was supported by Hacettepe University Scientific Research Coordination Unit under Project Number 10243. Evaluation of hospital readmissions is gaining importance in terms of quality and cost and is becoming the target of national policies. In Turkey, hospital readmission is relatively new on the agenda and very few studies have been conducted on this topic. The aim of this study was to determine 30-day readmission rates and risk factors for readmission. Whether a readmission was planned, related to the prior admission, and avoidable was also assessed. The study was designed as a prospective cohort study. 472 patients hospitalized in the internal medicine departments of a university hospital in Turkey between February 1, 2015 and April 30, 2015 were followed up. Analyses were conducted using IBM SPSS Statistics version 22.0 and SPSS Modeler 16.0. The average age of the patients was 56, and 56% of the patients were female. Among these patients, 95 were readmitted, giving an overall readmission rate of 20% (95/472). However, only 31 readmissions were unplanned, an unplanned readmission rate of 6.5% (31/472). Of the 31 unplanned readmissions, 24 were related to the prior admission, and only 6 of the related readmissions were avoidable. To determine risk factors for readmission, we constructed a Chi-square Automatic Interaction Detector (CHAID) decision tree. CHAID decision trees are nonparametric procedures that make no assumptions about the underlying data. The algorithm determines how independent variables best combine to predict a binary outcome based on 'if-then' logic, partitioning each independent variable into mutually exclusive subsets based on homogeneity of the data.
The independent variables included in the analysis were: clinic of the department, occupied beds/total number of beds in the clinic at the time of discharge, age, gender, marital status, educational level, distance to residence (km), number of people living with the patient, any person to help with his/her care at home after discharge (yes/no), regular source (physician) of care (yes/no), day of discharge, length of stay, ICU utilization (yes/no), total comorbidity score, means for each of the 3 dimensions of the Readiness for Hospital Discharge Scale (patient’s personal status, patient’s knowledge, and patient’s coping ability), and number of daycare admissions within 30 days of discharge. To balance the data, we included all 95 readmitted patients (46.12%) but only 111 (53.88%) of the 377 non-readmitted patients. The risk factors found for readmission were total comorbidity score, gender, patient’s coping ability, and patient’s knowledge. The strongest identifying factor for readmission was the comorbidity score: if a patient's comorbidity score was higher than 1, the risk of readmission increased. The results of this study need to be validated on other data sets with more patients. However, we believe that this study will guide further studies of readmission and that CHAID is a useful tool for identifying risk factors for readmission.
Keywords: decision tree, hospital, internal medicine, readmission
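The core of CHAID's split selection, a chi-square test of each categorical predictor against the binary outcome, keeping the most significant, can be sketched as below. This is a single-split simplification on synthetic data, not the study's data set or a full CHAID implementation (which also merges categories and recurses).

```python
import numpy as np
from scipy.stats import chi2_contingency

def best_chi_square_split(predictors, outcome):
    """Pick the categorical predictor most significantly associated with a binary outcome."""
    best_name, best_p = None, 1.0
    for name, col in predictors.items():
        # Contingency table: category level x (not readmitted, readmitted).
        levels = np.unique(col)
        table = [[np.sum((col == lv) & (outcome == 0)),
                  np.sum((col == lv) & (outcome == 1))] for lv in levels]
        _, p, _, _ = chi2_contingency(np.array(table))
        if p < best_p:
            best_name, best_p = name, p
    return best_name, best_p

# Synthetic cohort: readmission risk rises with comorbidity score; gender is noise.
rng = np.random.default_rng(0)
n = 300
comorbidity = rng.integers(0, 3, n)       # scores 0, 1, 2
gender = rng.integers(0, 2, n)
readmitted = (rng.random(n) < np.where(comorbidity > 1, 0.40, 0.10)).astype(int)

split, p_value = best_chi_square_split(
    {"comorbidity": comorbidity, "gender": gender}, readmitted)
```

Full CHAID would recurse into each subset produced by the chosen split, which yields the tree-structured risk profile the study reports.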
Procedia PDF Downloads 256
2058 The Connection Between the International Law and the Legal Consultation on the Social Media
Authors: Amir Farouk Ahmed Ali Hussin
Abstract:
Social media platforms such as Facebook, LinkedIn and X (formerly Twitter) have experienced exponential growth and a remarkable adoption rate in recent years. They provide convenient means of online social interaction and communication with family, friends, and colleagues from around the corner or across the globe, and they have become an important part of daily digital interactions for more than one and a half billion users around the world. The personal information sharing practices that social network providers encourage have led to their success as innovative social interaction platforms. However, these practices have resulted in concerns with respect to privacy and security from different stakeholders. Addressing these privacy and security concerns is a must for these networks to be sustainable. Existing security and privacy tools may not be enough to address the concerns, and some precautions should be followed to protect users from the existing risks. In this research, we examine the various privacy and security issues and concerns pertaining to social media. We classify these issues and present a thorough discussion of their effects on the future of social networks. In addition, we present a set of precautionary measures that users can consider to address these issues.
Keywords: international legal, consultation mix, legal research, small and medium-sized enterprises, strategic International law, strategy alignment, house of laws, deployment, production strategy, legal strategy, business strategy
Procedia PDF Downloads 63
2057 Governance, Risk Management, and Compliance Factors Influencing the Adoption of Cloud Computing in Australia
Authors: Tim Nedyalkov
Abstract:
A business decision to move to the cloud brings fundamental changes in how an organization develops and delivers its Information Technology solutions. The accelerated pace of digital transformation across businesses and government agencies increases the reliance on cloud-based services, and collecting, managing, and retaining large amounts of data in cloud environments makes information security and data privacy protection essential. It becomes even more important to understand what key factors drive successful cloud adoption following the commencement of the Privacy Amendment (Notifiable Data Breaches) (NDB) Act 2017 in Australia, as the regulatory changes impact many organizations and industries. This quantitative correlational research investigated the governance, risk management, and compliance factors contributing to cloud security success and influencing the adoption of cloud computing within an organizational context after the commencement of the NDB scheme. The results and findings demonstrated that corporate information security policies, data storage location, management understanding of data governance responsibilities, and regular compliance assessments are the factors influencing cloud computing adoption. The research has implications for organizations, future researchers, practitioners, policymakers, and cloud computing providers seeking to meet rapidly changing regulatory and compliance requirements.
Keywords: cloud compliance, cloud security, data governance, privacy protection
Procedia PDF Downloads 116
2056 Digital Transformation: The Effect of Artificial Intelligence on the Efficiency of Financial Administrative Workers in Peru in 2024
Authors: Thiago Fabrizio Gavilano Farje, Marcelo Patricio Herrera Malpartida
Abstract:
This study examines the influence of artificial intelligence (AI) on the work efficiency of administrative employees in the financial sector of Metropolitan Lima, Peru, during the year 2024. Focusing on the relationship between AI implementation and work efficiency, it addresses specific variables such as decision-making, motivation, and employee productivity. To analyze the relationship between AI and work efficiency within the financial sector of Metropolitan Lima, the study evaluates how AI optimizes time spent on administrative tasks, examines how AI affects the agility of decision-making, and investigates the influence of AI on employee satisfaction and motivation. The research adopts a correlational and explanatory approach designed to establish and understand the connections between AI and work efficiency. A survey design adapted from an OECD study is used, applying questionnaires to a representative sample of administrative workers in the financial sector who incorporate AI into their functions. The target population includes administrative workers in the financial sector of Metropolitan Lima, estimated at 73,097 employees based on data from the Censo Nacional de Empresas y Establecimientos and studies by the BCRP. The sample, selected through simple random sampling, comprises 246 workers.
Keywords: business management, artificial intelligence, decision making, labor efficiency, financial sector
Procedia PDF Downloads 49
2055 A Dynamic Software Product Line Approach to Self-Adaptive Genetic Algorithms
Authors: Abdelghani Alidra, Mohamed Tahar Kimour
Abstract:
Genetic algorithms must adapt themselves at design time to cope with the specific requirements of the search problem and at runtime to balance exploration and convergence objectives. In a previous article, we showed that modeling and implementing Genetic Algorithms (GAs) using the software product line (SPL) paradigm is valuable because GAs constitute a product family sharing a common code base. In the present article we propose to extend the use of the feature model of the genetic algorithm family to model the potential states of the GA in what is called a Dynamic Software Product Line (DSPL). The objective of this paper is the systematic generation of a reconfigurable architecture that supports the dynamics of the GA and is easily deduced from the feature model. The resulting GA is able to perform dynamic reconfiguration autonomously to speed up the convergence process while producing better solutions. Another important advantage of our approach is the exploitation of recent advances in the domain of dynamic SPLs to enhance the performance of GAs.
Keywords: self-adaptive genetic algorithms, software engineering, dynamic software product lines, reconfigurable architecture
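The runtime reconfiguration idea can be illustrated with a toy real-coded GA that autonomously switches its mutation "feature" between exploration and convergence configurations based on population diversity. This is a generic sketch of a self-adaptive GA on a made-up objective, not the paper's DSPL-generated architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    return -(x - 3.0) ** 2  # toy objective: maximum at x = 3

pop = rng.uniform(-10.0, 10.0, 30)
sigma = 1.0                               # current mutation configuration
best_x, best_f = pop[0], fitness(pop[0])  # best individual seen so far
for _ in range(100):
    f = fitness(pop)
    i = int(np.argmax(f))
    if f[i] > best_f:
        best_x, best_f = pop[i], f[i]
    # Binary tournament selection.
    idx = rng.integers(0, len(pop), (len(pop), 2))
    parents = np.where(f[idx[:, 0]] >= f[idx[:, 1]], pop[idx[:, 0]], pop[idx[:, 1]])
    # Arithmetic (blend) crossover with shuffled mates.
    mates = rng.permutation(parents)
    a = rng.random(len(pop))
    children = a * parents + (1.0 - a) * mates
    # Runtime reconfiguration: widen mutation when diversity collapses
    # (exploration mode), shrink it otherwise (convergence mode).
    sigma = min(2.0, sigma * 1.5) if children.std() < 0.05 else max(0.01, sigma * 0.95)
    pop = children + rng.normal(0.0, sigma, len(pop))
```

In the DSPL setting, the sigma switch would correspond to activating and deactivating variants of the mutation feature in the reconfigurable architecture rather than tweaking a scalar in place.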
Procedia PDF Downloads 285
2054 Numerical Model for Investigation of Recombination Mechanisms in Graphene-Bonded Perovskite Solar Cells
Authors: Amir Sharifi Miavaghi
Abstract:
We investigate recombination mechanisms in graphene-bonded perovskite solar cells using a numerical model in which doped-graphene structures are employed as the anode/cathode bonding semiconductor. The dark and light current density-voltage (J-V) curves are investigated by regression analysis. Loss mechanisms, such as the back-contact barrier and deep surface defects in the absorber layer, are determined by fitting the simulated cell performance to the measurements using the differential evolution global optimization algorithm. Cell performance is characterized by J-V curves examined at different temperatures and by the open-circuit voltage (Voc) under different light intensities as a function of temperature. Based on the proposed numerical model and the extracted loss mechanisms, our approach can be used to further improve the efficiency of the solar cell. Given the high demand for alternative energy sources, solar cells are good candidates for energy generation via the photovoltaic effect.
Keywords: numerical model, recombination mechanism, graphene, perovskite solar cell
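The fitting step described above, extracting device parameters by matching simulated to measured J-V curves with differential evolution, can be sketched with an ideal single-diode model. The model, parameter values and bounds below are illustrative stand-ins, not the cell model used in the study.

```python
import numpy as np
from scipy.optimize import differential_evolution

VT = 0.02585  # thermal voltage near 300 K, in volts

def diode_j(v, j_ph, log10_j0, n):
    """Ideal single-diode light J-V model (mA/cm^2): photocurrent minus diode term."""
    return j_ph - 10.0 ** log10_j0 * (np.exp(v / (n * VT)) - 1.0)

# Synthetic 'measured' curve generated from known parameters.
v = np.linspace(0.0, 0.6, 50)
j_meas = diode_j(v, 35.0, -8.0, 1.5)

def sse(params):
    # Sum of squared residuals between model and 'measurement'.
    return float(np.sum((diode_j(v, *params) - j_meas) ** 2))

result = differential_evolution(
    sse,
    bounds=[(20.0, 50.0), (-12.0, -4.0), (1.0, 2.0)],  # j_ph, log10(j0), ideality n
    seed=1,
)
j_ph_fit, log10_j0_fit, n_fit = result.x
```

Replacing the three-parameter diode model with a drift-diffusion simulator, and adding barrier and defect parameters, turns this sketch into the kind of loss-mechanism extraction the abstract describes.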
Procedia PDF Downloads 69
2053 An Innovation Decision Process View in an Adoption of Total Laboratory Automation
Authors: Chia-Jung Chen, Yu-Chi Hsu, June-Dong Lin, Kun-Chen Chan, Chieh-Tien Wang, Li-Ching Wu, Chung-Feng Liu
Abstract:
With fast advances in healthcare technology, various total laboratory automation (TLA) processes have been proposed. However, adopting TLA requires substantial funding. This study explores an early adoption experience by Taiwan’s large-scale hospital group, the Chimei Hospital Group (CMG), which owns three branch hospitals (Yongkang, Liouying and Chiali, in order of service scale), based on the five stages of Everett Rogers’ innovation-decision process.
1. Knowledge stage: Over the years, two weaknesses existed in the laboratory department of CMG: 1) only a few examination categories (e.g., sugar testing and HbA1c) could be completed and reported within a day during an outpatient clinical visit; 2) the Yongkang Hospital laboratory space was dispersed across three buildings, resulting in duplicated investment in analysis instruments and inconvenient manual specimen transportation. Thus, the senior management of the department raised a crucial question: was it time to redesign the laboratory department?
2. Persuasion stage: At the end of 2013, Yongkang Hospital’s new building and restructuring project created a great opportunity for the redesign of the laboratory department. However, not all laboratory colleagues had a consensus for change. Thus, the top managers arranged a series of benchmark visits to stimulate colleagues into being aware of and accepting TLA. Later, the director of the department submitted a formal report to the top management of CMG with the results of the benchmark visits, a preliminary feasibility analysis, potential benefits and so on.
3. Decision stage: The TLA suggestion was well supported by the top management of CMG, who finally decided to carry out the project with an instrument-leasing strategy. After the announcement of a request for proposal and several vendor briefings, CMG confirmed their laboratory automation architecture and completed the contracts. At the same time, a cross-department project team was formed and the laboratory department assigned a section leader to the National Taiwan University Hospital for one month of relevant training.
4. Implementation stage: During the implementation, the project team called regular meetings to review the results of the operations and to respond immediately with adjustments. The main project tasks included: 1) completing the preparatory work for beginning the automation procedures; 2) ensuring information security and privacy protection; 3) formulating automated examination process protocols; 4) evaluating the performance of the new instruments and the instrument connectivity; 5) ensuring good integration with the hospital information system (HIS)/laboratory information system (LIS); and 6) ensuring continued compliance with ISO 15189 certification.
5. Confirmation stage: In short, the core process changes include: 1) cancellation of signature seals on the specimen tubes; 2) transfer of daily examination reports to a data warehouse; 3) incorporation of routine pre-admission blood drawing and formal inpatient morning blood drawing into an automatically-prepared tube mechanism. The study summarizes the following continuous improvement orientations: (1) flexible reference range set-up for new instruments in the LIS; (2) restructuring of the specimen categories; (3) continuous review and improvement of the examination process; (4) whether to install tube (specimen) delivery tracks needs further evaluation.
Keywords: innovation decision process, total laboratory automation, health care
Procedia PDF Downloads 419
2052 High-Resolution ECG Automated Analysis and Diagnosis
Authors: Ayad Dalloo, Sulaf Dalloo
Abstract:
Electrocardiogram (ECG) recording is prone to complications on analysis by physicians, due to noise and artifacts, creating ambiguity that can lead to diagnostic error. Such drawbacks may be overcome with high-resolution methods such as discrete wavelet analysis and digital signal processing (DSP) techniques. The ECG signal analysis is implemented in three stages: ECG preprocessing, feature extraction and classification, with the aim of realizing high-resolution ECG diagnosis and improved detection of abnormal heart conditions. The preprocessing stage removes spurious artifacts (noise) due to factors such as muscle contraction, motion and respiration. ECG features are extracted by applying DSP and the suggested sloping method techniques. The measured features represent peak amplitude values and intervals of the P, Q, R, S, R’, and T waves on the ECG, along with other features such as ST elevation, QRS width, heart rate, electrical axis, and QR and QT intervals. The classification is performed using these extracted features and the criteria for cardiovascular diseases. The ECG diagnostic system was successfully applied to 12-lead ECG recordings for 12 cases. The system is provided with information enabling it to diagnose 15 different diseases. The physician’s and computer’s diagnoses agree in 90% of cases, with respect to the physician's diagnosis, and the time taken for diagnosis is 2 seconds. All of these operations are programmed in the Matlab environment.
Keywords: ECG diagnostic system, QRS detection, ECG baseline removal, cardiovascular diseases
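The QRS-detection stage can be illustrated with a minimal amplitude-threshold R-peak detector with a refractory period, applied here to a crude synthetic trace (unit spikes on low noise). This is a generic sketch; the real system's DSP pipeline is more elaborate.

```python
import numpy as np

def detect_r_peaks(x, threshold=0.5, refractory=72):
    """Indices where x crosses the threshold, at least `refractory` samples apart."""
    peaks, last = [], -refractory
    for i, v in enumerate(x):
        if v > threshold and i - last >= refractory:
            peaks.append(i)
            last = i
    return peaks

# Crude synthetic ECG: low-amplitude noise with unit R spikes at known positions.
rng = np.random.default_rng(0)
fs = 360                                  # sampling rate in Hz (hypothetical)
ecg = 0.05 * rng.standard_normal(2000)
true_r = [200, 800, 1400]
for r in true_r:
    ecg[r] += 1.0

peaks = detect_r_peaks(ecg)
heart_rate = 60.0 * fs / float(np.mean(np.diff(peaks)))  # beats per minute
```

RR intervals from the detected peaks give heart rate directly; the remaining wave amplitudes and intervals (P, Q, S, T, QT, ST) would be measured relative to each detected R peak.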
Procedia PDF Downloads 297
2051 Using of Particle Swarm Optimization for Loss Minimization of Vector-Controlled Induction Motors
Authors: V. Rashtchi, H. Bizhani, F. R. Tatari
Abstract:
This paper presents a new online loss minimization method for an induction motor drive. Among the many loss minimization algorithms (LMAs) for an induction motor, particle swarm optimization (PSO) has the advantages of fast response and high accuracy. However, the performance of PSO and other optimization algorithms depends on the accuracy of the modeling of the motor drive and its losses. In the development of the loss model, there is always a trade-off between accuracy and complexity. This paper presents a new online optimization to determine the optimum flux level for efficiency optimization of the vector-controlled induction motor drive. An induction motor (IM) model in d-q coordinates is referenced to the rotor magnetizing current. This transformation results in no leakage inductance on the rotor side, so the decomposition into d-q components in the steady-state motor model can be utilized in deriving the motor loss model. The suggested algorithm is simple to implement.
Keywords: induction machine, loss minimization, magnetizing current, particle swarm optimization
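The flux search can be sketched with a plain PSO minimizing a hypothetical loss-versus-flux curve in which copper loss falls and iron loss rises with magnetizing flux. The loss function and its coefficients are invented for illustration, not the paper's loss model; the analytical minimum of 0.3/phi + 0.5*phi^2 is at phi = 0.3**(1/3), about 0.669 per unit.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_loss(flux):
    # Hypothetical per-unit loss model: copper loss falls and iron loss
    # rises with magnetizing flux, giving a single interior optimum.
    return 0.3 / flux + 0.5 * flux ** 2

# Particle swarm over the admissible flux range.
lo, hi = 0.2, 1.2
pos = rng.uniform(lo, hi, 20)
vel = np.zeros(20)
pbest = pos.copy()
pbest_f = total_loss(pbest)
gbest = pbest[np.argmin(pbest_f)]
for _ in range(100):
    r1, r2 = rng.random(20), rng.random(20)
    # Inertia plus cognitive and social pulls, the standard PSO update.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    f = total_loss(pos)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]
```

In the online drive, `total_loss` would be replaced by the d-q loss model (or a measured loss signal), and `gbest` would set the flux command of the vector controller.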
Procedia PDF Downloads 633
2050 Effects of Aerobic Training on MicroRNA Let-7a Expression and Levels of Tumor Tissue IL-6 in Mice With Breast Cancer
Authors: Leila Anoosheh
Abstract:
Aim: The aim of this study was to assess the effects of aerobic training on microRNA let-7a expression and levels of tumor-tissue IL-6 in mice with breast cancer. Method: Twenty BALB/c mice (4-5 weeks old, 17 g body mass) were rendered cancerous by injection of estrogen-receptor-dependent breast cancer cells (MC4-L2) and divided into two groups: tumor-training (TT) and tumor-control (TC). The TT group then completed aerobic training for 6 weeks, 5 days per week (14-18 m/min). After tumor emergence, tumor width and length were measured with a digital caliper every week. Forty-eight hours after the last exercise session, the animals were sacrificed. Tissue samples were collected and stored at -70°C. Tumor tissue was homogenized, and let-7a expression and IL-6 levels were quantified with real-time PCR and an ELISA kit, respectively. Statistical analysis of let-7a was conducted with the REST software; repeated measures and independent t-tests were used to assess tumor size and IL-6, respectively. Results: Tumor size and IL-6 levels decreased significantly in the TT group compared with the TC group (p<0.05), and microRNA let-7a increased significantly in the TT group compared with the control group (p=0.000). Conclusion: The reduction in tumor size following aerobic exercise can be attributed to the loss of inflammatory factors such as IL-6. Given the up-regulating effect of aerobic exercise training on let-7a and its down-regulating effect on IL-6 in mice with breast cancer, this type of training could serve as adjuvant therapy in conjunction with other treatments for breast cancer. Keywords: breast cancer, aerobic training, microRNA let-7a, IL-6
Procedia PDF Downloads 432
2049 An Intelligent Tutoring System Enriched with 3D Virtual Reality for Dentistry Students
Authors: Meltem Eryılmaz
Abstract:
With the emergence of the COVID-19 outbreak, the socio-cultural, political, economic, and educational dynamics of the world went through a major change. Education was affected in particular, especially preclinical dentistry education, where students must accumulate a certain amount of real-time experience in endodontics and various other procedures. Virtual reality is the totality of digital and physical elements that make our five senses feel as if we really exist in a virtual world. Virtual reality, which is very popular today, has begun to be used in education: with the inclusion of developing technology in educational environments, virtual learning platforms have been designed to enrich students' learning experiences. The field of health is also affected by these developments, and the number of virtual reality applications developed for dentistry students is increasing day by day. The most widely used tools of this technology are virtual reality glasses, with which one can look in any direction in a world designed in 3D and navigate at will. This project produces solutions that respond to the different types of dental practice required of dentistry students through virtual reality applications. With this application, students who cannot work with patients during distance education, or who want to improve their skills at home, have unlimited opportunities to practice. Unity 2021, Visual Studio 2019, and the Cardboard SDK are used in the study. Keywords: dentistry, intelligent tutoring system, virtual reality, online learning, COVID-19
Procedia PDF Downloads 203
2048 Spectral Anomaly Detection and Clustering in Radiological Search
Authors: Thomas L. McCullough, John D. Hague, Marylesa M. Howard, Matthew K. Kiser, Michael A. Mazur, Lance K. McLean, Johanna L. Turk
Abstract:
Radiological search and mapping depends on the successful recognition of anomalies in large data sets which contain varied and dynamic backgrounds. We present a new algorithmic approach for real-time anomaly detection which is resistant to common detector imperfections, avoids the limitations of a source template library, and provides immediate and easily interpretable user feedback. This algorithm is based on a continuous wavelet transform for variance reduction and evaluates the deviation between a foreground measurement and a local background expectation using methods from linear algebra. We also present a technique for recognizing and visualizing spectrally similar clusters of data. This technique uses Laplacian Eigenmap Manifold Learning to perform dimensional reduction which preserves the geometric "closeness" of the data while maintaining sensitivity to outlying data. We illustrate the utility of both techniques on real-world data sets. Keywords: radiological search, radiological mapping, radioactivity, radiation protection
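The core template-free idea — score a foreground spectrum by its deviation from a local background expectation — can be sketched in a few lines; this is a simplified chi-square-style illustration, not the paper's wavelet-based algorithm, and the spectra here are synthetic Poisson counts:

```python
import numpy as np

def anomaly_score(foreground, background):
    """Deviation of a foreground spectrum from a local background expectation.

    Uses per-channel background mean and variance and returns a normalized
    sum of squared z-scores; no source template library is required.
    """
    mu = background.mean(axis=0)
    var = background.var(axis=0) + 1e-9        # avoid division by zero
    z = (foreground - mu) / np.sqrt(var)
    return float(np.sum(z ** 2) / foreground.size)

rng = np.random.default_rng(1)
bkg = rng.poisson(50, size=(200, 64)).astype(float)   # 200 local background spectra
quiet = rng.poisson(50, size=64).astype(float)        # background-like measurement
hot = quiet.copy()
hot[30:34] += 60.0                                    # injected photopeak, 4 channels
```

A background-like measurement scores near 1 (its per-channel deviations match the background variance), while the injected photopeak pushes the score well above that; a deployed system would maintain the background estimate dynamically as the detector moves.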
Procedia PDF Downloads 694
2047 Uncertainty Estimation in Neural Networks through Transfer Learning
Authors: Ashish James, Anusha James
Abstract:
The impressive predictive performance of deep learning techniques on a wide range of tasks has led to their widespread use. Estimating the confidence of these predictions is paramount for improving the safety and reliability of such systems. However, the uncertainty estimates provided by neural networks (NNs) tend to be overconfident and unreasonable. Ensembles of NNs typically produce good predictions, but their uncertainty estimates tend to be inconsistent. Motivated by these observations, this paper presents a framework that can quantitatively estimate uncertainties by leveraging advances in transfer learning, through a slight modification to existing training pipelines. The algorithm is developed with the intention of deployment in real-world problems that already boast good predictive performance, by reusing those pretrained models. The idea is to capture the behavior of the trained NNs on the base task by augmenting them with uncertainty estimates from a supplementary network. A series of experiments with known and unknown distributions shows that the proposed approach produces well-calibrated uncertainty estimates with high-quality predictions. Keywords: uncertainty estimation, neural networks, transfer learning, regression
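The "frozen base model plus supplementary uncertainty model" pattern can be illustrated with a toy numpy analogue (linear models standing in for the networks; the heteroscedastic data, features, and model forms are all illustrative assumptions, not the paper's architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Base task: a "pretrained" regressor, here a linear least-squares fit
# that is then frozen, as a stand-in for a trained NN.
x = rng.uniform(-3, 3, size=500)
noise_scale = 0.1 + 0.3 * np.abs(x)             # heteroscedastic noise
y = 2.0 * x + rng.normal(0.0, noise_scale)

X = np.column_stack([x, np.ones_like(x)])
w_base, *_ = np.linalg.lstsq(X, y, rcond=None)  # pretrained, then frozen

# Supplementary model: trained only on the frozen base model's squared
# residuals, so the base predictions are reused unchanged.
def var_features(x_):
    return np.column_stack([np.ones_like(x_), np.abs(x_), x_ ** 2])

residual2 = (y - X @ w_base) ** 2
w_unc, *_ = np.linalg.lstsq(var_features(x), residual2, rcond=None)

def predict_with_uncertainty(x_new):
    Xn = np.column_stack([x_new, np.ones_like(x_new)])
    mean = Xn @ w_base
    var = np.maximum(var_features(x_new) @ w_unc, 1e-9)  # clamp to positive
    return mean, np.sqrt(var)

mean, std = predict_with_uncertainty(np.array([0.0, 3.0]))
```

The frozen base model keeps its predictive accuracy, while the supplementary model learns that the noise grows with |x| and reports a larger standard deviation there, mirroring the paper's division of labor between the base network and the uncertainty head.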
Procedia PDF Downloads 135
2046 Seed Priming Treatments in Common Zinnia (Zinnia elegans) Using Some Plant Extracts
Authors: Atakan Efe Akpınar, Zeynep Demir
Abstract:
Seed priming technologies are frequently used to increase the germination potential and stress tolerance of seeds. These treatments can benefit native species as well as crops. Different priming treatments can be used depending on the type of plant and the morphology and physiology of the seed; they may be physical, chemical, and/or biological. To advance seed priming research, new ideas need to be brought into this technological sector of the agri-seed industry. This study addresses the question of whether seed priming with plant extracts can improve seed vigour and germination performance. By investigating the effects of plant-extract priming on various vigour parameters, the research aims to provide insights into the potential benefits of this treatment method. Seed priming was therefore carried out using several plant extracts. First, extracts prepared from plant leaves, roots, or fruit parts were obtained for use in the priming treatments. Seeds of common zinnia (Zinnia elegans) were then kept in solutions containing the plant extracts at 20°C for 48 hours; seeds without any treatment served as the control group. At the end of the priming applications, the seeds were surface-dried at 25°C and analyzed for vigour (normal germination rate, germination time, germination index, etc.). In the future, seed priming applications can expand into multidisciplinary research combining digital, bioinformatic, and molecular tools. Keywords: seed priming, plant extracts, germination, biology
Procedia PDF Downloads 74
2045 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process
Authors: Sonia Anand Knowlton
Abstract:
There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being. Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm
Procedia PDF Downloads 81
2044 Spherical Harmonic Based Monostatic Anisotropic Point Scatterer Model for RADAR Applications
Authors: Eric Huang, Coleman DeLude, Justin Romberg, Saibal Mukhopadhyay, Madhavan Swaminathan
Abstract:
High-performance computing (HPC) based emulators can be used to model the scattering from multiple stationary and moving targets for RADAR applications. These emulators rely on the radar cross section (RCS) of the targets being available in complex scenarios. Representing the RCS using tables generated from electromagnetic (EM) simulations is often cumbersome, leading to large storage requirements. This paper proposes a spherical-harmonic-based anisotropic scatterer model to represent the RCS of complex targets. The problem of finding the locations and reflection profiles of all scatterers can be formulated as a linear least-squares problem with a special sparsity constraint, which this paper solves using a modified Orthogonal Matching Pursuit algorithm. The results show that the spherical-harmonic-based scatterer model can effectively represent the RCS data of complex targets. Keywords: RADAR, RCS, high performance computing, point scatterer model
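Classic (unmodified) Orthogonal Matching Pursuit for the sparsity-constrained least-squares step can be sketched as follows; the random dictionary and 3-sparse "reflection profile" are synthetic stand-ins, not RCS data:

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal Matching Pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit least squares on the
    selected support, until `sparsity` atoms are chosen."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    coef = np.zeros(0)
    for _ in range(sparsity):
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0          # never re-pick a chosen atom
        support.append(int(np.argmax(correlations)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

# Recover a 3-sparse coefficient vector from 60 linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 80))
A /= np.linalg.norm(A, axis=0)           # unit-norm dictionary columns
x_true = np.zeros(80)
x_true[[7, 42, 71]] = [2.0, -2.0, 1.5]
y = A @ x_true
x_hat = omp(A, y, sparsity=3)
```

With enough measurements relative to the sparsity level, OMP recovers the support and coefficients exactly; the paper's modification of this baseline is not reproduced here.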
Procedia PDF Downloads 191
2043 Evaluating Data Maturity in Riyadh's Nonprofit Sector: Insights Using the National Data Maturity Index (NDI)
Authors: Maryam Aloshan, Imam Mohammad Ibn Saud, Ahmad Khudair
Abstract:
This study assesses the data governance maturity of nonprofit organizations in Riyadh, Saudi Arabia, using the National Data Maturity Index (NDI) framework developed by the Saudi Data and Artificial Intelligence Authority (SDAIA). Employing a survey designed around the NDI model, data maturity levels were evaluated across 14 dimensions using a 5-point Likert scale. The results reveal a spectrum of maturity levels among the organizations surveyed: while some medium-sized associations reached the ‘Defined’ stage, others, including large associations, fell within the ‘Absence of Capabilities’ or ‘Building’ phases, with no organizations achieving the advanced ‘Established’ or ‘Pioneering’ levels. This variation suggests an emerging recognition of data governance but underscores the need for targeted interventions to bridge the maturity gap. The findings point to a significant opportunity to elevate data governance capabilities in Saudi nonprofits through customized capacity-building initiatives, including training, mentorship, and best practice sharing. This study contributes valuable insights into the digital transformation journey of the Saudi nonprofit sector, aligning with national goals for data-driven governance and organizational efficiency. Keywords: nonprofit organizations, national data maturity index (NDI), Saudi Arabia, SDAIA, data governance, data maturity
Procedia PDF Downloads 15
2042 An Improved Mesh Deformation Method Based on Radial Basis Function
Authors: Xuan Zhou, Litian Zhang, Shuixiang Li
Abstract:
Mesh deformation using radial basis function (RBF) interpolation has been demonstrated to produce quality meshes at relatively little computational cost with a concise algorithm. However, it still suffers from limited deformation ability, especially under large deformations. In this paper, a pre-displacement improvement is proposed to address the problem that illegal meshes tend to appear near moving inner boundaries, owing to the large relative displacement of the nodes there. In this improvement, nodes near the inner boundaries are first associated with nearby boundary nodes, and a pre-displacement based on the displacements of the associated boundary nodes is added to the near-boundary nodes, bringing their displacement closer to the boundary deformation and improving the deformation capability. Several 2D and 3D numerical simulation cases show that the pre-displacement improvement significantly improves the mesh quality near inner boundaries and the deformation capability of the RBF method, with little increase in computational burden. Keywords: mesh deformation, mesh quality, background mesh, radial basis function
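The baseline RBF mesh-deformation step — interpolate prescribed boundary displacements to interior nodes — can be sketched as below. This is the standard method without the paper's pre-displacement improvement; the Wendland-style compact kernel, tiny node set, and support radius are illustrative choices:

```python
import numpy as np

def rbf_deform(boundary_pts, boundary_disp, query_pts, r=1.0):
    """Interpolate boundary displacements to query nodes with a
    Wendland-style compact RBF phi(d) = (1 - d/r)^4 (4 d/r + 1) for d < r."""
    def phi(d):
        q = np.clip(d / r, 0.0, 1.0)
        return (1 - q) ** 4 * (4 * q + 1)

    # Solve M c = u_b for the RBF coefficients (one column per dimension).
    d_bb = np.linalg.norm(boundary_pts[:, None] - boundary_pts[None], axis=-1)
    coeffs = np.linalg.solve(phi(d_bb), boundary_disp)
    # Evaluate the interpolant at the query nodes.
    d_qb = np.linalg.norm(query_pts[:, None] - boundary_pts[None], axis=-1)
    return phi(d_qb) @ coeffs

# Fixed outer square; the inner-boundary node at the center moves right by 0.2.
boundary = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
disp = np.array([[0, 0], [0, 0], [0, 0], [0, 0], [0.2, 0.0]], dtype=float)
interior = np.array([[0.5, 0.4], [0.25, 0.25]])
u = rbf_deform(boundary, disp, interior, r=1.0)
moved = interior + u
```

By construction the interpolant reproduces the boundary displacements exactly at the boundary nodes, and interior displacements decay smoothly with distance from the moving inner boundary — it is exactly this decay over a short distance that the paper's pre-displacement step improves for large motions.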
Procedia PDF Downloads 366
2041 A Reliable Multi-Type Vehicle Classification System
Authors: Ghada S. Moussa
Abstract:
Vehicle classification is an important task in traffic surveillance and intelligent transportation systems. Classification of vehicle images faces several problems, such as high intra-class vehicle variation, occlusion, shadow, and illumination changes. These problems and others must be considered to develop a reliable vehicle classification system. In this study, a reliable multi-type vehicle classification system based on the Bag-of-Words (BoW) paradigm is developed. The proposed system uses and compares four well-known classifiers — Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), k-Nearest Neighbour (KNN), and Decision Tree — to classify vehicles into four categories: motorcycles, small, medium, and large. Experiments on a large dataset show that the approach is efficient and reliable, classifying vehicles with an accuracy of 95.7%. The SVM outperforms the other classification algorithms in terms of both accuracy and robustness, alongside a considerable reduction in execution time. The innovation of the developed system is that it can serve as a framework for many vehicle classification systems. Keywords: vehicle classification, bag-of-words technique, SVM classifier, LDA classifier, KNN classifier, decision tree classifier, SIFT algorithm
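The BoW encoding step — quantize local descriptors (e.g., SIFT vectors) against a visual vocabulary and build a word-frequency histogram — can be sketched as follows; the random "vocabulary" and "descriptors" are placeholders for k-means centers and real SIFT features, and the downstream classifier (SVM, LDA, etc.) would consume these histograms:

```python
import numpy as np

def bow_histogram(descriptors, vocabulary):
    """Assign each local descriptor to its nearest visual word and return
    a normalized word-frequency histogram for the image."""
    # Squared Euclidean distance from every descriptor to every word.
    d2 = ((descriptors[:, None] - vocabulary[None]) ** 2).sum(axis=-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

rng = np.random.default_rng(0)
vocab = rng.standard_normal((50, 128))   # 50 visual words (e.g., k-means centers)
desc = rng.standard_normal((300, 128))   # 300 descriptors from one vehicle image
h = bow_histogram(desc, vocab)
```

Each image thus becomes a fixed-length, order-free vector regardless of how many keypoints it produced, which is what lets standard classifiers compare images of different sizes and viewpoints.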
Procedia PDF Downloads 358
2040 Optimal Load Control Strategy in the Presence of Stochastically Dependent Renewable Energy Sources
Authors: Mahmoud M. Othman, Almoataz Y. Abdelaziz, Yasser G. Hegazy
Abstract:
This paper presents a load control strategy based on a modification of the Big Bang Big Crunch optimization method. The proposed strategy determines the optimal load to be controlled and the corresponding time of control in order to minimize the energy purchased from the substation. The strategy helps the distribution network operator rely on renewable energy sources in supplying the system demand. The renewable energy sources used in this study are modeled with the diagonal band copula method and sequential Monte Carlo simulation in order to accurately capture the multivariate stochastic dependence between wind power, photovoltaic power, and the system demand. The proposed algorithms are implemented in the MATLAB environment and tested on the IEEE 37-node feeder. Several case studies are performed, and the subsequent discussion shows the effectiveness of the proposed algorithm. Keywords: big bang big crunch, distributed generation, load control, optimization, planning
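The unmodified Big Bang Big Crunch scheme that the strategy builds on alternates a random "big bang" scatter with a "big crunch" contraction to the fitness-weighted center of mass, shrinking the scatter radius each generation. A minimal sketch on a toy 2-D objective (not the paper's load-control formulation, whose modification is not reproduced here):

```python
import numpy as np

def bbbc_minimize(loss, lo, hi, n=30, iters=60, seed=0):
    """Big Bang Big Crunch: scatter candidates, contract to the
    fitness-weighted center of mass, shrink the radius, repeat."""
    rng = np.random.default_rng(seed)
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(n, dim))          # initial big bang
    for k in range(1, iters + 1):
        f = np.array([loss(p) for p in pop])
        w = 1.0 / (f + 1e-12)                         # lower loss -> more weight
        center = (w[:, None] * pop).sum(axis=0) / w.sum()   # big crunch
        radius = (hi - lo) / k                        # shrinking search radius
        pop = center + radius * rng.standard_normal((n, dim))  # next big bang
        pop = np.clip(pop, lo, hi)
    return center

lo, hi = np.array([-5.0, -5.0]), np.array([5.0, 5.0])
sphere = lambda p: float(((p - 1.0) ** 2).sum())      # minimum at (1, 1)
best = bbbc_minimize(sphere, lo, hi)
```

In the load-control setting the decision vector would instead encode which load to control and when, with the loss given by the energy purchased from the substation under the copula-sampled renewable scenarios.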
Procedia PDF Downloads 344
2039 Effect of Information and Communication Intervention on Stable Economic Growth in Ethiopia
Authors: Medhin Haftom Hailu
Abstract:
The advancement of information technology has significantly impacted Ethiopia's economy, driving innovation, productivity, job creation, and global connectivity. This research examined the impact of contemporary information and communication technologies (ICT) on Ethiopian economic progress. The study examined eight variables, including the mobile, internet, and fixed-line penetration rates, and five macroeconomic control variables. The results showed a positive and strong effect of ICT on economic growth in Ethiopia, with a 1% increase in the mobile, internet, and fixed-line penetration indexes associated with increases of 8.03%, 10.05%, and 30.06% in real GDP, respectively. The Granger causality test showed that all ICT variables Granger-caused economic growth, while economic growth Granger-caused only the mobile penetration rate. The study suggests that coordinated ICT infrastructure development, increased accessibility of telecom services, and increased competition in the telecom market are crucial for Ethiopia's economic growth. Ethiopia is attempting to establish a digital economy through massive investment in ICT quality and accessibility. The research thus enhances understanding of the economic impact of ICT expansion and can inform ICT policy interventions and future research. Keywords: economic growth, cointegration and error correction, ICT expansion, granger causality, penetration
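The bivariate Granger test used above asks whether lags of one series improve the prediction of another beyond that series' own lags. A minimal numpy sketch on synthetic data (an F-statistic from restricted vs. unrestricted lag regressions; this illustrates the test's logic only, not the study's cointegration/error-correction analysis):

```python
import numpy as np

def lag_matrix(v, lags):
    """Columns are v[t-1], ..., v[t-lags] for t = lags .. len(v)-1."""
    return np.column_stack([v[lags - k:len(v) - k] for k in range(1, lags + 1)])

def granger_f(x, y, lags=2):
    """F-statistic: do lags of x improve the prediction of y over y's own lags?"""
    target = y[lags:]
    ones = np.ones((len(target), 1))
    X_r = np.hstack([ones, lag_matrix(y, lags)])      # restricted: y's lags only
    X_u = np.hstack([X_r, lag_matrix(x, lags)])       # unrestricted: add x's lags
    rss = lambda X: float(np.sum(
        (target - X @ np.linalg.lstsq(X, target, rcond=None)[0]) ** 2))
    rss_r, rss_u = rss(X_r), rss(X_u)
    dof = len(target) - X_u.shape[1]
    return ((rss_r - rss_u) / lags) / (rss_u / dof)

# Synthetic pair in which x drives y but not vice versa.
rng = np.random.default_rng(0)
n = 300
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
```

Here `granger_f(x, y)` is large (x's lags sharply reduce the residual sum of squares for y) while `granger_f(y, x)` stays near 1, matching the one-directional causality the study reports between most ICT variables and GDP; in practice the F-statistic is compared against the F-distribution's critical value at the chosen significance level.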
Procedia PDF Downloads 80