Search results for: restricted Boltzmann machine
1226 Use of Particle Swarm Optimization for Loss Minimization of Vector-Controlled Induction Motors
Authors: V. Rashtchi, H. Bizhani, F. R. Tatari
Abstract:
This paper presents a new online loss-minimization algorithm for an induction motor drive. Among the many loss minimization algorithms (LMAs) for an induction motor, particle swarm optimization (PSO) has the advantages of fast response and high accuracy. However, the performance of PSO and other optimization algorithms depends on the accuracy of the modeling of the motor drive and its losses. In the development of the loss model, there is always a trade-off between accuracy and complexity. This paper presents a new online optimization to determine the optimum flux level for the efficiency optimization of the vector-controlled induction motor drive. An induction motor (IM) model in d-q coordinates is referenced to the rotor magnetizing current. This transformation results in no leakage inductance on the rotor side, so the decomposition into d-q components in the steady-state motor model can be utilized in deriving the motor loss model. The suggested algorithm is simple to implement.
Keywords: induction machine, loss minimization, magnetizing current, particle swarm optimization
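For illustration, a minimal PSO sketch in Python (assuming a simplified, hypothetical motor-loss model rather than the loss model derived in the paper) shows how a swarm can search for the optimum flux level:

```python
import numpy as np

# Minimal PSO sketch: search for the rotor flux level that minimizes a motor-loss model.
# The loss model below is a hypothetical placeholder, not the loss model derived in the paper.
def motor_loss(flux, torque=10.0, speed=150.0, r_s=0.5, r_r=0.4, k_fe=0.02):
    i_d = flux                       # flux-producing current (proportional to flux here)
    i_q = torque / max(flux, 1e-6)   # torque-producing current for the demanded torque
    copper = r_s * (i_d**2 + i_q**2) + r_r * i_q**2   # stator + rotor copper losses
    iron = k_fe * (speed * flux)**2                    # iron losses grow with flux and speed
    return copper + iron

def pso(cost, lo, hi, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, n_particles)          # particle positions (candidate flux levels)
    v = np.zeros(n_particles)                     # particle velocities
    pbest, pbest_cost = x.copy(), np.array([cost(xi) for xi in x])
    gbest = pbest[pbest_cost.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost(xi) for xi in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()]
    return gbest, cost(gbest)

flux_opt, loss_opt = pso(motor_loss, lo=0.1, hi=2.0)
print(f"optimum flux level ~ {flux_opt:.3f} (loss {loss_opt:.2f} W)")
```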
Procedia PDF Downloads 630
1225 Ultra-deformable Drug-free Sequessome™ Vesicles (TDT 064) for the Treatment of Joint Pain Following Exercise: A Case Report and Clinical Data
Authors: Joe Collins, Matthias Rother
Abstract:
Background: Oral non-steroidal anti-inflammatory drugs (NSAIDs) are widely used for the relief of joint pain during and post-exercise. However, oral NSAIDs increase the risk of systemic side effects, even in healthy individuals, and retard recovery from muscle soreness. TDT 064 (Flexiseq®), a topical formulation containing ultra-deformable drug-free Sequessome™ vesicles, has demonstrated equivalent efficacy to oral celecoxib in reducing osteoarthritis-associated joint pain and stiffness. TDT 064 does not cause NSAID-related adverse effects. We describe clinical study data and a case report on the effectiveness of TDT 064 in reducing joint pain after exercise. Methods: Participants with a pain score ≥3 (10-point scale) 12–16 hours post-exercise were randomized to receive TDT 064 plus oral placebo, TDT 064 plus oral ketoprofen, or ketoprofen in ultra-deformable phospholipid vesicles plus oral placebo. Results: In the 168 study participants, pain scores were significantly higher with oral ketoprofen plus TDT 064 than with TDT 064 plus placebo in the 7 days post-exercise (P = 0.0240), and recovery from muscle soreness was significantly longer (P = 0.0262). There was a low incidence of adverse events. These data are supported by clinical experience. A 24-year-old male professional rugby player suffered a traumatic Lisfranc fracture in March 2014 and underwent operative reconstruction. He had no relevant medical history and was not receiving concomitant medications. He had undergone anterior cruciate ligament reconstruction in 2008. The patient reported restricted training due to pain (score 7/10), stiffness (score 9/10) and poor function, as well as pain when changing direction and running on consecutive days. In July 2014 he started using TDT 064 twice daily at the recommended dose. In November 2014 he noted reduced pain on running (score 2-3/10), decreased morning stiffness (score 4/10) and improved joint mobility, and was able to return to competitive rugby without restrictions. No side effects of TDT 064 were reported. Conclusions: TDT 064 shows efficacy against exercise- and injury-induced joint pain, as well as that associated with osteoarthritis. It does not retard muscle soreness recovery after exercise compared with an oral NSAID, making it an alternative approach for the treatment of joint pain during and post-exercise.
Keywords: exercise, joint pain, TDT 064, phospholipid vesicles
Procedia PDF Downloads 479
1224 Redefining Infrastructure as Code Orchestration Using AI
Authors: Georges Bou Ghantous
Abstract:
This research delves into the transformative impact of Artificial Intelligence (AI) on Infrastructure as Code (IaaC) practices, specifically focusing on the redefinition of infrastructure orchestration. By harnessing AI technologies such as machine learning algorithms and predictive analytics, organizations can achieve unprecedented levels of efficiency and optimization in managing their infrastructure resources. AI-driven IaaC introduces proactive decision-making through predictive insights, enabling organizations to anticipate and address potential issues before they arise. Dynamic resource scaling, facilitated by AI, ensures that infrastructure resources can seamlessly adapt to fluctuating workloads and changing business requirements. Through case studies and best practices, this paper sheds light on the tangible benefits and challenges associated with AI-driven IaaC transformation, providing valuable insights for organizations navigating the evolving landscape of digital infrastructure management.
Keywords: artificial intelligence, infrastructure as code, efficiency optimization, predictive insights, dynamic resource scaling, proactive decision-making
Procedia PDF Downloads 32
1223 A Second Look at Gesture-Based Passwords: Usability and Vulnerability to Shoulder-Surfing Attacks
Authors: Lakshmidevi Sreeramareddy, Komalpreet Kaur, Nane Pothier
Abstract:
For security purposes, it is important to detect passwords entered by unauthorized users. With traditional alphanumeric passwords, if the content of a password is acquired and correctly entered by an intruder, it is impossible to differentiate the password entered by the intruder from those entered by the authorized user, because the password entries contain precisely the same character set. However, no two entries of a gesture-based password, even those entered by the person who created the password, will be identical. There are always variations between entries, such as the shape and length of each stroke, the location of each stroke, and the speed of drawing. It is possible that passwords entered by the unauthorized user contain higher levels of variation when compared with those entered by the authorized user (the creator). The difference in the levels of variation may provide cues to detect unauthorized entries. To test this hypothesis, we designed an empirical study and collected and analyzed the data with the help of machine-learning algorithms. The results of the study are significant.
Keywords: authentication, gesture-based passwords, shoulder-surfing attacks, usability
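As an illustration of the hypothesis, a sketch (with invented stroke features and synthetic gesture data, not the study's actual feature set) of how entry-to-template variation could feed a classifier:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative sketch (not the study's actual pipeline): each gesture entry is summarized by
# hypothetical variation features relative to the creator's enrollment template, and a binary
# classifier decides whether an entry is authorized (0) or an imitation (1).
def variation_features(entry, template):
    # entry/template: arrays of (x, y, t) samples; features capture shape, length and speed variation
    n = min(len(entry), len(template))
    d_xy = np.linalg.norm(entry[:n, :2] - template[:n, :2], axis=1)
    len_entry = np.sum(np.linalg.norm(np.diff(entry[:, :2], axis=0), axis=1))
    len_templ = np.sum(np.linalg.norm(np.diff(template[:, :2], axis=0), axis=1))
    speed_entry = len_entry / (entry[-1, 2] - entry[0, 2] + 1e-9)
    speed_templ = len_templ / (template[-1, 2] - template[0, 2] + 1e-9)
    return [d_xy.mean(), d_xy.std(), abs(len_entry - len_templ), abs(speed_entry - speed_templ)]

rng = np.random.default_rng(1)
template = np.column_stack([np.linspace(0, 1, 50), np.sin(np.linspace(0, 3, 50)), np.linspace(0, 1, 50)])
authorized = [template + rng.normal(0, 0.01, template.shape) for _ in range(30)]   # small variations
intruders = [template + rng.normal(0, 0.08, template.shape) for _ in range(30)]    # larger variations
X = np.array([variation_features(e, template) for e in authorized + intruders])
y = np.array([0] * 30 + [1] * 30)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```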
Procedia PDF Downloads 137
1222 A Reliable Multi-Type Vehicle Classification System
Authors: Ghada S. Moussa
Abstract:
Vehicle classification is an important task in traffic surveillance and intelligent transportation systems. Classification of vehicle images faces several problems, such as high intra-class vehicle variation, occlusion, shadow, and illumination. These problems and others must be considered to develop a reliable vehicle classification system. In this study, a reliable multi-type vehicle classification system based on the Bag-of-Words (BoW) paradigm is developed. Our proposed system used and compared four well-known classifiers: Linear Discriminant Analysis (LDA), Support Vector Machine (SVM), k-Nearest Neighbour (KNN), and Decision Tree, to classify vehicles into four categories: motorcycles, small, medium and large. Experiments on a large dataset show that our approach is efficient and reliable in classifying vehicles, with an accuracy of 95.7%. The SVM outperforms the other classification algorithms in terms of both accuracy and robustness, alongside a considerable reduction in execution time. The innovativeness of the developed system is that it can serve as a framework for many vehicle classification systems.
Keywords: vehicle classification, bag-of-words technique, SVM classifier, LDA classifier, KNN classifier, decision tree classifier, SIFT algorithm
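A minimal sketch of the Bag-of-Words plus SVM pipeline (random vectors stand in for the SIFT descriptors that would be extracted from real vehicle images; dataset sizes and parameters are illustrative):

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

# Sketch of a Bag-of-Words + SVM pipeline (illustrative; random vectors stand in for the
# 128-dimensional SIFT descriptors that would be extracted from real vehicle images).
rng = np.random.default_rng(0)
classes = ["motorcycle", "small", "medium", "large"]
images = [(rng.normal(c, 1.0, (rng.integers(80, 120), 128)), c) for c in range(4) for _ in range(40)]

all_desc = np.vstack([d for d, _ in images])
vocab = KMeans(n_clusters=50, n_init=10, random_state=0).fit(all_desc)   # visual vocabulary

def bow_histogram(desc):
    words = vocab.predict(desc)                          # assign each descriptor to a visual word
    hist = np.bincount(words, minlength=50).astype(float)
    return hist / hist.sum()                             # normalized word-frequency histogram

X = np.array([bow_histogram(d) for d, _ in images])
y = np.array([label for _, label in images])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)
clf = SVC(kernel="rbf", C=10).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te), "classes:", classes)
```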
Procedia PDF Downloads 355
1221 Future Metro Station: Remodeling Underground Environment Based on Experience Scenarios and IoT Technology
Authors: Joo Min Kim, Dongyoun Shin
Abstract:
The project Future Station (FS) seeks a deeper understanding of the metro station. The main idea of the project is to enhance the underground environment by combining new architectural design with IoT technology. This research presents an understanding of the metro environment, giving references to traditional design approaches and IoT-combined space design. Based on the analysis, this research presents design alternatives for two metro stations chosen as a testbed. It also presents how the FS platform responds to travelers and delivers benefits to metro operators. In conclusion, the project describes methods to build a future metro service and platform that understands travelers' intentions and gives appropriate services back to enhance the travel experience. It builds on contemporary technologies such as smart sensing grids, big data analysis, smart buildings, and machine learning.
Keywords: future station, digital lifestyle experience, sustainable metro, smart metro, smart city
Procedia PDF Downloads 298
1220 An Investigation of the Strength Deterioration of Forged Aluminum 6082 (T6) Alloy
Authors: Rajveer, Abhinav Saxena, Sanjeev Das
Abstract:
The study is focused on the strength of forged aluminum alloy (AA) 6082 (T6). Aluminum alloy 6082 belongs to the Al-Mg-Si family, which has a wide range of automotive applications. A decrease in the strength of AA 6082 alloy was observed after T6 treatment. The as-received (extruded), forged, and forged + heat treated samples were examined to understand the reason. These examinations were carried out by optical microscopy (OM), scanning electron microscopy (SEM) and X-ray diffraction (XRD) studies. It was observed that the defects had an insignificant effect on the alloy strength. The alloy samples were subjected to age hardening treatment and the time to achieve peak hardening was determined. Standard tensile specimens were prepared from as-received (extruded), forged, forged + solutionized and forged + solutionized + age hardened samples. Tensile tests were conducted on an Instron universal testing machine. It was observed that there was a significant drop in tensile strength in the case of the solutionized sample. The detailed study of the fractured samples showed that solutionizing after forging was not the best way to increase the strength of AA 6082 alloy.
Keywords: aluminum alloy 6082, strength, forging, age hardening
Procedia PDF Downloads 430
1219 Wear and Friction Behavior of Porcelain Coated with Polyurethane/SiO2 Coating Layer
Authors: Ching Yern Chee
Abstract:
Various loadings of nano-silica were added into polyurethane (PU) and then coated on porcelain substrates. The wear and friction properties of the porcelain substrates coated with polyurethane/nano-silica nanocomposite coatings were investigated using a reciprocating wear testing machine. The friction and wear of the polyurethane/nano-silica coated porcelain substrates were studied at different sliding speeds and applied loads. It was found that the optimum composition of nano-silica is 3 wt%, which gives the lowest friction coefficient and wear rate over all applied load ranges and sliding speeds. For the 3 wt% nano-silica filled PU coated porcelain substrate, increasing the sliding speed caused higher wear rates but a lower friction coefficient. Besides, the friction coefficient of the nano-silica filled PU coated porcelain substrate decreased, but the wear rate increased, with applied load.
Keywords: porcelain, nanocomposite coating, morphology, friction, wear behavior
Procedia PDF Downloads 528
1218 An Evaluation Model for Enhancing Flexibility in Production Systems through Additive Manufacturing
Authors: Angela Luft, Sebastian Bremen, Nicolae Balc
Abstract:
Additive manufacturing processes have entered large parts of the industry, and their range of applications has progressed and grown significantly over time. A major advantage of additive manufacturing is the innate flexibility of the machines. This correlates with the ongoing demand for creating highly flexible production environments. However, the potential of additive manufacturing technologies to enhance the flexibility of production systems has not yet been truly considered and quantified in a systematic way. In order to determine the potential of additive manufacturing technologies with regard to the strategic flexibility design of production systems, an integrated evaluation model has been developed that allows for the simultaneous consideration of both conventional and additive production resources. With the described model, an operational scope of action can be identified and quantified in terms of mix and volume flexibility, process complexity, and machine capacity that goes beyond the current cost-oriented approaches and offers a much broader and more holistic view of the potential of additive manufacturing. The respective evaluation model is presented in this paper.
Keywords: additive manufacturing, capacity planning, production systems, strategic production planning, flexibility enhancement
Procedia PDF Downloads 155
1217 Emerging Threats and Adaptive Defenses: Navigating the Future of Cybersecurity in a Hyperconnected World
Authors: Olasunkanmi Jame Ayodeji, Adebayo Adeyinka Victor
Abstract:
In a hyperconnected world, cybersecurity faces a continuous evolution of threats that challenge traditional defence mechanisms. This paper explores emerging cybersecurity threats such as malware, ransomware, phishing, social engineering, and Internet of Things (IoT) vulnerabilities. It delves into the inadequacies of existing cybersecurity defences in addressing these evolving risks and advocates for adaptive defence mechanisms that leverage AI, machine learning, and zero-trust architectures. The paper proposes collaborative approaches, including public-private partnerships and information sharing, as essential to building a robust defence strategy to address future cyber threats. The need for continuous monitoring, real-time incident response, and adaptive resilience strategies is highlighted to fortify digital infrastructures in the face of escalating global cyber risks.
Keywords: cybersecurity, hyperconnectivity, malware, adaptive defences, zero-trust architecture, internet of things vulnerabilities
Procedia PDF Downloads 19
1216 Assessing Relationships between Glandularity and Gray Level by Using Breast Phantoms
Authors: Yun-Xuan Tang, Pei-Yuan Liu, Kun-Mu Lu, Min-Tsung Tseng, Liang-Kuang Chen, Yuh-Feng Tsai, Ching-Wen Lee, Jay Wu
Abstract:
Breast cancer is the predominant malignant tumor in females. An increase in glandular density increases the risk of breast cancer. BI-RADS is a frequently used density indicator in mammography; however, it significantly overestimates the glandularity. Therefore, it is very important to accurately and quantitatively assess the glandularity by mammography. In this study, 20%, 30% and 50% glandularity phantoms were exposed using a mammography machine at 28, 30 and 31 kVp, and 30, 55, 80 and 105 mAs. Regions of interest (ROIs) were drawn to assess the gray level. The relationship between the glandularity and gray level under various compression thicknesses, kVp, and mAs was established by multivariable linear regression. A phantom verification was performed with automatic exposure control (AEC). The regression equation was obtained with an R-square value of 0.928. The average gray levels of the verification phantom were 8708, 8660 and 8434 for 0.952, 0.963 and 0.985 g/cm3, respectively. The percent differences of glandularity relative to the regression equation were 3.24%, 2.75% and 13.7%. We concluded that the proposed method could be clinically applied in mammography to improve the glandularity estimation and further increase the importance of breast cancer screening.
Keywords: mammography, glandularity, gray value, BI-RADS
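A minimal sketch of the regression step (synthetic placeholder data, not the reported phantom measurements) illustrating how glandularity can be fitted against gray level, kVp, mAs and compression thickness:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

# Illustrative sketch of the multivariable linear regression: glandularity is modelled as a
# linear function of mean ROI gray level, kVp, mAs and compression thickness. The data below
# are synthetic placeholders, not the phantom measurements reported in the abstract.
rng = np.random.default_rng(0)
n = 120
gray = rng.uniform(8000, 9000, n)          # mean gray level in the ROI
kvp = rng.choice([28, 30, 31], n)          # tube voltage
mas = rng.choice([30, 55, 80, 105], n)     # tube current-time product
thickness = rng.uniform(3.0, 6.0, n)       # compression thickness (cm)
glandularity = (120 - 0.01 * gray + 0.4 * kvp - 0.02 * mas - 1.5 * thickness
                + rng.normal(0, 1.0, n))   # synthetic ground truth (%)

X = np.column_stack([gray, kvp, mas, thickness])
model = LinearRegression().fit(X, glandularity)
pred = model.predict(X)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("R-square:", round(r2_score(glandularity, pred), 3))
```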
Procedia PDF Downloads 490
1215 Performance of Neural Networks vs. Radial Basis Functions When Forming a Metamodel for Residential Buildings
Authors: Philip Symonds, Jon Taylor, Zaid Chalabi, Michael Davies
Abstract:
With the world climate projected to warm and major cities in developing countries becoming increasingly populated and polluted, governments are tasked with the problem of overheating and air quality in residential buildings. This paper presents the development of an adaptable model of these risks. Simulations are performed using the EnergyPlus building physics software. An accurate metamodel is formed by randomly sampling building input parameters and training on the outputs of EnergyPlus simulations. Metamodels are used to vastly reduce the amount of computation time required when performing optimisation and sensitivity analyses. Neural Networks (NNs) are compared to a Radial Basis Function (RBF) algorithm when forming a metamodel. These techniques were implemented using the PyBrain and scikit-learn Python libraries, respectively. NNs are shown to perform around 15% better than RBFs when estimating overheating and air pollution metrics modelled by EnergyPlus.
Keywords: neural networks, radial basis functions, metamodelling, python machine learning libraries
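A minimal metamodelling sketch (a cheap analytic function stands in for the EnergyPlus outputs, and scikit-learn/SciPy replace the PyBrain implementation) comparing a neural network with an RBF surrogate:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.interpolate import RBFInterpolator

# Metamodel sketch: randomly sample "building parameters", train surrogates on the outputs of a
# stand-in simulator, and compare prediction error on unseen samples.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, (300, 5))            # randomly sampled building input parameters
X_test = rng.uniform(0, 1, (100, 5))
simulate = lambda X: np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.5 * X[:, 2] * X[:, 3] - X[:, 4]
y_train, y_test = simulate(X_train), simulate(X_test)   # surrogate "simulation" outputs

nn = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000, random_state=0).fit(X_train, y_train)
rbf = RBFInterpolator(X_train, y_train, kernel="thin_plate_spline")

def mape(y_true, y_pred):
    return np.mean(np.abs((y_true - y_pred) / (np.abs(y_true) + 1e-9))) * 100

print("NN  error (%):", round(mape(y_test, nn.predict(X_test)), 2))
print("RBF error (%):", round(mape(y_test, rbf(X_test)), 2))
```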
Procedia PDF Downloads 445
1214 The Role of Artificial Intelligence in Concrete Constructions
Authors: Ardalan Tofighi Soleimandarabi
Abstract:
Artificial intelligence has revolutionized the concrete construction industry and improved processes by increasing efficiency, accuracy, and sustainability. This article examines the applications of artificial intelligence in predicting the compressive strength of concrete, optimizing mixing plans, and improving structural health monitoring systems. Artificial intelligence-based models, such as artificial neural networks (ANN) and combined machine learning techniques, have shown better performance than traditional methods in predicting concrete properties. In addition, artificial intelligence systems have made it possible to improve quality control and real-time monitoring of structures, which helps in preventive maintenance and increases the life of infrastructure. Also, the use of artificial intelligence plays an effective role in sustainable construction by optimizing material consumption and reducing waste. Although the implementation of artificial intelligence is associated with challenges such as high initial costs and the need for specialized training, it will create a smarter, more sustainable, and more affordable future for concrete structures.
Keywords: artificial intelligence, concrete construction, compressive strength prediction, structural health monitoring, stability
Procedia PDF Downloads 14
1213 SVID: Structured Vulnerability Intelligence for Building Deliberated Vulnerable Environment
Authors: Wenqing Fan, Yixuan Cheng, Wei Huang
Abstract:
The diversity and complexity of modern IT systems make it almost impossible for internal teams to find vulnerabilities in all software before the software is officially released. The emergence of threat intelligence and vulnerability reporting policies has greatly reduced the burden on software vendors and organizations to find vulnerabilities. However, to prove the existence of a reported vulnerability, it is necessary but difficult for a security incident response team to build a deliberated vulnerable environment from the vulnerability report with limited and incomplete information. This paper presents a structured, standardized, machine-oriented vulnerability intelligence format that can be used to automate the orchestration of a Deliberated Vulnerable Environment (DVE). This paper highlights the important role of software configuration and proof-of-vulnerability specifications in vulnerability intelligence, and proposes a triad model, called DIR (Dependency Configuration, Installation Configuration, Runtime Configuration), to define software configuration. Finally, this paper also implements a prototype system to demonstrate that the orchestration of a DVE can be automated with such intelligence.
Keywords: DIR triad model, DVE, vulnerability intelligence, vulnerability recurrence
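A hypothetical sketch of how the DIR triad could be encoded as a machine-oriented record (field names are illustrative, not the SVID specification itself):

```python
import json
from dataclasses import dataclass, field, asdict

# Hypothetical encoding of the DIR triad (Dependency / Installation / Runtime configuration) as a
# machine-oriented record that a DVE orchestrator could consume; not the SVID format itself.
@dataclass
class DIRConfiguration:
    dependency: dict = field(default_factory=dict)    # packages, libraries, versions required
    installation: dict = field(default_factory=dict)  # how the vulnerable software is installed
    runtime: dict = field(default_factory=dict)       # options/services active when the flaw triggers

@dataclass
class VulnerabilityIntelligence:
    vuln_id: str
    software: str
    version: str
    proof_of_vulnerability: str                        # e.g. a request or input that triggers the flaw
    dir_config: DIRConfiguration

record = VulnerabilityIntelligence(
    vuln_id="EXAMPLE-2023-0001",
    software="example-web-app",
    version="1.2.3",
    proof_of_vulnerability="GET /search?q=<payload>",
    dir_config=DIRConfiguration(
        dependency={"os": "ubuntu:20.04", "packages": ["libexample>=2.0"]},
        installation={"method": "docker", "image": "example/web-app:1.2.3"},
        runtime={"env": {"DEBUG": "false"}, "ports": [8080]},
    ),
)
print(json.dumps(asdict(record), indent=2))   # serialized intelligence for automated orchestration
```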
Procedia PDF Downloads 119
1212 Modelling of Powered Roof Supports Work
Authors: Marcin Michalak
Abstract:
Due to increasing efforts to protect the natural environment, a change in the structure of energy resources can be observed: an increasing fraction of renewable energy sources. In many countries traditional underground coal mining is losing its significance, but there are still countries, such as Poland or Germany, in which coal-based technologies have the greatest share of total energy production. This necessitates efforts to limit the costs and negative effects of underground coal mining. The longwall complex is an essential part of underground coal mining. The safety and effectiveness of the work are strongly dependent on the diagnostic state of the powered roof supports. Building a useful and reliable diagnostic system requires a lot of data. As data cannot be acquired for every possible operating condition, it is important to be able to generate the required artificial working characteristics. In this paper a new approach to modelling the leg pressure in a single unit of a powered roof support is presented. The model is a result of the analysis of typical working cycles.
Keywords: machine modelling, underground mining, coal mining, structure
Procedia PDF Downloads 366
1211 A Minimally Invasive Approach Using Bio-Miniatures Implant System for Full Arch Rehabilitation
Authors: Omid Allan
Abstract:
The advent of ultra-narrow diameter implants initially offered an alternative to wider conventional implants. However, their design limitations have restricted their applicability primarily to overdentures and cement-retained fixed prostheses, often with unpredictable long-term outcomes. The introduction of the new Miniature Implants has revolutionized the field of implant dentistry, leading to a more streamlined approach. The utilization of Miniature Implants has emerged as a promising alternative to the traditional approach that entails traumatic sequential bone drilling procedures and the use of conventional implants for full and partial arch restorations. The innovative BioMiniatures Implant System serves as a groundbreaking bridge connecting mini implants with standard implant systems. This system allows practitioners to harness the advantages of ultra-small implants, enabling minimally invasive insertion and facilitating the application of fixed screw-retained prostheses, which were previously only available to conventional wider implant systems. This approach streamlines full and partial arch rehabilitation with minimal or even no bone drilling, significantly reducing surgical risks and complications for clinicians while minimizing patient morbidity. The ultra-narrow diameter and self-advancing features of these implants eliminate the need for invasive and technically complex procedures such as bone augmentation and guided bone regeneration (GBR), particularly in cases involving thin alveolar ridges. Furthermore, the absence of a microgap between the implant and abutment eliminates the potential for micro-leakage and micro-pumping effects, effectively mitigating the risk of marginal bone loss and future peri-implantitis. The cumulative experience of restoring over 50 full and partial arch edentulous cases with this system has yielded an outstanding success rate exceeding 97%. The long-term success with a stable marginal bone level in the study firmly establishes these implants as a dependable alternative to conventional implants, especially for full arch rehabilitation cases. Full arch rehabilitation with these implants holds the promise of providing a simplified solution for edentulous patients, who typically present with atrophic narrow alveolar ridges, eliminating the need for extensive GBR and bone augmentation to restore their dentition with fixed prostheses.
Keywords: mini-implant, biominiatures, miniature implants, minimally invasive dentistry, full arch rehabilitation
Procedia PDF Downloads 72
1210 Frequency Response of Complex Systems with Localized Nonlinearities
Authors: E. Menga, S. Hernandez
Abstract:
Finite Element Models (FEMs) are widely used in order to study and predict the dynamic properties of structures, and usually the prediction can be obtained with much more accuracy in the case of a single component than in the case of assemblies. Especially for structural dynamics studies, in the low and middle frequency range, most complex FEMs can be seen as assemblies made of linear components joined together at interfaces. From a modelling and computational point of view, these types of joints can be seen as localized sources of stiffness and damping and can be modelled as lumped spring/damper elements, most of the time characterized by nonlinear constitutive laws. On the other hand, most FE programs are able to run nonlinear analysis in the time domain. They treat the whole structure as nonlinear, even if there is one nonlinear degree of freedom (DOF) out of thousands of linear ones, making the analysis unnecessarily expensive from a computational point of view. In this work, a methodology to obtain the nonlinear frequency response of structures, whose nonlinearities can be considered as localized sources, is presented. The work extends the well-known Structural Dynamic Modification Method (SDMM) to a nonlinear set of modifications, and allows obtaining the Nonlinear Frequency Response Functions (NLFRFs) through an 'updating' process of the Linear Frequency Response Functions (LFRFs). A brief summary of the analytical concepts is given, starting from the linear formulation and examining the implications of the nonlinear one. The response of the system is formulated in both the time and frequency domains. First the modal database is extracted and the linear response is calculated. Secondly the nonlinear response is obtained through the NL SDMM, by updating the underlying linear behavior of the system. The methodology, implemented in MATLAB, has been successfully applied to estimate the nonlinear frequency response of two systems. The first one is a two-DOF spring-mass-damper system, and the second example takes into account a full aircraft FE model. In spite of the different levels of complexity, both examples show the reliability and effectiveness of the method. The results highlight a feasible and robust procedure, which allows a quick estimation of the effect of localized nonlinearities on the dynamic behavior. The method is particularly powerful when most of the FE model can be considered as acting linearly and the nonlinear behavior is restricted to few degrees of freedom. The procedure is very attractive from a computational point of view because the FEM needs to be run just once, which allows faster nonlinear sensitivity analyses and easier implementation of optimization procedures for the calibration of nonlinear models.
Keywords: frequency response, nonlinear dynamics, structural dynamic modification, softening effect, rubber
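A simplified sketch of the updating idea (an interpretation, not the authors' MATLAB implementation): the linear FRF of a two-DOF system is corrected, frequency by frequency, for a localized cubic spring using a first-harmonic (describing-function) equivalent stiffness:

```python
import numpy as np

# Sketch of updating a linear FRF for a localized nonlinearity: a cubic spring at DOF 2 is
# replaced by an amplitude-dependent equivalent stiffness and the response is iterated to
# convergence at each frequency. Matrices and coefficients are illustrative.
M = np.diag([1.0, 1.0])
K = np.array([[2000.0, -1000.0], [-1000.0, 2000.0]])
C = 0.002 * K                      # light proportional damping
beta = 5e6                         # cubic stiffness coefficient of the local nonlinearity (N/m^3)
F = np.array([1.0, 0.0])           # unit harmonic force applied at DOF 1

def nlfrf(omega, n_iter=50):
    x2 = 0.0                                               # start from the linear response
    for _ in range(n_iter):
        k_eq = 0.75 * beta * abs(x2) ** 2                  # describing function of the cubic spring
        K_mod = K.copy()
        K_mod[1, 1] += k_eq                                # local modification at the nonlinear DOF
        H = np.linalg.inv(K_mod + 1j * omega * C - omega ** 2 * M)
        x2_new = (H @ F)[1]
        if abs(abs(x2_new) - abs(x2)) < 1e-12:
            break
        x2 = x2_new
    return H @ F                                           # updated (nonlinear) response vector

freqs = np.linspace(1.0, 12.0, 200)                        # Hz
nl_response = np.array([nlfrf(2 * np.pi * f) for f in freqs])
print("peak |X1| =", np.max(np.abs(nl_response[:, 0])))
```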
Procedia PDF Downloads 265
1209 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission
Authors: Tingwei Shu, Dong Zhou, Chengjun Guo
Abstract:
Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission under conditions of large data volume, low SNR and restricted bandwidth. With the development of deep learning, semantic communication has further matured and is gradually being applied in the fields of the Internet of Things, Unmanned Aerial Vehicle cluster communication, remote sensing scenarios, etc. We propose an improved semantic communication system for the situation where the data volume is huge and the spectrum resources are limited during the transmission of remote sensing images. At the transmitter, we need to extract the semantic information of remote sensing images, but there are some problems. The traditional semantic communication system based on Convolutional Neural Networks (CNNs) cannot take into account both the global and local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder instead of the mainstream CNN-based one to extract the image semantic features. In this paper, we first perform pre-processing operations on remote sensing images to improve their resolution in order to obtain images with more semantic information. We use the wavelet transform to decompose the image into high-frequency and low-frequency components, perform bilinear interpolation on the high-frequency components and bicubic interpolation on the low-frequency components, and finally perform the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision-Transformer structure as the semantic coder to extract and transmit the semantic information of remote sensing images. The Vision-Transformer structure can better handle the huge data volume and extract better image semantic features, and its multi-layer self-attention mechanism better captures the correlation between semantic features and reduces redundant features. Secondly, to improve the coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear so as to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and image coding methods such as BPG and JPEG to verify that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
Keywords: semantic communication, transformer, wavelet transform, data processing
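A sketch of the wavelet pre-processing step as described (one interpretation, assuming a single-level 2-D DWT with OpenCV interpolation; the Vision-Transformer encoder itself is not shown):

```python
import numpy as np
import pywt
import cv2

# Low-frequency coefficients are upscaled with bicubic interpolation, high-frequency coefficients
# with bilinear interpolation, and the inverse DWT then yields an image of roughly twice the
# original resolution.
def wavelet_upscale(image, wavelet="haar"):
    cA, (cH, cV, cD) = pywt.dwt2(image.astype(np.float32), wavelet)   # low- and high-frequency bands
    size = (image.shape[1], image.shape[0])                           # cv2 expects (width, height)
    cA_up = cv2.resize(cA, size, interpolation=cv2.INTER_CUBIC)       # bicubic on low frequencies
    cH_up = cv2.resize(cH, size, interpolation=cv2.INTER_LINEAR)      # bilinear on high frequencies
    cV_up = cv2.resize(cV, size, interpolation=cv2.INTER_LINEAR)
    cD_up = cv2.resize(cD, size, interpolation=cv2.INTER_LINEAR)
    return pywt.idwt2((cA_up, (cH_up, cV_up, cD_up)), wavelet)        # ~2x resolution output

demo = (np.random.default_rng(0).random((128, 128)) * 255).astype(np.uint8)  # stand-in remote sensing tile
enhanced = wavelet_upscale(demo)
print(demo.shape, "->", enhanced.shape)   # (128, 128) -> (256, 256)
```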
Procedia PDF Downloads 77
1208 Algorithm for Path Recognition in-between Tree Rows for Agricultural Wheeled-Mobile Robots
Authors: Anderson Rocha, Pedro Miguel de Figueiredo Dinis Oliveira Gaspar
Abstract:
Machine vision has been widely used in agriculture in recent years as a tool to promote the automation of processes and increase levels of productivity. The aim of this work is the development of a path recognition algorithm based on image processing to guide a terrestrial robot in between tree rows. The proposed algorithm was developed using the software MATLAB, and it uses several image processing operations, such as threshold detection, morphological erosion, histogram equalization and the Hough transform, to find edge lines along tree rows in an image and to create a path to be followed by a mobile robot. To develop the algorithm, a set of images of different types of orchards was used, which made it possible to construct a method capable of identifying paths between trees of different heights and aspects. The algorithm was evaluated using several images with different quality characteristics, and the results showed that the proposed method can successfully detect a path in different types of environments.
Keywords: agricultural mobile robot, image processing, path recognition, Hough transform
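A minimal Python/OpenCV sketch of the described operations on a synthetic image (the original algorithm was written in MATLAB; the synthetic scene and parameters here are illustrative):

```python
import numpy as np
import cv2

# Pipeline sketch: histogram equalization, thresholding, morphological erosion and the
# probabilistic Hough transform, followed by a simple midline estimate between the rows.
img = np.zeros((200, 200), dtype=np.uint8)                 # synthetic orchard image stand-in
cv2.line(img, (40, 199), (80, 0), 200, 5)                  # left tree row
cv2.line(img, (160, 199), (120, 0), 200, 5)                # right tree row

eq = cv2.equalizeHist(img)                                 # enhance contrast
_, binary = cv2.threshold(eq, 127, 255, cv2.THRESH_BINARY) # keep bright (row-like) pixels
eroded = cv2.erode(binary, np.ones((3, 3), np.uint8))      # remove small noise
lines = cv2.HoughLinesP(eroded, 1, np.pi / 180, threshold=50,
                        minLineLength=80, maxLineGap=20)   # detect edge lines along the rows

if lines is not None:
    xs_bottom = [x1 if y1 > y2 else x2 for x1, y1, x2, y2 in lines[:, 0]]
    xs_top = [x2 if y1 > y2 else x1 for x1, y1, x2, y2 in lines[:, 0]]
    # the navigation path is taken as the midline between the detected left and right rows
    path = [(int(np.mean(xs_bottom)), 199), (int(np.mean(xs_top)), 0)]
    print("detected", len(lines), "line segments; path endpoints:", path)
```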
Procedia PDF Downloads 146
1207 The Mental Health of Indigenous People During the COVID-19 Pandemic: A Scoping Review
Authors: Suzanne L. Stewart, Sarah J. Ponton, Mikaela D. Gabriel, Roy Strebel, Xinyi Lu
Abstract:
Indigenous Peoples have faced unique barriers to accessing and receiving culturally safe and appropriate mental health care while also facing daunting rates of mental health diagnoses and comorbidities. Indigenous researchers and clinicians have well established that the current mental health issues in Indigenous communities are a direct result of colonization, by way of intergenerational trauma throughout Canada's colonial history. Such mental health barriers and challenges have become exacerbated during the COVID-19 pandemic. Throughout the pandemic, access to mental health, cultural, ceremonial, and community services was severely impacted and restricted; however, it is these same cultural activities and community resources that are key to supporting Indigenous mental health from a traditional and community-based perspective. This research employed a unique combination of a thorough, analytical scoping review of the existing literature on Indigenous mental health in the COVID-19 pandemic, alongside narrative interviews employing an oral storytelling tradition methodology with key community informants who provide comprehensive cultural services to the Indigenous community of Toronto, as well as across Canada. These key informant interviews provided a wealth of insights into virtual transitions of Indigenous care and mental health support; intersections of historical underfunding and current financial navigation in technology infrastructure; accessibility and connection with Indigenous youth in remote locations; as well as maintaining community involvement and traditional practices in the current pandemic. Both the scoping review and narrative interviews were meticulously analyzed for overarching narrative themes to best explore the extent of the literature on Indigenous mental health and services during COVID-19; identify gaps in this literature; identify barriers and supports for the Indigenous community; and explore the intersection of community and cultural impacts on mental health. Themes of the scoping review included: Historical Context; Challenges in Culturally-Based Services; and Strengths in Culturally-Based Services. Meta-themes across narrative interviews included: Virtual Transitions; Financial Support for Indigenous Services; Health Service Delivery & Wellbeing; and Culture & Community Connection. The results of this scoping review and narrative interviews provide wide application and contribution to the mental health literature, as well as recommendations for policy, service provision, autonomy in Indigenous health and wellbeing, and crucial insights into the present and enduring mental health needs of Indigenous Peoples throughout the COVID-19 pandemic.
Keywords: indigenous community services, indigenous mental health, indigenous scoping review, indigenous peoples and Covid-19
Procedia PDF Downloads 240
1206 A Semi-supervised Classification Approach for Trend Following Investment Strategy
Authors: Rodrigo Arnaldo Scarpel
Abstract:
Trend following is a widely accepted investment strategy that adopts a rule-based trading mechanism rather than striving to predict market direction or relying on information gathering to decide when to buy and when to sell a stock. Thus, in trend following one must respond to the market movements that have recently happened and to what is currently happening, rather than to what will happen. The optimum, in a trend following strategy, is to catch a bull market at its early stage, ride the trend, and liquidate the position at the first evidence of the subsequent bear market. To apply the trend following strategy, one needs to find the trend and identify trade signals. In order to avoid false signals, i.e., to identify short-, mid- and long-term fluctuations and to separate noise from real changes in the trend, most academic works rely on moving averages and other technical analysis indicators, such as the moving average convergence divergence (MACD) and the relative strength index (RSI), to uncover intelligible stock trading rules following the trend following philosophy. Recently, some works have applied machine learning techniques for trading rule discovery. In those works, the process of rule construction is based on evolutionary learning, which aims to adapt the rules to the current environment and searches for the globally optimal rules in the search space. In this work, instead of focusing on the usage of machine learning techniques for creating trading rules, a time series trend classification employing a semi-supervised approach was used to identify early both the beginning and the end of upward and downward trends. Such a classification model can be employed to identify trade signals, and the decision-making procedure is that if an up-trend (down-trend) is identified, a buy (sell) signal is generated. Semi-supervised learning is used for model training when only part of the data is labeled, and semi-supervised classification aims to train a classifier from both the labeled and unlabeled data, such that it is better than the supervised classifier trained only on the labeled data. To illustrate the proposed approach, daily trading information was employed, including the open, high, low and closing values and volume from January 1, 2000 to December 31, 2022, of the São Paulo Exchange Composite index (IBOVESPA). Over this time period, consistent changes in price, upwards or downwards, were visually identified for assigning labels, leaving the rest of the days (when there is no consistent change in price) unlabeled. For training the classification model, a pseudo-label semi-supervised learning strategy was used, employing different technical analysis indicators. In this learning strategy, the core is to use unlabeled data to generate pseudo-labels for supervised training. For evaluating the achieved results, the annualized return and excess return, and the Sortino and Sharpe indicators were considered. Over the evaluated time period, the obtained results were very consistent and can be considered promising for generating the intended trading signals.
Keywords: evolutionary learning, semi-supervised classification, time series data, trading signals generation
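A toy sketch of the pseudo-label strategy on synthetic prices (simplified indicators and labeling thresholds, not the IBOVESPA data or the indicators used in the study):

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.semi_supervised import SelfTrainingClassifier

# A few days with clear up/down trends are labeled, the rest are marked -1 (unlabeled), and
# self-training propagates pseudo-labels to them before generating trade signals.
rng = np.random.default_rng(0)
price = pd.Series(100 + np.cumsum(rng.normal(0.05, 1.0, 1500)))   # synthetic closing prices

feat = pd.DataFrame({
    "ma_ratio": price.rolling(10).mean() / price.rolling(50).mean(),   # short vs long moving average
    "momentum": price.pct_change(20),                                   # 20-day return
    "volatility": price.pct_change().rolling(20).std(),
}).dropna()

future = price.shift(-20).loc[feat.index] / price.loc[feat.index] - 1   # used only to label "obvious" days
y = np.full(len(feat), -1)                    # -1 means unlabeled
y[future > 0.05] = 1                          # clear upward trend
y[future < -0.05] = 0                         # clear downward trend

model = SelfTrainingClassifier(GradientBoostingClassifier(random_state=0), threshold=0.8)
model.fit(feat.values, y)
signals = model.predict(feat.values)          # 1 -> buy signal, 0 -> sell signal
print("labeled:", (y != -1).sum(), "of", len(y), "| buy signals:", (signals == 1).sum())
```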
Procedia PDF Downloads 88
1205 Detection of Cardiac Arrhythmia Using Principal Component Analysis and XGBoost Model
Authors: Sujay Kotwale, Ramasubba Reddy M.
Abstract:
Electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease which, when left untreated, can lead to the death of the patient. Early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models were used for early detection of cardiac arrhythmia, but few of them achieved good results. In order to improve the performance, this paper implements principal component analysis (PCA) along with an XGBoost model. PCA was applied to the raw ECG signals, which suppressed redundant information and extracted significant features. The obtained significant ECG features were fed into the XGBoost model and the performance of the model was evaluated. In order to validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost
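A sketch of the PCA-plus-XGBoost stage (synthetic waveforms stand in for MIT-BIH heartbeat segments; loading and segmenting the real records is not shown):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# PCA compresses each beat into its most significant components, which are then classified.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 180)                                   # 180 samples per heartbeat segment
normal = np.array([np.sin(2 * np.pi * 5 * t) + rng.normal(0, 0.3, t.size) for _ in range(500)])
arrhythmic = np.array([np.sin(2 * np.pi * 7 * t + 1.0) + rng.normal(0, 0.5, t.size) for _ in range(500)])
X = np.vstack([normal, arrhythmic])
y = np.array([0] * 500 + [1] * 500)                          # 0 = normal beat, 1 = arrhythmic beat

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
pca = PCA(n_components=20).fit(X_tr)                         # keep the most significant components
clf = XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1, eval_metric="logloss")
clf.fit(pca.transform(X_tr), y_tr)
print("test accuracy:", clf.score(pca.transform(X_te), y_te))
```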
Procedia PDF Downloads 117
1204 A NoSQL-Based Approach for Real-Time Managing of Robotics Data
Authors: Gueidi Afef, Gharsellaoui Hamza, Ben Ahmed Samir
Abstract:
This paper deals with the continual growth of data, for which new data management solutions have emerged: NoSQL databases. They have spread across several areas such as personalization, profile management, real-time big data, content management, catalogs, customer views, mobile applications, the Internet of Things, digital communication and fraud detection. Nowadays, these database management systems are increasing in number. These systems store data very well, and with the trend of big data, new storage challenges demand new structures and methods for managing enterprise data. The new intelligent machines in the e-learning sector thrive on more data, so smart machines can learn more and faster. Robotics is the use case on which we focus our test. The implementation of NoSQL for robotics wrestles all the data the robots acquire into usable form, because with ordinary types of robotics we face very big limits in managing and finding the exact information in real time. Our proposed approach was demonstrated by experimental studies and a running example used as a use case.
Keywords: NoSQL databases, database management systems, robotics, big data
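A minimal sketch of storing and querying robot telemetry in a document store (assuming a local MongoDB instance at the default port; field names are illustrative, not the paper's schema):

```python
from datetime import datetime, timezone
import pymongo

# Schemaless documents let heterogeneous robots log different sensors into the same collection,
# while an index keeps "latest reading" queries fast.
client = pymongo.MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)
telemetry = client["robotics"]["telemetry"]
telemetry.create_index([("robot_id", pymongo.ASCENDING), ("ts", pymongo.DESCENDING)])

telemetry.insert_one({
    "robot_id": "arm-01",
    "ts": datetime.now(timezone.utc),
    "pose": {"x": 0.42, "y": 1.10, "theta": 0.08},
    "sensors": {"lidar_min_m": 0.9, "battery_pct": 76},
})

# Latest reading for one robot, returned in (near) real time thanks to the index.
latest = telemetry.find_one({"robot_id": "arm-01"}, sort=[("ts", pymongo.DESCENDING)])
print(latest)
```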
Procedia PDF Downloads 352
1203 Online Learning Versus Face-to-Face Learning: A Sentiment Analysis on General Education Mathematics in the Modern World of University of San Carlos School of Arts and Sciences Students Using Natural Language Processing
Authors: Derek Brandon G. Yu, Clyde Vincent O. Pilapil, Christine F. Peña
Abstract:
College students of Cebu province have been indoors since March 2020, and a challenge encountered is the sudden shift from face-to-face to online learning, together with the lack of empirical data on online learning in Higher Education Institutions (HEIs) in the Philippines. Sentiments on face-to-face and online learning will be collected from University of San Carlos (USC), School of Arts and Sciences (SAS) students regarding Mathematics in the Modern World (MMW), a General Education (GE) course. Natural Language Processing with machine learning algorithms will be used to classify the sentiments of the students. Results of the research study are the themes identified through topic modelling and the overall sentiments of the students in USC SAS.
Keywords: natural language processing, online learning, sentiment analysis, topic modelling
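A toy sketch of the two NLP steps, sentiment classification and topic modelling, on invented responses (the real study uses the USC SAS students' survey answers):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

# Invented survey responses and hand-assigned sentiment labels (1 = positive, 0 = negative).
responses = [
    "online learning is flexible and I can replay the recorded MMW lectures",
    "face to face classes make it easier to ask the teacher questions",
    "poor internet connection makes online learning stressful and frustrating",
    "I miss group work and interaction with classmates in the classroom",
    "the online modules are well organized and easy to follow",
    "too many requirements online and I feel isolated at home",
]
sentiment = [1, 1, 0, 0, 1, 0]

tfidf = TfidfVectorizer(stop_words="english")
clf = LogisticRegression().fit(tfidf.fit_transform(responses), sentiment)
print("predicted:", clf.predict(tfidf.transform(["online classes are convenient for me"])))

counts = CountVectorizer(stop_words="english").fit(responses)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts.transform(responses))
terms = counts.get_feature_names_out()
for k, topic in enumerate(lda.components_):
    print(f"topic {k}:", [terms[i] for i in topic.argsort()[-5:]])   # top words per theme
```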
Procedia PDF Downloads 244
1202 The Proactive Approach of Digital Forensics Methodology against Targeted Attack Malware
Authors: Mohamed Fadzlee Sulaiman, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin
Abstract:
Each organization has its own mechanisms to build up cyber defense capability to protect its information infrastructure from data breaches and cyber espionage. However, we cannot deny the possibility of failing to detect and stop cyber attacks, especially those targeting credential information and intellectual property (IP). In this paper, we would like to share a modern approach to effective digital forensic methodology in order to identify the artifacts and trace the trails of evidence while mitigating the infection on the target machine(s). The proposed approach allows the digital forensic investigation to be conducted while resuming business-critical operations after mitigating the infection and minimizing the risk of the identified attack recurring. Therefore, traditional digital forensics methodology has to be improved to be proactive, focusing not only on discovering the root cause and the threat actor but also on developing the relevant mitigation plan in order to prevent the same attack.
Keywords: digital forensic, detection, eradication, targeted attack, malware
Procedia PDF Downloads 273
1201 Exploring Acceptance of Artificial Intelligence Software Solution Amongst Healthcare Personnel: A Case in a Private Medical Centre
Authors: Sandra So, Mohd Roslan Ismail, Safurah Jaafar
Abstract:
The rapid proliferation of data in healthcare has provided an opportune platform for the creation of Artificial Intelligence (AI). AI has brought a paradigm shift for healthcare professionals, promising improvement in delivery and quality. This study aims to determine the perception of healthcare personnel regarding perceived ease of use, perceived usefulness, and subjective norm toward the attitude of accepting artificial intelligence. A cross-sectional, single-institution study of employees' perception of adopting AI in the hospital was conducted. The survey was conducted using a questionnaire adapted from the Technology Acceptance Model, with a four-point Likert scale. A total of 96 respondents, or 75.5% of the total population, responded. This study has shown the significant relationship and the importance of ease of use, perceived usefulness, and subjective norm to the acceptance of AI. The study concluded that strong acceptance of AI in practice was found mostly among the respondents with the most interaction with patients and clinical management.
Keywords: artificial intelligence, machine learning, perceived ease of use, perceived usefulness, subjective norm
Procedia PDF Downloads 225
1200 Improvement of Thermal Stability in Ethylene Methyl Acrylate Composites for Gasket Application
Authors: Pemika Ketsuwan, Pitt Supaphol, Manit Nithitanakul
Abstract:
A typical use of ethylene methyl acrylate (EMA) gaskets is in the manufacture of optical lenses, and often they deteriorate rapidly due to high temperature during the process. The objective of this project is to improve the thermal stability of the EMA copolymer gasket by preparing EMA with cellulose and silica composites. Hydroxypropyl methyl cellulose (HPMC) and carboxymethyl cellulose (CMC) were used in preparing the EMA/cellulose composites, and fumed silica (SiO2) was used in preparing the EMA/silica composites, with different amounts of filler (3, 5, 7, 10, 15 wt.%), using a twin-screw extruder at 160 °C; the test specimens were prepared with an injection molding machine. The morphology and dispersion of fillers in the EMA matrix were investigated by field emission scanning electron microscopy (FESEM). The thermal stability of the composites was determined by thermogravimetric analysis (TGA) and differential scanning calorimetry (DSC). Mechanical properties were evaluated by tensile testing. The developed composites were found to enhance the thermal and mechanical properties compared to those of the EMA copolymer alone.
Keywords: ethylene methyl acrylate, HPMC, silica, thermal stability
Procedia PDF Downloads 120
1199 Analysis of Metamaterial Permeability on the Performance of Loosely Coupled Coils
Authors: Icaro V. Soares, Guilherme L. F. Brandao, Ursula D. C. Resende, Glaucio L. Siqueira
Abstract:
Electrical energy can be wirelessly transmitted through resonant coupled coils that operate in the near-field region. Since the field in this region has an evanescent character, the efficiency of Resonant Wireless Power Transfer (RWPT) systems decreases proportionally with the inverse cube of the distance between the transmitter and receiver coils. Commercially available RWPT systems are restricted to short- and mid-range applications in which the distance between coils is less than or equal to the coil size. An alternative to overcome this limitation is applying metamaterial structures to enhance the coupling between coils, thus reducing the field decay along the distance between them. Metamaterials can be conceived as composite materials with a periodic or non-periodic structure whose unconventional electromagnetic behaviour is due to their unit cell disposition and chemical composition. This new kind of material has been used in frequency selective surfaces, invisibility cloaks, and leaky-wave antennas, among other applications. However, for RWPT it is mainly applied in superlenses, which are lenses that can overcome the optical limit and are made of left-handed media, that is, media with negative magnetic permeability and electric permittivity. As RWPT systems usually operate at wavelengths of hundreds of meters, the metamaterial unit cell size is much smaller than the wavelength. In this case, the electric and magnetic fields are decoupled; therefore the double-negative condition for superlenses is not required, and a negative magnetic permeability is enough to produce an artificial magnetic medium. In this work, the influence of the magnetic permeability of a metamaterial slab inserted between two loosely coupled coils is studied in order to find the condition that leads to the maximum transmission efficiency. The metamaterial used is formed by a subwavelength unit cell that consists of a capacitor-loaded split ring with an inner spiral, designed and optimized using the software Computer Simulation Technology. The unit cell permeability is experimentally characterized by the ratio of the transmission parameters between coils measured with and without the presence of the metamaterial slab. Early measurement results show that the transmission coefficient at the resonant frequency after the inclusion of the metamaterial is about three times higher than with just the two coils, which confirms the enhancement that this structure brings to RWPT systems.
Keywords: electromagnetic lens, loosely coupled coils, magnetic permeability, metamaterials, resonant wireless power transfer, subwavelength unit cells
Procedia PDF Downloads 145
1198 Large Neural Networks Learning From Scratch With Very Few Data and Without Explicit Regularization
Authors: Christoph Linse, Thomas Martinetz
Abstract:
Recent findings have shown that Neural Networks generalize even in over-parametrized regimes with zero training error. This is surprising, since it runs completely against traditional machine learning wisdom. In our empirical study we fortify these findings in the domain of fine-grained image classification. We show that very large Convolutional Neural Networks with millions of weights do learn with only a handful of training samples and without image augmentation, explicit regularization or pretraining. We train the architectures ResNet18, ResNet101 and VGG19 on subsets of the difficult benchmark datasets Caltech101, CUB_200_2011, FGVCAircraft, Flowers102 and StanfordCars with 100 classes and more, perform a comprehensive comparative study and draw implications for the practical application of CNNs. Finally, we show that VGG19 with 140 million weights learns to distinguish airplanes and motorbikes with up to 95% accuracy using only 20 training samples per class.
Keywords: convolutional neural networks, fine-grained image classification, generalization, image recognition, over-parameterized, small data sets
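A condensed training sketch (assumptions: Flowers102 from torchvision, a random small subset rather than an exactly balanced per-class split, and only a few epochs; no augmentation, pretraining or explicit regularization):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, Subset
from torchvision import datasets, models, transforms

# Train a ResNet-18 from random initialization on a very small fine-grained dataset.
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
train_full = datasets.Flowers102(root="data", split="train", download=True, transform=tfm)
subset = Subset(train_full, torch.randperm(len(train_full))[:1020].tolist())   # ~10 images/class
loader = DataLoader(subset, batch_size=32, shuffle=True, num_workers=2)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.resnet18(weights=None, num_classes=102).to(device)   # weights=None -> no pretraining
opt = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                                # the paper trains far longer; kept short here
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    print(f"epoch {epoch}: last batch loss {loss.item():.3f}")
```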
Procedia PDF Downloads 87
1197 Design and Manufacture of a Detection System for Patient's Unwanted Movements during Radiology and CT Scan
Authors: Anita Yaghobi, Homayoun Ebrahimian
Abstract:
One of the important tools that can help orthopedic doctors diagnose diseases is the imaging scan. Imaging techniques can help physicians see different parts of the body, including the bones, muscles, tendons, nerves, and cartilage. During a CT scan, a patient must be in the same position from the start to the end of radiation treatment. Patient movements are usually monitored by the technologists through closed-circuit television (CCTV) during the scan. If the patient makes a small movement, it is difficult for them to notice. In the present work, a simple device is fabricated to monitor patient movement. It uses an electronic sensing device. It continuously monitors the patient's position while the CT scan is in process. The device has been retrospectively tested on 51 patients whose movement and distance were measured. The results show that 25 patients moved 1 cm to 2.5 cm from their initial position during the CT scan. Hence, the device can potentially be used to control and monitor patient movement during CT scans and radiography. In addition, an audible alarm situated at the control panel of the control room is provided with this device to alert the technologists. It is an inexpensive, compact device which can be used in any CT scan machine.
Keywords: CT scan, radiology, X-ray, unwanted movement
Procedia PDF Downloads 458