Search results for: principal objects
1687 High Touch Objects and Infection Control in Intensive Care Units
Authors: Shakiera Sallie, Angela James
Abstract:
Healthcare-associated infections caused by the transmission of microorganisms, resulting in outbreaks in overcrowded intensive care units (ICUs), are a current global concern. Medical equipment and surfaces in the immediate patient zone, the high-touch objects, may become contaminated. A study was conducted across six intensive care units in a healthcare facility to determine the understanding and practice of the cleaning of high-touch objects (HTOs), and an intervention program was undertaken. A mixed-method approach with the selection of ICUs, HTOs, and healthcare personnel was used. Data collection included ultraviolet (UV) instruments, a questionnaire, and an intervention. In the pre-intervention, 41 (52.5%) of the healthcare personnel (n=78) rated their understanding of HTOs as “sufficient”; post-intervention, it was 67 (75%) (n=89), p=0.0015, indicating an improvement. The UV stamp compliance percentage, indicating whether cleaning of the HTOs had taken place across the six intensive care units, ranged from 0% to 88% before the intervention and from 67% to 91% after. An intervention program on the cleaning of HTOs and the transmission cycle of microorganisms in the ICUs enhanced the healthcare personnel’s understanding and practices regarding the importance of environmental cleaning.
Keywords: high touch objects, infections, intensive care units, intervention program, microorganisms
Procedia PDF Downloads 146
1686 Comparison of Tensile Strength and Folding Endurance of (FDM Process) 3D Printed ABS and PLA Materials
Authors: R. Devicharan
Abstract:
3D printing is expected to play a vital role in our lives within a short span of time. The possibilities for creativity and speed in manufacturing offered by the various 3D printing processes are infinite. This study is performed on the FDM (Fused Deposition Modelling) method of 3D printing, one of the predominant 3D printing technologies. It focuses on the physical properties of objects produced by 3D printing, which determine the applications of the printed objects. This paper specifically studies the tensile strength and folding endurance of objects printed by the FDM method using ABS (Acrylonitrile Butadiene Styrene) and PLA (Poly Lactic Acid) plastic materials. The study is performed in a controlled environment with specific machine settings. Appropriate tables and graphs are plotted, and research analysis techniques are used to analyse, verify, and validate the experimental results.
Keywords: FDM process, 3D printing, ABS for 3D printing, PLA for 3D printing, rapid prototyping
Procedia PDF Downloads 599
1685 Vibration Imaging Method for Vibrating Objects with Translation
Authors: Kohei Shimasaki, Tomoaki Okamura, Idaku Ishii
Abstract:
We propose a vibration imaging method for high frame rate (HFR)-video-based localization of vibrating objects with large translations. When the ratio of the translation speed of a target to its vibration frequency is large, obtaining its frequency response in image intensities becomes difficult because one or no waves are observable at the same pixel. Our method can precisely localize moving objects with vibration by virtually translating multiple image sequences for pixel-level short-time Fourier transform to observe multiple waves at the same pixel. The effectiveness of the proposed method is demonstrated by analyzing several HFR videos of flying insects in real scenarios.
Keywords: HFR video analysis, pixel-level vibration source localization, short-time Fourier transform, virtual translation
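The pixel-level frequency analysis at the heart of this method can be sketched as a windowed FFT on a single pixel's intensity trace. All numbers below (frame rate, vibration frequency, window length) are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical single-pixel analysis: a 500 fps camera observes a pixel whose
# intensity oscillates at 30 Hz (both numbers are assumptions for this sketch).
fps = 500
n_frames = 1024
t = np.arange(n_frames) / fps
vib_freq = 30.0
intensity = 128.0 + 20.0 * np.sin(2 * np.pi * vib_freq * t)

# One window of a short-time Fourier transform: Hann-weighted FFT of the
# mean-removed intensity trace.
window = np.hanning(n_frames)
spectrum = np.abs(np.fft.rfft((intensity - intensity.mean()) * window))
freqs = np.fft.rfftfreq(n_frames, d=1.0 / fps)

dominant = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {dominant:.1f} Hz")
```

In the paper's setting this transform would be applied per pixel, after virtually translating the image sequence so the moving target stays in the same pixel and multiple wave periods become observable there.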
Procedia PDF Downloads 108
1684 A New Direction of Urban Regeneration: Form-Based Urban Reconstruction through the Idea of Bricolage
Authors: Hyejin Song, Jin Baek
Abstract:
Based on the idea of bricolage, whereby a new meaning beyond that of the individual objects can be created through the combination and juxtaposition of various objects, this study finds a way of morphologically recomposing urban space by combining and juxtaposing the existing urban fabric with new fabric, and suggests this idea as a new direction for urban regeneration. The study sets the concept of bricolage as a philosophical ground for interpreting the contemporary urban situation. In this concept, urban objects such as buildings from various zeitgeists are positively considered as potential textures that can construct a meaningful context. Seoul, a city with a long history that has experienced both colonization and development, exhibits a dynamic urban structure full of objects from various periods. However, in contrast with the successful plazas and streets of Europe, the objects in Seoul do not form a meaningful context as public space, owing to thoughtless development. This study defines this situation as 'disorganized fabric'. Following the concept of bricolage, to find a way for these scattered objects to be organized into the context of a meaningful public space, this study first examines cases of successful public space through morphological analysis. Second, it carefully explores urban space in Seoul and draws figure-ground diagrams to grasp the form of the current urban fabric composed of various urban objects. The exploration shows that many urban spaces, from Myeong-dong, one of the most vibrant commercial districts in Seoul, to declining residential areas, possess a potential fabric that could become a meaningful context through only small adjustments of the relationships between existing objects. The study also confirms that, by inserting a new object with consideration of the form of the existing fabric, it is possible to give a new context, as a plaza, to an existing void that had been broken into several parts.
This study defines this approach as form-based urban reconstruction through the idea of bricolage and suggests that it could serve as a philosophical ground for successful urban regeneration.
Keywords: adjustment of relationships between existing objects, bricolage, morphological analysis of urban fabric, urban regeneration, urban reconstruction
Procedia PDF Downloads 318
1683 Analysis of Detection Concealed Objects Based on Multispectral and Hyperspectral Signatures
Authors: M. Kastek, M. Kowalski, M. Szustakowski, H. Polakowski, T. Sosnowski
Abstract:
Development of highly efficient security systems is one of the most urgent topics for science and engineering. There are many kinds of threats and many methods of prevention. It is very important to detect a threat as early as possible in order to neutralize it. One of the most challenging problems is the detection of dangerous objects hidden under a person's clothing. This problem is particularly important for the safety of airport passengers. In order to develop methods and algorithms to detect hidden objects, it is necessary to determine the thermal signatures of the objects of interest. Laboratory measurements were conducted to determine the thermal signatures of dangerous tools hidden under various clothes in different ambient conditions. The cameras used for the measurements operated in the spectral range 0.6-12.5 μm. An infrared imaging Fourier transform spectroradiometer working in the spectral range 7.7-11.7 μm was also used. Analysis of the registered thermograms and hyperspectral datacubes yielded the thermal signatures for two types of guns, two types of knives, and home-made explosive bombs. The determined thermal signatures will be used in the development of image analysis methods and algorithms implemented in the proposed monitoring systems.
Keywords: hyperspectral detection, multispectral detection, image processing, monitoring systems
Procedia PDF Downloads 348
1682 Efficient Principal Components Estimation of Large Factor Models
Authors: Rachida Ouysse
Abstract:
This paper proposes a constrained principal components (CnPC) estimator for efficient estimation of large-dimensional factor models when the errors are cross-sectionally correlated and the number of cross-sections (N) may be larger than the number of observations (T). Although the principal components (PC) method is consistent for any path of the panel dimensions, it is inefficient because the errors are treated as homoskedastic and uncorrelated. The new CnPC exploits the assumption of bounded cross-sectional dependence, which defines Chamberlain and Rothschild’s (1983) approximate factor structure, as an explicit constraint, and solves a constrained PC problem. The CnPC method is computationally equivalent to the PC method applied to a regularized form of the data covariance matrix. Unlike maximum likelihood type methods, the CnPC method does not require inverting a large covariance matrix and is thus valid for panels with N ≥ T. The paper derives a convergence rate and an asymptotic normality result for the CnPC estimators of the common factors. We provide feasible estimators and show in a simulation study that they are more accurate than the PC estimator, especially for panels with N larger than T, and than the generalized PC type estimators, especially for panels with N almost as large as T.
Keywords: high dimensionality, unknown factors, principal components, cross-sectional correlation, shrinkage regression, regularization, pseudo-out-of-sample forecasting
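The key computational claim, that CnPC is equivalent to the PC method applied to a regularized covariance matrix, can be illustrated in a few lines. The simple 90/10 shrinkage toward a diagonal target below is a stand-in assumption for illustration, not the paper's exact cross-sectional dependence constraint:

```python
import numpy as np

# Illustrative sketch of "PC on a regularized covariance". The shrinkage
# weights are assumptions, not the paper's CnPC constraint.
rng = np.random.default_rng(0)
T, N, r = 50, 80, 2                         # N > T, as the method allows
F = rng.normal(size=(T, r))                 # latent common factors
Lam = rng.normal(size=(N, r))               # factor loadings
X = F @ Lam.T + 0.5 * rng.normal(size=(T, N))   # panel with idiosyncratic noise

S = np.cov(X, rowvar=False)                 # N x N sample covariance
S_reg = 0.9 * S + 0.1 * np.diag(np.diag(S)) # dampen cross-sectional correlation

# Principal components of the regularized covariance (eigh returns
# eigenvalues in ascending order, so the top-r eigenvectors come last).
eigval, eigvec = np.linalg.eigh(S_reg)
loadings_hat = eigvec[:, -r:]
factors_hat = X @ loadings_hat              # estimated common factors, T x r
print(factors_hat.shape)
```

Note that no N x N matrix inversion is needed, which is what makes the approach feasible for panels with N ≥ T.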
Procedia PDF Downloads 150
1681 Towards Interconnectedness: A Study of Collaborative School Culture and Principal Curriculum Leadership
Authors: Fan Chih-Wen
Abstract:
The Ministry of Education (2014) released the 12-year National Basic Education Curriculum Syllabus. Curriculum implementation has evolved from a loose connection of cooperation to a closely structured relationship of coordination and collaboration. Collaboration opens the door of teachers' culture of isolation and of the closed classroom, allowing teachers to discuss educational issues from multiple perspectives and to achieve shared goals. The purpose of this study is to investigate the facilitating factors of a collaborative school culture and their implications for principal curriculum leadership. The development and implementation of the new curriculum involve collaborative governance across systems and levels, including cooperation between central government and schools. First, the study analyzes the connotation of the 12-year National Basic Education Curriculum; second, it analyzes the meaning of collaborative culture; third, it analyzes the motivating factors of collaborative culture. Finally, on this basis, it puts forward relevant suggestions for principal curriculum leadership.
Keywords: curriculum leadership, collaboration culture, teacher culture, school improvement
Procedia PDF Downloads 22
1680 Quantitative Ranking Evaluation of Wine Quality
Authors: A. Brunel, A. Kernevez, F. Leclere, J. Trenteseaux
Abstract:
Today, wine quality is evaluated only by wine experts, each with their own personal tastes, even if they may agree on some common features. Producers therefore have no unbiased way to independently assess the quality of their products. A tool is proposed here to evaluate wine quality through an objective ranking based on the variables entering wine elaboration, analysed with the principal component analysis (PCA) method. Actual climatic data are compared by measuring the relative distance between each considered wine, from which the general ranking is performed.
Keywords: wine, grape, weather conditions, rating, climate, principal component analysis, metric analysis
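The PCA-plus-relative-distance ranking idea might be sketched as follows; the five "wines" and four variables are fabricated for illustration and carry no oenological meaning:

```python
import numpy as np

# Toy ranking sketch: five "wines" described by four standardized
# elaboration/climate variables (all values fabricated for illustration).
rng = np.random.default_rng(1)
X = rng.normal(size=(5, 4))

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T            # project each wine onto the first two PCs

# Rank wines by their relative distance in the PCA plane (here: distance
# from the score centroid, closest first).
dist = np.linalg.norm(scores - scores.mean(axis=0), axis=1)
ranking = np.argsort(dist)
print(ranking)
```

The choice of distance (centroid distance here) is an assumption; any metric over the PCA scores would support the same kind of objective ranking.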
Procedia PDF Downloads 318
1679 Application of Principal Component Analysis and Ordered Logit Model in Diabetic Kidney Disease Progression in People with Type 2 Diabetes
Authors: Mequanent Wale Mekonen, Edoardo Otranto, Angela Alibrandi
Abstract:
Diabetic kidney disease is one of the main microvascular complications caused by diabetes. Several clinical and biochemical variables are reported to be associated with diabetic kidney disease in people with type 2 diabetes. However, their interrelations could distort the effect estimation of these variables for the disease's progression. The objective of the study is to determine how the biochemical and clinical variables in people with type 2 diabetes are interrelated with each other and their effects on kidney disease progression through advanced statistical methods. First, principal component analysis was used to explore how the biochemical and clinical variables intercorrelate with each other, which helped us reduce a set of correlated biochemical variables to a smaller number of uncorrelated variables. Then, ordered logit regression models (cumulative, stage, and adjacent) were employed to assess the effect of biochemical and clinical variables on the order-level response variable (progression of kidney function) by considering the proportionality assumption for more robust effect estimation. This retrospective cross-sectional study retrieved data from a type 2 diabetic cohort in a polyclinic hospital at the University of Messina, Italy. The principal component analysis yielded three uncorrelated components. These are principal component 1, with negative loading of glycosylated haemoglobin, glycemia, and creatinine; principal component 2, with negative loading of total cholesterol and low-density lipoprotein; and principal component 3, with negative loading of high-density lipoprotein and a positive load of triglycerides. The ordered logit models (cumulative, stage, and adjacent) showed that the first component (glycosylated haemoglobin, glycemia, and creatinine) had a significant effect on the progression of kidney disease. 
For instance, the cumulative odds model indicated that the first principal component (linear combination of glycosylated haemoglobin, glycemia, and creatinine) had a strong and significant effect on the progression of kidney disease, with an effect or odds ratio of 0.423 (P value = 0.000). However, this effect was inconsistent across levels of kidney disease because the first principal component did not meet the proportionality assumption. To address the proportionality problem and provide robust effect estimates, alternative ordered logit models, such as the partial cumulative odds model, the partial adjacent category model, and the partial continuation ratio model, were used. These models suggested that clinical variables such as age, sex, body mass index, medication (metformin), and biochemical variables such as glycosylated haemoglobin, glycemia, and creatinine have a significant effect on the progression of kidney disease.
Keywords: diabetic kidney disease, ordered logit model, principal component analysis, type 2 diabetes
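The dimension-reduction step described above can be illustrated with synthetic stand-ins for the correlated biochemical variables (no patient data are used); the point is that the resulting component scores are mutually uncorrelated, which avoids the distortion that correlated predictors cause in the downstream ordered logit models:

```python
import numpy as np

# Synthetic stand-ins for correlated biochemical variables (no patient data):
# glycemia and HbA1c are constructed to be correlated, creatinine independent.
rng = np.random.default_rng(42)
n = 200
glycemia = rng.normal(100, 15, n)
hba1c = 0.05 * glycemia + rng.normal(0, 0.5, n)
creatinine = rng.normal(1.0, 0.2, n)
X = np.column_stack([glycemia, hba1c, creatinine])

# Standardize, then PCA via eigendecomposition of the correlation matrix
Z = (X - X.mean(axis=0)) / X.std(axis=0)
eigval, eigvec = np.linalg.eigh(np.corrcoef(Z, rowvar=False))
components = Z @ eigvec[:, ::-1]         # scores ordered by explained variance

# The component scores are (numerically) uncorrelated
score_corr = np.corrcoef(components, rowvar=False)
off_diag = np.abs(score_corr - np.eye(3)).max()
print(f"max off-diagonal correlation: {off_diag:.2e}")
```

The ordered logit stage itself (cumulative, stage, and adjacent models) is not shown; these component scores would simply replace the raw variables as predictors there.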
Procedia PDF Downloads 39
1678 Modeling Factors Affecting Fertility Transition in Africa: Case of Kenya
Authors: Dennis Okora Amima Ondieki
Abstract:
Fertility transition has been identified as being affected by numerous factors. This research aimed to identify the factors that most affect fertility transition in Kenya. The factors were first extracted from the literature and grouped into demographic, social and economic, socio-cultural, reproductive, and modernization features; in total, 23 factors were identified for this study. The data for this study came from the Kenya Demographic and Health Surveys (KDHS) conducted in 1999-2003 and 2003-2008/9. The data were continuous, involving the mean birth order for the ten periods. Principal component analysis (PCA) was applied to the 23 factors and identified religion, region, education, and marital status as the key factors. PC scores were calculated for every point. The identified principal components were used as predictors in a multiple regression model, with fertility level as the response variable. The four components were found to affect fertility transition differently: fertility is affected positively by region and marital status and negatively by religion and education. These four factors can be considered in planning policy in Kenya and in Africa at large.
Keywords: fertility transition, principal component analysis, Kenya demographic health survey, birth order
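The two-stage procedure, PC scores extracted first and then used as predictors in a multiple regression, can be sketched as follows; the data and the four assumed effect sizes are synthetic, chosen only to mirror the abstract's four-component setup, and are not KDHS estimates:

```python
import numpy as np

# Synthetic sketch of the two-stage procedure: PC scores first, then a
# multiple regression of fertility level on the scores. The effect sizes
# (0.8, -0.5, 0.3, -0.2) are assumptions for illustration.
rng = np.random.default_rng(7)
n, p = 120, 23
X = rng.normal(size=(n, p))              # 23 candidate factors per point

# Stage 1: PC scores via SVD of the centered data, keeping four components
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
pc_scores = Xc @ Vt[:4].T

# Stage 2: multiple regression with the PC scores as predictors
true_effects = np.array([0.8, -0.5, 0.3, -0.2])
y = pc_scores @ true_effects + 0.1 * rng.normal(size=n)   # fertility level
A = np.column_stack([np.ones(n), pc_scores])              # add an intercept
beta, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(beta[1:], 2))             # recovered component effects
```

Because the PC scores are mutually uncorrelated, each fitted coefficient reflects one component's effect without being distorted by the others.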
Procedia PDF Downloads 100
1677 Hierarchical Clustering Algorithms in Data Mining
Authors: Z. Abdullah, A. R. Hamdan
Abstract:
Clustering is the process of grouping objects and data into clusters so that data objects within the same cluster are similar to each other. Clustering is one of the areas of data mining, and its algorithms can be classified into partitioning, hierarchical, density-based, and grid-based methods. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in eliminating current problems, as well as in deriving more robust and scalable clustering algorithms.
Keywords: clustering, unsupervised learning, algorithms, hierarchical
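For contrast with the four specialized algorithms surveyed, a generic bottom-up (agglomerative) hierarchical clustering run looks like this in SciPy; the two well-separated point clouds are fabricated test data:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Generic agglomerative clustering sketch with SciPy's hierarchical routines
# (not the CURE/ROCK/CHAMELEON/BIRCH variants discussed in the survey).
rng = np.random.default_rng(3)
blob_a = rng.normal(loc=0.0, scale=0.3, size=(10, 2))   # cluster near (0, 0)
blob_b = rng.normal(loc=5.0, scale=0.3, size=(10, 2))   # cluster near (5, 5)
X = np.vstack([blob_a, blob_b])

Z = linkage(X, method="average")                 # bottom-up merge tree
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the tree into 2 clusters
print(labels)
```

The surveyed algorithms improve on exactly this baseline: CURE and BIRCH target scalability to large datasets, while ROCK and CHAMELEON target more robust merge criteria than simple average-linkage distance.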
Procedia PDF Downloads 885
1676 Detection of Cardiac Arrhythmia Using Principal Component Analysis and Xgboost Model
Authors: Sujay Kotwale, Ramasubba Reddy M.
Abstract:
The electrocardiogram (ECG) is a non-invasive technique used to study and analyze various heart diseases. Cardiac arrhythmia is a serious heart disease that can lead to the death of the patient when left untreated. Early detection of cardiac arrhythmia would help doctors provide proper treatment of the heart. In the past, various algorithms and machine learning (ML) models have been used for early detection of cardiac arrhythmia, but few of them have achieved good results. In order to improve performance, this paper combines principal component analysis (PCA) with an XGBoost model. PCA was applied to the raw ECG signals to suppress redundant information and extract significant features. The significant ECG features obtained were fed into the XGBoost model, and the performance of the model was evaluated. To validate the proposed technique, raw ECG signals obtained from the standard MIT-BIH database were employed for the analysis. The results show that the performance of the proposed method is superior to several state-of-the-art techniques.
Keywords: cardiac arrhythmia, electrocardiogram, principal component analysis, XGBoost
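The PCA-then-classifier pipeline can be sketched as below. Note the stand-ins: synthetic feature vectors replace MIT-BIH ECG beats, and a plain logistic regression trained by gradient descent replaces the XGBoost model, since the point here is only the pipeline shape:

```python
import numpy as np

# Pipeline sketch only: synthetic "beat" vectors stand in for MIT-BIH ECG
# segments, and logistic regression stands in for the XGBoost classifier.
rng = np.random.default_rng(5)
n, d = 300, 40
y = rng.integers(0, 2, n)                        # 0 = normal, 1 = arrhythmic
X = rng.normal(size=(n, d)) + 2.0 * y[:, None]   # class-dependent shift

# PCA step: keep the top 5 components of the centered data
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
feats = Xc @ Vt[:5].T

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

# Classifier step: logistic regression fit by batch gradient descent
w = np.zeros(feats.shape[1])
b = 0.0
for _ in range(500):
    p = sigmoid(feats @ w + b)
    w -= 0.5 * feats.T @ (p - y) / n
    b -= 0.5 * (p - y).mean()

acc = ((sigmoid(feats @ w + b) > 0.5) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

In the paper's pipeline, the `feats` matrix would instead be fed to an XGBoost classifier, which handles nonlinear decision boundaries that this linear stand-in cannot.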
Procedia PDF Downloads 119
1675 Combining an Optimized Closed Principal Curve-Based Method and Evolutionary Neural Network for Ultrasound Prostate Segmentation
Authors: Tao Peng, Jing Zhao, Yanqing Xu, Jing Cai
Abstract:
Due to missing/ambiguous boundaries between the prostate and neighboring structures, the presence of shadow artifacts, as well as the large variability in prostate shapes, ultrasound prostate segmentation is challenging. To handle these issues, this paper develops a hybrid method for ultrasound prostate segmentation by combining an optimized closed principal curve-based method and the evolutionary neural network; the former can fit curves with great curvature and generate a contour composed of line segments connected by sorted vertices, and the latter is used to express an appropriate map function (represented by parameters of evolutionary neural network) for generating the smooth prostate contour to match the ground truth contour. Both qualitative and quantitative experimental results showed that our proposed method obtains accurate and robust performances.
Keywords: ultrasound prostate segmentation, optimized closed polygonal segment method, evolutionary neural network, smooth mathematical model, principal curve
Procedia PDF Downloads 202
1674 Anisotropic Shear Strength of Sand Containing Plastic Fine Materials
Authors: Alaa H. J. Al-Rkaby, A. Chegenizadeh, H. R. Nikraz
Abstract:
Anisotropy is one of the major aspects that affect soil behavior, and extensive efforts have investigated its effect on the mechanical properties of soil. However, very little attention has been given to the combined effect of anisotropy and fine content. Therefore, in this paper, the anisotropic strength of sand containing different fine contents (F) of 5%, 10%, 15%, and 20% was investigated using hollow cylinder tests under different principal stress directions of α = 0° and α = 90°. For a given principal stress direction (α), it was found that increasing the fine content resulted in a decreasing deviator stress (q). Moreover, the results revealed that all fine contents showed anisotropic strength, with a clear difference between the strength at 0° and the strength at 90°. This anisotropy was greatest at F = 5%, while it decreased with increasing fine content, particularly at F = 10%. Mixtures with low fine content showed less contractive behavior and tended to dilate more. Moreover, all sand-clay mixtures exhibited less dilation and more compression at α = 90° than at α = 0°.
Keywords: anisotropy, principal stress direction, fine content, hollow cylinder sample
Procedia PDF Downloads 312
1673 Elastic Constants of Fir Wood Using Ultrasound and Compression Tests
Authors: Ergun Guntekin
Abstract:
The elastic constants of Fir wood (Abies cilicica) have been investigated by means of ultrasound and compression tests. Three moduli of elasticity in the principal directions (EL, ER, ET), six Poisson's ratios (ʋLR, ʋLT, ʋRT, ʋTR, ʋRL, ʋTL), and three shear moduli (GLR, GRT, GLT) were determined. Samples of 20 x 20 x 60 mm were conditioned at 65% relative humidity and 20ºC before testing. Three longitudinal and six shear wave velocities propagating along the principal axes of anisotropy, and additionally three quasi-shear wave velocities at a 45° angle with respect to the principal axes of anisotropy, were measured. Longitudinal sensors of 2.27 MHz and shear sensors of 1 MHz were used to obtain the sound velocities. Stress-strain curves of the samples in the compression tests were obtained using a bi-axial extensometer in order to calculate the elastic constants. The test results indicated that most of the elastic constants determined in the study are within the acceptable range. Although elastic constants determined from ultrasound are usually higher than those determined from compression tests, the values of EL and GLR determined from the compression tests were higher in this study. The results of this study can be used in the numerical modeling of elements or systems under load using Fir wood.
Keywords: compression tests, elastic constants, fir wood, ultrasound
Procedia PDF Downloads 217
1672 A Novel Rapid Well Control Technique Modelled in Computational Fluid Dynamics Software
Authors: Michael Williams
Abstract:
The ability to control a flowing well is of the utmost importance. During the kill phase, heavy-weight kill mud is circulated around the well. While this increases the bottom-hole pressure, it also increases damage to the near-wellbore formation. The addition of high-density spherical objects has the potential to minimise this near-wellbore damage, increase bottom-hole pressure, and reduce the operational time needed to kill the well. The time saving comes from the rapid deployment of high-density spherical objects instead of building a high-density drilling fluid. The research aims to model the well-kill process using computational fluid dynamics software. A model has been created as a proof of concept to analyse the flow of micron-sized spherical objects in the drilling fluid. Initial results show that this new methodology of spherical objects in drilling fluid agrees with the traditional streamlines seen in non-particle flow. Additional models demonstrate that areas of higher flow rate around the bit can increase the probability of washing out formations but do not affect the flow of the micron-sized spherical objects. Interestingly, areas that experience dimensional changes, such as tool joints and various BHA components, do not at this initial stage appear to experience increased velocity or create areas of turbulent flow, which could lead to improved borehole stability. In conclusion, the initial models of this novel well control methodology have not demonstrated any adverse flow patterns, suggesting that the model may be viable under field conditions.
Keywords: well control, fluid mechanics, safety, environment
Procedia PDF Downloads 171
1671 Robust and Real-Time Traffic Counting System
Authors: Hossam M. Moftah, Aboul Ella Hassanien
Abstract:
In recent years, the importance of automatic traffic control has increased due to traffic congestion, especially in big cities, for signal control and efficient traffic management. Traffic counting, as a kind of traffic control, is important for knowing the road traffic density in real time. This paper presents a fast and robust traffic counting system using different image processing techniques. The proposed system is composed of four fundamental building phases: image acquisition, pre-processing, object detection, and finally counting the connected objects. The object detection phase comprises five steps: subtracting the background, converting the image to binary, closing gaps and connecting nearby blobs, smoothing the image to remove noise and very small objects, and detecting the connected objects. Experimental results show the great success of the proposed approach.
Keywords: traffic counting, traffic management, image processing, object detection, computer vision
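The core of the object detection phase, background subtraction, binarization, and connected-component counting, can be sketched on a synthetic frame (the gap-closing and smoothing steps are noted but omitted):

```python
import numpy as np
from scipy import ndimage

# Counting sketch on a synthetic frame: background subtraction, binarization,
# and connected-component labelling. The two bright rectangles stand in for
# vehicles in a real camera frame.
background = np.zeros((60, 80))
frame = background.copy()
frame[10:20, 10:25] = 1.0        # synthetic "vehicle" 1
frame[35:45, 50:70] = 1.0        # synthetic "vehicle" 2

diff = np.abs(frame - background)        # subtract the background
binary = diff > 0.5                      # convert the image to binary
# A real pipeline would now close gaps and remove noise, e.g. with
# ndimage.binary_closing and a small median filter.

labels, count = ndimage.label(binary)    # detect and count connected objects
print(f"objects counted: {count}")
```

On real footage the omitted morphological steps matter: without gap closing, one vehicle can split into several blobs and inflate the count.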
Procedia PDF Downloads 294
1670 Composing Method of Decision-Making Function for Construction Management Using Active 4D/5D/6D Objects
Authors: Hyeon-Seung Kim, Sang-Mi Park, Sun-Ju Han, Leen-Seok Kang
Abstract:
As BIM (Building Information Modeling) applications continually expand, the visual simulation techniques used for facility design and construction process information are becoming increasingly advanced and diverse. For building structures, BIM application is design-oriented, utilizing 3D objects for conflict management, whereas for civil engineering structures, the usability of nD-object-oriented construction stage simulation is important in construction management. Simulations of 5D and 6D objects, in which cost and resources are linked along with the process simulation of 4D objects, are commonly used, but they do not provide a decision-making function for the process management problems that occur on site because they mostly focus on the visual representation of the current status of process information. In this study, an nD CAD system is constructed that facilitates an optimized schedule simulation that minimizes process conflict, a construction duration reduction simulation according to execution progress status, an optimized process plan simulation according to project cost changes by year, and an optimized resource simulation for field resource mobilization capability. Through this system, the usability of conventional simple simulation objects is expanded to that of active simulation objects with which decision-making is possible. Furthermore, to close the gap between field process situations and planned 4D process objects, a technique is developed to facilitate a comparative simulation through the coordinated synchronization of an actual video object acquired by an on-site web camera and a VR-concept 4D object. This synchronization and simulation technique can also be applied to smartphone video objects captured in the field in order to increase the usability of the 4D object.
Because yearly project costs change frequently in civil engineering construction, the annual process plan should be recomposed appropriately according to project cost decreases or increases relative to the plan. In the 5D CAD system presented in this study, an active 5D object utilization concept is introduced to perform a simulation in an optimized process planning state by finding a process optimized for the changed project cost, without changing the construction duration, through a technique such as a genetic algorithm. Furthermore, in resource management, an active 6D object utilization function is introduced that can analyze and simulate an optimized process plan within the possible scope of moving resources, by considering those resources that can be moved under a given field condition, instead of using a simple resource change simulation by schedule. The introduction of active BIM functions is expected to increase the field utilization of conventional nD objects.
Keywords: 4D, 5D, 6D, active BIM
Procedia PDF Downloads 276
1669 A Pervasive System Architecture for Smart Environments in Internet of Things Context
Authors: Patrick Santos, João Casal, João Santos Luis Varandas, Tiago Alves, Carlos Romeiro, Sérgio Lourenço
Abstract:
Nowadays, technology makes it possible, on the one hand, to communicate with various objects of daily life through the Internet and, on the other, to put these objects in interaction with each other through this channel. Simultaneously, with the rise of smartphones as the most ubiquitous technology in people's lives, new agents for these devices have emerged: Intelligent Personal Assistants. These agents aim to help users manage and organize their information, as well as to support them in their day-to-day tasks. Another emergent concept is Cloud Computing, which allows computation and storage to move off the users' devices, bringing benefits in terms of performance, security, interoperability, and more. Connecting these three paradigms, this work proposes an architecture for an intelligent system that provides an interface to assist the user in smart environments, informing, suggesting actions, and allowing the user to manage the objects of his or her daily life.
Keywords: internet of things, cloud, intelligent personal assistant, architecture
Procedia PDF Downloads 514
1668 Slave Museums and a Site of Democratic Pedagogy: Engagement, Healing and Tolerance
Authors: Elaine Stavro
Abstract:
In our present world, where acts of incivility, intolerance, and anger towards minority communities are on the rise, the ways in which museum practices cultivate ethical generosity are of interest. Democratic theorists differ as to how they believe respect can be generated through active participation. Allowing minority communities a role in determining which artifacts will be displayed, and how they will be displayed, has been an important step in generating respect. In addition, the rise of indigenous museums and slave museums, and of curators who represent these communities, contributes to the communication of their history of oppression. These institutional practices have been supplemented by the handling of objects, recognition stories, and multisensory exhibitions. Psychoanalytic object-relations theorists believe that the handling of objects, given amenable objects and responsive listeners, will trigger the expression of anomie, alienation, and traumatizing experiences. Not only memorializing loss but engaging with it in a very personal way can facilitate the process of mourning. Manchester Museum (UK) gathered together Somali refugees who, in the process of handling their own objects and those offered by the museum, began to tell their stories. Democratic theorists (especially affect theorists, vital materialists, and Actor Network theorists) believe that things can be social actants: material objects have agentic capacities that humans should align with. In doing so, they challenge the social constructivism that attributes power to interpreted things, but, like the constructivists, they assume that an openness or responsiveness to Otherness can be cultivated. Rich sensory experiences, corporeal engagement (devices that involve bodily movement or objects that involve handling), and auditory experiences (songs) all contribute to improving one's responsiveness and openness to Others.
This paper focuses specifically on slave museums and exhibits in the UK, the USA, and South Africa, exploring and evaluating their democratic strategies for cultivating tolerant practices via the various democratic avenues outlined above.
Keywords: democratic pedagogy, slave exhibitions, affect/emotion, object handling
Procedia PDF Downloads 4601667 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning
Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza
Abstract:
This work presents an approach to the autonomous navigation of mobile robots, implemented on the omnidirectional Kuka Youbot. We have integrated the Robot Operating System (ROS) with machine learning algorithms. Two ROS distributions are used: ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning, using statistical and probabilistic global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra's algorithm. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the planned trajectory and obstruct the path of the Kuka Youbot. Detection is managed by an artificial vision module built on a neural network trained with the Single Shot MultiBox Detector (SSD) architecture, where the main dynamic objects to detect are human beings and domestic animals, among others. When objects are detected, the system either modifies the trajectory or waits for the dynamic obstacle to move. Finally, the obstacles are excluded from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms. Keywords: autonomous navigation, machine learning, path planning, robot operating system, open source computer vision library
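The global planning step attributed to Dijkstra above can be sketched as a shortest-path search on an occupancy grid. The following is a minimal, hypothetical illustration; the grid, start, and goal below are invented for the example and are not taken from the Kuka Youbot system:

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (1 = obstacle).

    A sketch of a Dijkstra-style global planner; the map layout is
    illustrative only.
    """
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None  # goal unreachable
    # Walk the predecessor chain back from the goal.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# A 3x4 map with a wall blocking the direct route.
grid = [
    [0, 0, 1, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
]
path = dijkstra_grid(grid, (0, 0), (0, 3))
```

When a dynamic obstacle is detected, the corresponding grid cells would be marked occupied and the search rerun, which is the replanning behavior the abstract describes.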
Procedia PDF Downloads 1771666 Principal Component Regression in Amylose Content on the Malaysian Market Rice Grains Using Near Infrared Reflectance Spectroscopy
Authors: Syahira Ibrahim, Herlina Abdul Rahim
Abstract:
The amylose content is an essential element in determining the texture and taste of rice grains. This paper evaluates the use of VIS-SWNIRS in estimating the amylose content of seven varieties of rice grains available on the Malaysian market. Each variety consists of 30 samples, and all samples are scanned using the spectroscope to obtain spectra over the 680-1000 nm range. A Savitzky-Golay (SG) smoothing filter is applied to each sample's data before the Principal Component Regression (PCR) technique is used to examine the data and produce a single predicted value for each sample. This value is then compared with reference values obtained from the standard iodine colorimetric test in terms of the coefficient of determination, R2. Results show that this technique produced low R2 values of less than 0.50. To improve the results, the wavelength range should be extended to include 1100-2500 nm, and the number of samples processed should be increased. Keywords: amylose content, diffuse reflectance, Malaysian rice grains, principal component regression (PCR), visible and shortwave near-infrared spectroscopy (VIS-SWNIRS)
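The PCR step described above (project the spectra onto a few leading principal components, then regress the reference values on the scores) can be sketched in a few lines. This is an illustrative implementation on synthetic data; the paper's actual spectra, Savitzky-Golay parameters, and iodine-test reference values are not reproduced here:

```python
import numpy as np

def pcr_fit_predict(X_train, y_train, X_test, n_components=2):
    """Principal Component Regression: project the (centered) spectra onto
    the leading principal components, then least-squares fit the reference
    values on the scores.  Synthetic sketch, not the paper's pipeline.
    """
    mu = X_train.mean(axis=0)
    Xc = X_train - mu
    # Principal directions from the SVD of the centered data matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # loadings (wavelengths x components)
    T = Xc @ V                         # scores for the training samples
    # Ordinary least squares of y on the scores, with an intercept column.
    A = np.column_stack([np.ones(len(T)), T])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)
    T_test = (X_test - mu) @ V
    return np.column_stack([np.ones(len(T_test)), T_test]) @ coef

# Synthetic demo: "amylose" depends linearly on two latent spectral factors.
rng = np.random.default_rng(0)
scores = rng.normal(size=(40, 2))
loadings = rng.normal(size=(2, 50))
X = scores @ loadings + 0.01 * rng.normal(size=(40, 50))
y = 3.0 * scores[:, 0] - 1.5 * scores[:, 1] + 20.0
y_hat = pcr_fit_predict(X[:30], y[:30], X[30:], n_components=2)
```

Comparing `y_hat` against held-out reference values via R2 mirrors the evaluation the abstract reports.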
Procedia PDF Downloads 3821665 Scientific Theoretical Fundamentals of Comparative Analysis
Authors: Khalliyeva Gulnoz Iskandarovna, Mannonova Feruzabonu Sherali Qizi
Abstract:
A scientific field called comparative literature, or literary comparative studies, compares two or more literary phenomena. Today, when global social, cultural, and literary relations are growing daily, comparative literature is one of the most important scientific fields. Any comparative investigation reveals the shared and unique characteristics of literary phenomena, which provide the cornerstone for the creation of overarching theoretical principles that apply to all literature. Comparative analysis is built around its objects and their constituent parts. Beyond this, comparative analysis also focuses on comparing the components of the objects of analysis with each other. The purpose of this article is to investigate comparative analysis in literature and to identify similarities and differences between comparable objects. When studying this topic, students, teachers, and researchers should be able to describe comparative research techniques and their fundamental ideas; they should also have a basic understanding of comparative literature and be able to summarize it. Keywords: object, natural, social, spiritual, epistemological, logical, methodological, axiological tasks, stages of comparison, environment, internal features, and typical situations
Procedia PDF Downloads 591664 Microbial Deterioration of Some Different Archaeological Objects Made from Cellulose by Bacillus Group
Authors: Mohammad Abdel Fattah Mohammad Kewisha
Abstract:
The microbial deterioration of ancient materials has become one of the biggest problems facing workers in the field of cultural heritage protection, because the microbial deterioration of artifacts causes detrimental effects on the aesthetic value of monuments through colonization, whether they are made of inorganic materials such as stone or of organic ones such as wood, textiles, wall paintings, and paper. Early identification of the bacterial strains causing the deterioration is therefore the most important step in the protection of monument objects. The present study focuses on the Bacillus spp. group, which was isolated from deteriorated monuments from different areas of Egypt. The objects investigated in this study were made from organic (cellulosic) materials: paper, textiles, and wood. Isolated strains were identified to the species level biochemically, and eleven bacterial isolates were obtained from the samples taken from the different archaeological objects. Four microbicides (cetrimonium bromide, sodium azide, tetraethyl ammonium bromide, and dichloroxylenol), at concentrations ranging from 25 ppm to 500 ppm, were screened for their antibacterial activity against the Bacillus spp. isolates, and the minimum inhibitory concentration (MIC) of each was determined. It was also necessary to indicate the ideal minimum inhibitory concentration for each strain for the purpose of biotreating the infected monuments with the least damaging effect on the monument materials. Keywords: microbial deterioration, ancient materials, heritage protection, protection of monuments, biodeteriorated monuments
Procedia PDF Downloads 601663 Variability Studies of Seyfert Galaxies Using Sloan Digital Sky Survey and Wide-Field Infrared Survey Explorer Observations
Authors: Ayesha Anjum, Arbaz Basha
Abstract:
Active Galactic Nuclei (AGN) are the actively accreting centers of galaxies that host supermassive black holes. AGN emit radiation at all wavelengths and also show variability across all wavelength bands. The analysis of flux variability tells us about the morphology of the region emitting the radiation. Some of the major classes of AGN are (a) blazars, with featureless spectra, subclassified into BL Lacertae objects, Flat Spectrum Radio Quasars (FSRQs), and others; (b) Seyferts, with prominent emission-line features, classified into broad-line and narrow-line Seyferts of Type 1 and Type 2; and (c) quasars, among other types. The Sloan Digital Sky Survey (SDSS) is an optical telescope based in New Mexico, USA, that has observed and classified billions of objects using automated photometric and spectroscopic methods. A sample of blazars is obtained from the third Fermi catalog. For the variability analysis, we searched for light curves for these objects in the Wide-Field Infrared Survey Explorer (WISE) and Near-Earth Object WISE (NEOWISE) surveys in two bands, W1 (3.4 microns) and W2 (4.6 microns), reducing the final sample to 256 objects. These objects are classified into 155 BL Lacs, 99 FSRQs, and 2 narrow-line Seyferts, namely PMN J0948+0022 and PKS 1502+036. Mid-infrared variability studies of these objects would be a contribution to the literature. With this as motivation, the present work is focused on studying the final sample of 256 objects in general and the Seyferts in particular. Because the classification is automated, SDSS has misclassified some of these objects among quasars, galaxies, and stars; the reasons for the misclassification are explained in this work. The variability analysis of these objects is done using the methods of flux amplitude variability and excess variance. The sample consists of observations in both the W1 and W2 bands. PMN J0948+0022 is observed between MJD 57154.79 and 58810.57. 
PKS 1502+036 is observed between MJD 57232.42 and 58517.11, which amounts to a period of over six years of observations. The data is divided into epochs spanning not more than 1.2 days. In all the epochs, the sources are found to be variable in both the W1 and W2 bands, confirming that these objects are variable at mid-infrared wavelengths on both long and short timescales. The sources are also examined for color variability: objects show either a bluer-when-brighter (BWB) or a redder-when-brighter (RWB) trend. A possible explanation for the present objects being BWB is that the longer-wavelength radiation emitted by the source can be suppressed by the high-energy radiation from the central source. Another result is that the smallest radius of the emission region is about one light-day, since the epoch span used in this work is one day. The masses of the black holes at the centers of these sources are found to be less than or equal to 10^8 solar masses. Keywords: active galaxies, variability, Seyfert galaxies, SDSS, WISE
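The excess-variance statistic mentioned above is commonly computed as the sample variance of the light curve minus the mean squared measurement error, normalized by the squared mean flux. A minimal sketch, with an invented toy light curve standing in for the WISE W1/W2 photometry:

```python
import numpy as np

def normalized_excess_variance(flux, flux_err):
    """Normalized excess variance of a light curve: the sample variance
    minus the mean squared measurement error, scaled by the squared mean
    flux.  A positive value indicates variability beyond the noise level.
    """
    mean = flux.mean()
    s2 = flux.var(ddof=1)            # sample variance of the fluxes
    mse = np.mean(flux_err ** 2)     # mean squared measurement error
    return (s2 - mse) / mean ** 2

# Toy light curve: a mildly variable source with small, uniform errors.
flux = np.array([10.0, 10.4, 9.7, 10.2, 9.9, 10.6, 9.5, 10.1])
err = np.full(flux.size, 0.1)
nxs = normalized_excess_variance(flux, err)
```

The fractional variability often quoted alongside it is simply the square root of this quantity when it is positive.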
Procedia PDF Downloads 1291662 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects
Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa
Abstract:
This work presents a numerical implementation of a method, named the fractioning estimator, for estimating the box-counting dimension of self-avoiding curves in the plane: fractal objects captured in digital images. Classical digital image processing methods, such as noise filtering, contrast manipulation, and thresholding, among others, are used to obtain binary images suitable for performing the computations required by the fractioning estimator. A user interface is developed for performing the image processing operations and for testing the fractioning estimator on different captured images of real-life fractal objects. To analyze the results, the estimates obtained through the fractioning estimator are compared with those obtained through other methods already implemented in available software for computing and estimating the box-counting dimension. Keywords: box-counting, digital image processing, fractal dimension, numerical method
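For reference, the classical box-counting estimate that the fractioning estimator is compared against proceeds by counting occupied boxes at several scales and fitting a line in log-log space. A minimal sketch on a synthetic binary image; the preprocessing pipeline described above is assumed to have already produced the binary input:

```python
import numpy as np

def box_counting_dimension(binary, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a square binary image by
    counting boxes of side s that contain foreground pixels, then fitting
    log N(s) against log(1/s).  Minimal sketch of the classical estimator.
    """
    counts = []
    n = binary.shape[0]
    for s in sizes:
        occupied = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if binary[i:i + s, j:j + s].any():
                    occupied += 1
        counts.append(occupied)
    # The slope of log N vs. log(1/s) is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# A straight diagonal line should yield a dimension close to 1.
img = np.eye(64, dtype=bool)
dim = box_counting_dimension(img)
```

Running the same routine on a filled square gives a value near 2, which is a quick sanity check before applying it to captured fractal curves.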
Procedia PDF Downloads 831661 3D Text Toys: Creative Approach to Experiential and Immersive Learning for World Literacy
Authors: Azyz Sharafy
Abstract:
3D Text Toys is an innovative and creative approach that utilizes 3D text objects to enhance creativity, literacy, and basic learning in an enjoyable and gamified manner. By using 3D Text Toys, children can develop their creativity, visually learn words and texts, and apply their artistic talents within their creative abilities. This process incorporates haptic engagement with 2D and 3D texts, word building, and mechanical construction of everyday objects, thereby facilitating better word and text retention. The concept involves constructing visual objects made entirely out of 3D text/words, where each component of the object represents a word or text element. For instance, a bird can be recreated using words or text shaped like its wings, beak, legs, head, and body, resulting in a 3D representation of the bird purely composed of text. This can serve as an art piece or a learning tool in the form of a 3D text toy. These 3D text objects or toys can be crafted using natural materials such as leaves, twigs, strings, or ropes, or they can be made from various physical materials using traditional crafting tools. Digital versions of these objects can be created using 2D or 3D software on devices like phones, laptops, iPads, or computers. To transform digital designs into physical objects, computerized machines such as CNC routers, laser cutters, and 3D printers can be utilized. Once the parts are printed or cut out, students can assemble the 3D texts by gluing them together, resulting in natural or everyday 3D text objects. These objects can be painted to create artistic pieces or text toys, and the addition of wheels can transform them into moving toys. One of the significant advantages of this visual and creative object-based learning process is that students not only learn words but also derive enjoyment from the process of creating, painting, and playing with these objects. The ownership and creation process further enhances comprehension and word retention. 
Moreover, for individuals with learning disabilities such as dyslexia, ADD (Attention Deficit Disorder), or other learning difficulties, the visual and haptic approach of 3D Text Toys can serve as an additional creative and personalized learning aid. The application of 3D Text Toys extends to both the English language and any other global written language. The adaptation and creative application may vary depending on the country, space, and native written language. Furthermore, the implementation of this visual and haptic learning tool can be tailored to teach foreign languages based on age level and comprehension requirements. In summary, this creative, haptic, and visual approach has the potential to serve as a global literacy tool.Keywords: 3D text toys, creative, artistic, visual learning for world literacy
Procedia PDF Downloads 641660 Video Foreground Detection Based on Adaptive Mixture Gaussian Model for Video Surveillance Systems
Authors: M. A. Alavianmehr, A. Tashk, A. Sodagaran
Abstract:
Modeling the background and moving objects is a significant technique for video surveillance and other video processing applications. This paper presents a foreground detection algorithm, based on an adaptive Gaussian mixture model (GMM), that is robust against illumination changes and noise, and that provides a novel and practical choice for intelligent video surveillance systems using static cameras. In previous methods, the image of still objects (the background image) is given little significance. By contrast, this method is based on forming a meticulous background image and exploiting it to separate moving objects from their background. The background image is specified either manually, by taking an image without vehicles, or detected in real time by forming an arithmetic or exponential average of successive images. The proposed scheme offers low image degradation, and the simulation results demonstrate a high degree of performance for the proposed method. Keywords: image processing, background models, video surveillance, foreground detection, Gaussian mixture model
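Of the two background-formation strategies mentioned above, the exponential running average is the simplest to sketch. The following toy example maintains such a background and flags pixels that deviate from it as foreground; the frames and parameter values are invented, and the paper's full adaptive GMM (with multiple Gaussians per pixel) is not reproduced here:

```python
import numpy as np

class RunningAverageBackground:
    """Exponential running-average background model, the simple variant
    the abstract mentions alongside the mixture-of-Gaussians model.

    Each new grayscale frame updates the background as
    B <- (1 - alpha) * B + alpha * frame, and pixels farther than
    `threshold` from B are flagged as foreground.  Values are illustrative.
    """

    def __init__(self, first_frame, alpha=0.05, threshold=25.0):
        self.bg = first_frame.astype(float)
        self.alpha = alpha
        self.threshold = threshold

    def apply(self, frame):
        frame = frame.astype(float)
        mask = np.abs(frame - self.bg) > self.threshold  # foreground pixels
        # Update the background only where the scene looks static, so that
        # moving objects do not bleed into the background model.
        self.bg = np.where(
            mask, self.bg,
            (1 - self.alpha) * self.bg + self.alpha * frame)
        return mask

# Static 4x4 scene; a bright "object" enters two pixels in the last frame.
frames = [np.full((4, 4), 100.0) for _ in range(10)]
moving = np.full((4, 4), 100.0)
moving[1, 1] = moving[1, 2] = 200.0
frames.append(moving)

model = RunningAverageBackground(frames[0])
for f in frames[1:-1]:
    model.apply(f)
final_mask = model.apply(frames[-1])
```

A full GMM replaces the single running mean per pixel with several weighted Gaussians, which is what gives the paper's method its robustness to illumination changes.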
Procedia PDF Downloads 5161659 The Effect of Object Presentation on Action Memory in School-Aged Children
Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf
Abstract:
Enacted tasks are typically remembered better than when the same task materials are only verbally encoded, a robust finding referred to as the enactment effect. It has been assumed that the enactment effect is independent of object presence, but that the size of the effect can be increased by providing objects at the study phase, at least in adults. To clarify these issues in children, free-recall and cued-recall performance for action phrases, with or without real objects, was compared in 410 school-aged children from four age groups (8, 10, 12, and 14 years old). In this study, subjects were instructed to learn a series of action phrases under three encoding conditions: participants listened to verbal action phrases (VTs: verbal tasks), performed the phrases themselves (SPTs: subject-performed tasks), or observed the experimenter perform the phrases (EPTs: experimenter-performed tasks). Then, free-recall and cued-recall memory tests were administered. The results revealed that real objects, compared with imaginary objects, improved recall performance in SPTs and EPTs, but even more so in VTs. It was also found that object presence was not necessary for the occurrence of the enactment effect, but that it changed the size of the effect in all age groups. The size of the enactment effect was more pronounced for imaginary objects than for real objects in both the free-recall and cued-recall memory tests. It is discussed that SPTs and EPTs differentially facilitate item-specific and relational information processing, and that providing objects can moderate the processing underlying the encoding conditions.Keywords: action memory, enactment effect, item-specific processing, object, relational processing, school-aged children
Procedia PDF Downloads 2381658 Fuzzy-Machine Learning Models for the Prediction of Fire Outbreak: A Comparative Analysis
Authors: Uduak Umoh, Imo Eyoh, Emmauel Nyoho
Abstract:
This paper compares fuzzy-machine learning algorithms, Support Vector Machine (SVM) and K-Nearest Neighbour (KNN), for predicting cases of fire outbreak. The paper uses a fire outbreak dataset with three features (temperature, smoke, and flame). The data is pre-processed using an Interval Type-2 Fuzzy Logic (IT2FL) algorithm, min-max normalization, and Principal Component Analysis (PCA), which are used to predict feature labels in the dataset, normalize the dataset, and select relevant features, respectively. The output of the pre-processing is a dataset with two principal components (PC1 and PC2). The pre-processed dataset is then used to train the aforementioned machine learning models. K-fold cross-validation (with K=10) is used to evaluate the performance of the models using the metrics ROC (Receiver Operating Characteristic curve), specificity, and sensitivity. The models are also tested on 20% of the dataset. The validation results show that KNN is the better model for fire outbreak detection, with an ROC value of 0.99878, followed by SVM with an ROC value of 0.99753. Keywords: machine learning algorithms, interval type-2 fuzzy logic, fire outbreak, support vector machine, K-nearest neighbour, principal component analysis
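The KNN classifier that performed best above can be sketched with plain NumPy. The two toy features below stand in for the principal components PC1 and PC2; the actual fire-outbreak dataset and the IT2FL pre-processing are not reproduced here:

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=3):
    """Minimal K-Nearest Neighbour classifier: each test point takes the
    majority label of its k closest training points (Euclidean distance).
    """
    preds = []
    for x in X_test:
        d = np.linalg.norm(X_train - x, axis=1)   # distances to all points
        nearest = y_train[np.argsort(d)[:k]]      # labels of the k nearest
        labels, counts = np.unique(nearest, return_counts=True)
        preds.append(labels[np.argmax(counts)])   # majority vote
    return np.array(preds)

# Two well-separated clusters in (PC1, PC2) space: 0 = "no fire", 1 = "fire".
rng = np.random.default_rng(42)
X0 = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(20, 2))
X1 = rng.normal(loc=[3.0, 3.0], scale=0.3, size=(20, 2))
X_train = np.vstack([X0, X1])
y_train = np.array([0] * 20 + [1] * 20)
X_test = np.array([[0.1, -0.1], [2.9, 3.2]])
pred = knn_predict(X_train, y_train, X_test, k=3)
```

Wrapping such a predictor in a 10-fold cross-validation loop and scoring sensitivity and specificity per fold mirrors the evaluation protocol the abstract describes.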
Procedia PDF Downloads 182