Search results for: data mining applications and discovery
25883 Consumer Values in the Perspective of Javanese Mataraman Society: Identification, Meaning, and Application
Authors: Anna Triwijayati, Etsa Astridya Setiyati, Titik Desi Harsoyo
Abstract:
Culture is an important determinant of human behavior and desire. Culture influences consumers through the norms and values established by the society in which they live and which they reflect. The cultural values of Javanese society are certainly embedded in the consumption behavior of the Javanese. This research is expected to yield substantial theoretical benefits through its findings on cultural values in consumption in Javanese society. These findings can serve as an incentive for identifying local cultural values among the many ethnic groups in Indonesia, so that, in time, Indonesian local cultural values concerning consumption can become a fundamental part of consumer education and practice in Indonesia. The approach used in this research is non-positivist, also known as the qualitative approach, and the method is ethnomethodology. Data collection was carried out in the Central Java region. The research subjects, or informants, were determined purposively according to criteria set by the researcher. The data were collected through in-depth interviews and observation. Before analyzing the data, the researcher carried out data-storage procedures and applied data-validity checks. The data were then analyzed using thematic and interactive analysis techniques. The Javanese Mataraman society holds consumption values such as sufficiency, caution, thrift, submission to the Creator of life, going with the flow of life, and dealing with present problems in the present. In managing finances for consumption, consumers should live by simple principles: be content with enough, be able to eat, exercise self-restraint, be well-organized, diligent, precise and careful, manage money openly and transparently, keep striving, be willing to make sacrifices, and think about the future. The meaning of consumption values in the family is centered on submission and full trust in God. These consumption values are applied in consumer behavior regarding the self, the family, investment, and credit needs, in both short-term and long-term perspectives.
Keywords: values, consumer, consumption, Javanese Mataraman, ethnomethodology
Procedia PDF Downloads 395
25882 Parallel Fuzzy Rough Support Vector Machine for Data Classification in Cloud Environment
Authors: Arindam Chaudhuri
Abstract:
Classification of data has been actively used as one of the most effective and efficient means of conveying knowledge and information to users. The primary focus has always been on techniques for extracting useful knowledge from data such that returns are maximized. With the emergence of huge datasets, existing classification techniques often fail to produce desirable results. The challenge lies in analyzing and understanding the characteristics of massive data sets by retrieving useful geometric and statistical patterns. We propose a supervised parallel fuzzy rough support vector machine (PFRSVM) for data classification in a cloud environment. The classification is performed by PFRSVM using a hyperbolic tangent kernel. The fuzzy rough set model takes care of sensitivity to noisy samples and handles impreciseness in training samples, bringing robustness to the results. The membership function is a function of the center and radius of each class in feature space and is represented with a kernel; it plays an important role in sampling the decision surface. The success of PFRSVM is governed by choosing appropriate parameter values. The training samples are either linearly or nonlinearly separable, and different input points make unique contributions to the decision surface. The algorithm is parallelized with a view to reducing training times. The system is built on a support vector machine library using the Hadoop implementation of MapReduce. The algorithm is tested on large data sets to check its feasibility and convergence, and the performance of the classifier is also assessed in terms of the number of support vectors. The challenges encountered in implementing big data classification in machine learning frameworks are also discussed. The experiments are run on the cloud environment available at the University of Technology and Management, India, and the results are illustrated for Gaussian RBF and Bayesian kernels. The effect of variability in prediction and generalization of PFRSVM is examined with respect to values of the parameter C. It effectively resolves outlier effects, imbalance, and overlapping-class problems, generalizes to unseen data, and relaxes the dependency between features and labels. The average classification accuracy for PFRSVM is better than that of other classifiers for both Gaussian RBF and Bayesian kernels. The experimental results on both synthetic and real data sets clearly demonstrate the superiority of the proposed technique.
Keywords: FRSVM, Hadoop, MapReduce, PFRSVM
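Example (Python): a minimal single-machine sketch of the fuzzy-membership idea behind FRSVM, not the authors' parallel Hadoop implementation; the membership formula and all parameter values here are assumptions.

import numpy as np
from sklearn.svm import SVC

def fuzzy_memberships(X, y, delta=1e-6):
    # Membership decays with distance from the class center, scaled by
    # the class radius, so noisy/outlying samples carry less weight.
    m = np.empty(len(y), dtype=float)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        center = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - center, axis=1)
        m[idx] = 1.0 - d / (d.max() + delta)
    return m

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.3 * rng.normal(size=200) > 0).astype(int)
clf = SVC(kernel='sigmoid', C=10.0)            # hyperbolic tangent kernel
clf.fit(X, y, sample_weight=fuzzy_memberships(X, y))
print('support vectors per class:', clf.n_support_)

Passing the memberships as sample weights lets outliers influence the decision surface less, which is the robustness effect the abstract attributes to the fuzzy rough set model.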
Procedia PDF Downloads 495
25881 Synthesis and Thermoluminescence Investigations of Doped LiF Nanophosphor
Authors: Pooja Seth, Shruti Aggarwal
Abstract:
Thermoluminescence dosimetry (TLD) is one of the most effective methods for the assessment of dose during diagnostic radiology and radiotherapy applications. In these applications, monitoring of absorbed dose is essential to prevent patients from undue exposure and to evaluate the risks that may arise due to exposure. LiF-based thermoluminescence (TL) dosimeters are promising materials for the estimation, calibration, and monitoring of dose due to their favourable dosimetric characteristics, such as tissue equivalence, high sensitivity, energy independence, and dose linearity. As the TL efficiency of a phosphor strongly depends on the preparation route, it is interesting to investigate the TL properties of a LiF-based phosphor in nanocrystalline form. LiF doped with magnesium (Mg), copper (Cu), sodium (Na), and silicon (Si) in nanocrystalline form has been prepared using the chemical co-precipitation method. Cube-shaped LiF nanostructures are formed. TL dosimetry properties have been investigated by exposing the material to gamma rays. The TL glow curve of the nanocrystalline form consists of a single peak at 419 K, as compared to the multiple peaks observed in the microcrystalline form. A consistent glow curve structure with maximum TL intensity at an annealing temperature of 573 K and a linear dose response from 0.1 to 1000 Gy is observed, which is advantageous for radiotherapy applications. Good reusability, low fading (5% over a month), and negligible residual signal (0.0019%) are observed. Photoluminescence measurements show a wide emission band at 360-550 nm in undoped LiF, whereas an intense peak at 488 nm is observed in the doped LiF nanophosphor. The phosphor also exhibits intense optically stimulated luminescence. The nanocrystalline LiF: Mg, Cu, Na, Si phosphor prepared by the co-precipitation method showed a simple glow curve structure, linear dose response, reproducibility, negligible residual signal, good thermal stability, and low fading, and has tremendous potential in diagnostic radiology, radiotherapy, and high-energy radiation applications.
Keywords: thermoluminescence, nanophosphor, optically stimulated luminescence, co-precipitation method
Procedia PDF Downloads 407
25880 Modified CUSUM Algorithm for Gradual Change Detection in a Time Series Data
Authors: Victoria Siriaki Jorry, I. S. Mbalawata, Hayong Shin
Abstract:
The main objective in a change detection problem is to develop algorithms for the efficient detection of gradual and/or abrupt changes in the parameter distribution of a process or time series data. In this paper, we present a modified cumulative sum (MCUSUM) algorithm to detect the start and end of a time-varying linear drift in the mean value of time series data, based on a likelihood ratio test procedure. The design, implementation, and performance of the proposed algorithm for linear drift detection are evaluated and compared to the existing CUSUM algorithm using different performance measures, and an approach to accurately approximate the threshold of the MCUSUM is also provided. The performance of the MCUSUM for gradual change-point detection is compared to that of the standard cumulative sum (CUSUM) control chart, designed for abrupt shift detection, using Monte Carlo simulations. In terms of the expected time to detection, the MCUSUM procedure is found to perform better than a standard CUSUM chart in detecting a gradual change in mean. The algorithm is then applied to randomly generated time series data with a gradual linear trend in mean to demonstrate its usefulness.
Keywords: average run length, CUSUM control chart, gradual change detection, likelihood ratio test
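For reference, a minimal Python sketch of the standard two-sided tabular CUSUM that serves as the baseline here; the allowance k, threshold h, and the simulated drift are illustrative assumptions, and the paper's modified statistic for time-varying drift is not reproduced.

import numpy as np

def cusum_alarm(x, mu0, k=0.5, h=5.0):
    # Tabular CUSUM: k is the allowance (slack) and h the decision
    # threshold, both in units of the process standard deviation.
    s_hi = s_lo = 0.0
    for t, xt in enumerate(x):
        s_hi = max(0.0, s_hi + (xt - mu0) - k)
        s_lo = max(0.0, s_lo - (xt - mu0) - k)
        if s_hi > h or s_lo > h:
            return t          # index of the first alarm
    return None

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 300),
                    rng.normal(0, 1, 200) + 0.02 * np.arange(200)])  # gradual drift
print('alarm at sample:', cusum_alarm(x, mu0=0.0))

Smaller values of k make the chart more sensitive to slow drifts, which is the regime the MCUSUM is designed for.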
Procedia PDF Downloads 304
25879 Contextual Toxicity Detection with Data Augmentation
Authors: Julia Ive, Lucia Specia
Abstract:
Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use "toxicity" as an umbrella term to denote a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These studies have analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case, previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swearing, racist slurs, etc.), so that context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contain labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases), or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). Regarding the contextual detection models, we posit that their poor performance is due to limitations both in the data they are trained on (the same problems stated above) and in the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are either non-contextual or contextual but agnostic of the conversation structure.
Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing
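To make the architectural idea concrete, here is a minimal Python (PyTorch) sketch of a conversation-structure-aware classifier: a token-level encoder summarizes each utterance, and an utterance-level encoder summarizes the thread. This illustrates the architecture class only; it is not the authors' exact model, and all layer sizes are arbitrary assumptions.

import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    # A token-level GRU encodes each utterance; an utterance-level GRU
    # encodes the thread, so the target tweet is classified in context.
    def __init__(self, vocab_size, emb_dim=100, hid=64):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.utt_enc = nn.GRU(emb_dim, hid, batch_first=True)
        self.ctx_enc = nn.GRU(hid, hid, batch_first=True)
        self.head = nn.Linear(hid, 2)          # toxic vs. non-toxic

    def forward(self, thread):
        # thread: (batch, n_utterances, n_tokens) token ids, with the
        # target tweet as the last utterance in the thread.
        b, n, t = thread.shape
        tokens = self.emb(thread.view(b * n, t))
        _, utt_vec = self.utt_enc(tokens)               # (1, b*n, hid)
        utt_vec = utt_vec.squeeze(0).view(b, n, -1)
        _, ctx_vec = self.ctx_enc(utt_vec)              # thread summary
        return self.head(ctx_vec.squeeze(0))

model = HierarchicalToxicityClassifier(vocab_size=5000)
dummy = torch.randint(1, 5000, (8, 4, 20))   # 8 threads of 4 tweets each
print(model(dummy).shape)                    # torch.Size([8, 2])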
Procedia PDF Downloads 176
25878 AMBICOM: An Ambient Computing Middleware Architecture for Heterogeneous Environments
Authors: Ekrem Aksoy, Nihat Adar, Selçuk Canbek
Abstract:
Ambient Computing, or Ambient Intelligence (AmI), is an emerging area in computer science aiming to create intelligently connected environments and the Internet of Things. In this paper, we propose a communication middleware architecture for AmI. This middleware architecture addresses the problems of communication, networking, and abstraction of applications, although there are other aspects (e.g., HCI and security) within the general AmI framework. Within this middleware architecture, any application developer can address HCI and security issues through the extensibility features of the platform.
Keywords: AmI, ambient computing, middleware, distributed systems, software-defined networking
Procedia PDF Downloads 292
25877 Application of Microbially Induced Calcite Precipitation Technology in Construction Materials: A Comprehensive Review of Waste Stream Contributions
Authors: Amir Sina Fouladi, Arul Arulrajah, Jian Chu, Suksun Horpibulsuk
Abstract:
Waste generation is a growing concern in many countries across the world, particularly in urban areas with high rates of population growth and industrialization. The increasing amount of waste generated by human activities has led to environmental, economic, and health issues. Improper disposal of waste can result in air and water pollution, land degradation, and the spread of diseases. Waste generation also consumes large amounts of natural resources and energy, leading to the depletion of valuable resources and contributing to greenhouse gas emissions. To address these concerns, there is a need for sustainable waste management practices that reduce waste generation and promote resource recovery and recycling. Among these, developing innovative technologies such as Microbially Induced Calcite Precipitation (MICP) in construction materials is an effective approach to transforming waste into valuable and sustainable applications. MICP is an environmentally friendly microbial-chemical technology that applies microorganisms and chemical reagents in biological processes to produce carbonate minerals. It can be an energy-efficient, cost-effective, and sustainable solution to environmental and engineering challenges. Recent research has shown that waste streams can replace several MICP chemical components in the cultivation media of microorganisms and in the cementation reagents (calcium sources and urea). In addition to its effectiveness in treating hazardous waste streams, MICP has been found to be a cost-effective and sustainable solution applicable to various waste media. This comprehensive review paper aims to provide a thorough understanding of the environmental advantages and engineering applications of MICP technology, with a focus on the contribution of waste streams. It also provides researchers with guidance on how to identify and overcome the challenges that may arise when applying MICP technology using waste streams.
Keywords: waste stream, microbially induced calcite precipitation, construction materials, sustainability
Procedia PDF Downloads 82
25876 Osteoarthritis (OA): A Total Knee Replacement Surgery
Authors: Loveneet Kaur
Abstract:
Introduction: Osteoarthritis (OA) is one of the leading causes of disability, and the knee is the most commonly affected joint in the body. The last resort for the treatment of knee OA is Total Knee Replacement (TKR) surgery. Despite numerous advances in prosthetic design, patients do not regain normal function after surgery, and current surgical decisions are made on 2D radiographs and patient interviews. Aims: The aim of this study was to compare knee kinematics pre- and post-TKR surgery using computer-animated images of patient-specific models under everyday conditions. Methods: 7 subjects were recruited for the study. Subjects underwent 3D gait analysis during 4 everyday activities and medical imaging of the knee joint pre-surgery and one month post-surgery. A 3D model was created from each of the scans, and the kinematic gait analysis data were used to animate the images. Results: Improvements were seen in range of motion in all 4 activities 1 year post-surgery. The preoperative 3D images provide detailed information on the anatomy of the osteoarthritic knee, while the postoperative images demonstrate potential future problems associated with the implant. Although not accurate enough to be of clinical use, the animated data can provide valuable insight into the conditions that cause damage to both the osteoarthritic and the prosthetic knee joint. As the animated data do not require specialist training to view, the images can be utilized across the fields of health professionals and manufacturing in the assessment and treatment of patients pre- and post-knee replacement surgery. Future improvements in the collection and processing of data may yield clinically useful results. Conclusion: Although not yet of clinical use, the potential application of 3D animations of the knee joint pre- and post-surgery is widespread.
Keywords: osteoporosis, osteoarthritis, knee replacement, TKR
Procedia PDF Downloads 56
25875 Study of the Influence of Refractory Nitride Additives on Hydrogen Storage Properties of Ti6Al4V-Based Materials Produced by Spark Plasma Sintering
Authors: John Olorunfemi Abe, Olawale Muhammed Popoola, Abimbola Patricia Idowu Popoola
Abstract:
Hydrogen is an appealing alternative to fossil fuels because of its abundance, low weight, high energy density, and relative lack of contaminants. However, its low density presents a number of storage challenges. This work therefore studies the influence of refractory nitride additives, consisting of 5 wt. % each of hexagonal boron nitride (h-BN), titanium nitride (TiN), and aluminum nitride (AlN), on the hydrogen storage and electrochemical characteristics of Ti6Al4V-based materials produced by spark plasma sintering. The microstructure and phase constituents of the sintered materials were characterized using scanning electron microscopy (in conjunction with energy-dispersive spectroscopy) and X-ray diffraction, respectively. Pressure-composition-temperature (PCT) measurements were used to assess the hydrogen absorption/desorption behavior, kinetics, and storage capacities of the sintered materials. The pure Ti6Al4V alloy displayed a two-phase (α+β) microstructure, while the modified composites exhibited apparent microstructural modifications with the appearance of nitride-rich secondary phases. It is found that the diffusion process controls the kinetics of hydrogen absorption; thus, a faster rate of hydrogen absorption ensued at elevated temperatures. The additives acted as catalysts, lowering the activation energy and accelerating the rate of hydrogen sorption in the composites relative to the monolithic alloy. Ti6Al4V-5 wt. % h-BN appears to be the most promising candidate for hydrogen storage (2.28 wt. %), followed by Ti6Al4V-5 wt. % TiN (2.09 wt. %), whereas Ti6Al4V-5 wt. % AlN shows the poorest hydrogen storage performance (1.35 wt. %). Accordingly, the developed hydride system (Ti6Al4V-5h-BN) may be competitive for use in applications involving short-range continuous vehicles (~50-100 km) as well as stationary applications such as electrochemical devices, large-scale storage cylinders at hydrogen production sites, and hydrogen filling stations.
Keywords: hydrogen storage, Ti6Al4V hydride system, pressure-composition-temperature measurements, refractory nitride additives, spark plasma sintering, Ti6Al4V-based materials
Procedia PDF Downloads 80
25874 Improving the Gain of a Multiband Antenna by Adding an Artificial Magnetic Conductor Metasurface
Authors: Amira Bousselmi
Abstract:
This article presents a PIFA antenna designed for geolocation (GNSS) applications, operating at 1.278 GHz, 2.8 GHz, 5.7 GHz, and 10 GHz. To improve the performance of the antenna, an artificial magnetic conductor (AMC) structure was used. Backing the antenna with the AMC resulted in a measured gain of 4.78 dBi. The results of simulations and measurements are presented; CST Microwave Studio is used to design the antenna and compare its performance. The antenna design methodology and the design and characterization of the AMC surface are described, and the simulated and measured performances of the AMC-backed antenna are then discussed.
Keywords: multiband antenna, global navigation satellite system, AMC, Galileo
Procedia PDF Downloads 80
25873 Diversifying from Petroleum Products to Arable Farming as Source of Revenue Generation in Nigeria: A Case Study of Ondo West Local Government
Authors: A. S. Akinbani
Abstract:
Overdependence on petroleum is causing setbacks in the Nigerian economy. A field survey was carried out to assess the profitability and production of selected arable crops in six selected towns and villages of Ondo West, southwestern Nigeria. Data were collected from 240 arable crop farmers with the aid of both primary and secondary sources, through oral interviews and structured questionnaires, and were analyzed using both descriptive and inferential statistics. Forty farmers were randomly selected in each location, giving a total of 240 respondents. Of the farmers interviewed, 84 had no formal education, 72 had primary education, 50 had attained secondary education, and 38 had gone beyond secondary education. The majority of the farmers hold less than 10 acres of land. The data collected from the field showed that 192 farmers practiced mixed cropping, including mixtures of yam, cowpea, cocoyam, vegetables, cassava, and maize, while only 48 farmers practiced monocropping. Among the sampled farmers, 93% agreed that arable production is profitable, while 7% disagreed. The findings show that managerial practices that conserve soil fertility and reduce labor costs, such as planting leguminous crops and applying herbicide instead of weeding with hand-held hoes, should be encouraged. All the respondents agreed that yam, cowpea, cocoyam, sweet potato, rice, maize, and vegetable production would solve the problem of hunger and raise the standard of living, compared with the petroleum products on which Nigeria relies as a means of livelihood.
Keywords: farmers, arable crop, cocoyam, respondents, maize
Procedia PDF Downloads 255
25872 Handling Missing Data by Using Expectation-Maximization and Expectation-Maximization with Bootstrapping for Linear Functional Relationship Model
Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, A. H. M. R. Imon
Abstract:
The missing-value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE), and the estimated bias (EB). In this study, we applied the methods of imputing missing values to two types of LFRM, namely the full LFRM and the LFRM in which the slope is estimated using a nonparametric method. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators
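A minimal Python sketch of EM-style imputation under a multivariate-normal working model (an illustrative assumption; the LFRM-specific E- and M-steps and the bootstrap wrapper used by EMB are not reproduced here):

import numpy as np

def em_impute(X, n_iter=50, tol=1e-6):
    # EM under a multivariate-normal model: the E-step fills each
    # missing entry with its conditional mean given the observed
    # entries; the M-step re-estimates the mean and covariance.
    X = X.copy()
    miss = np.isnan(X)
    X[miss] = np.nanmean(X, axis=0)[np.where(miss)[1]]   # crude start
    for _ in range(n_iter):
        mu, cov = X.mean(axis=0), np.cov(X, rowvar=False)
        X_old = X.copy()
        for i in np.where(miss.any(axis=1))[0]:
            m, o = miss[i], ~miss[i]
            coef = cov[np.ix_(m, o)] @ np.linalg.pinv(cov[np.ix_(o, o)])
            X[i, m] = mu[m] + coef @ (X[i, o] - mu[o])
        if np.max(np.abs(X - X_old)) < tol:
            break
    return X

rng = np.random.default_rng(1)
data = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=100)
data[rng.random(data.shape) < 0.1] = np.nan   # 10% missing at random
completed = em_impute(data)
print('remaining NaNs:', int(np.isnan(completed).sum()))

EMB would wrap a routine like this in a bootstrap loop, imputing resampled data sets and pooling the estimates.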
Procedia PDF Downloads 456
25871 A Comparative Study of Environment Risk Assessment Guidelines of Developing and Developed Countries Including Bangladesh
Authors: Syeda Fahria Hoque Mimmi, Aparna Islam
Abstract:
Genetically engineered (GE) plants are a response of our time to the increasing demand for food. A complete set of regulations needs to be followed from the development of a GE plant to its release into the environment, and the whole regulatory system is categorized into separate stages to maintain proper biosafety. Environmental risk assessment (ERA) is one of the crucial stages in this process. ERA identifies potential risks and their impacts through science-based evaluation, conducted on a case-by-case basis. All countries that deal with GE plants follow specific guidelines to conduct a successful ERA. In this study, the ERA guidelines of 4 developing and 4 developed countries, including Bangladesh, were compared. The ERA guidelines of India, Canada, Australia, the European Union, Argentina, Brazil, and the US were used as models for the comparison with Bangladesh. Initially, ten parameters were selected to compare the required data and information across all the guidelines. Surprisingly, a substantial share of the data and information requirements (e.g., whether the intended modification or new trait of interest has been achieved, the growth habit of the GE plant, the consequences of any potential gene flow from the cultivation of GE plants to sexually compatible plant species, potential adverse effects on human health, etc.) matched across all the countries. However, a few differences in data requirements (e.g., agronomic conventions of non-transformed plants, whether applicants must clearly describe the experimental procedures followed, etc.) were also observed. Moreover, it was found that only a few countries provide instructions on the quality of the data used for ERA. If these similarities are recognized in a more structured manner, the approval pathway for GE plants can be shared.
Keywords: GE plants, ERA, harmonization, ERA guidelines, information and data requirements
Procedia PDF Downloads 191
25870 Creation of a Realistic Railway Simulator Developed on a 3D Graphic Game Engine Using a Numerical Computing Programming Environment
Authors: Kshitij Ansingkar, Yohei Hoshino, Liangliang Yang
Abstract:
Advances in algorithms related to autonomous systems have made it possible to research improving the accuracy of a train's location. This capability can increase the throughput of a railway network without the need to create additional infrastructure. To develop such a system, the railway industry requires data to test sensor fusion theories or implement simultaneous localization and mapping (SLAM) algorithms. Although such simulation data and ground-truth datasets are available for testing vehicle automation algorithms, due to regulations and economic considerations there is a dearth of such datasets in the railway industry. Thus, there is a need for a simulation environment that can generate realistic synthetic datasets. This paper proposes (1) to leverage the capabilities of open-source 3D graphic rendering software to create a visualization of the environment, (2) to utilize open-source 3D geospatial data for accurate visualization, and (3) to integrate the graphic rendering software with a programming language and numerical computing platform. To develop such an integrated platform, this paper utilizes the computing platform's advanced sensor models, such as LIDAR, camera, IMU, or GPS, and merges them with the 3D rendering of the game engine to generate high-quality synthetic data. These datasets can then be used to train railway models and improve the accuracy of a train's location.
Keywords: 3D game engine, 3D geospatial data, dataset generation, railway simulator, sensor fusion, SLAM
Procedia PDF Downloads 18
25869 Thickness-Tunable Optical, Magnetic, and Dielectric Response of Lithium Ferrite Thin Film Synthesized by Pulsed Laser Deposition
Authors: Prajna Paramita Mohapatra, Pamu Dobbidi
Abstract:
Lithium ferrite (LiFe5O8) has potential applications as a component of microwave magnetic devices such as circulators and monolithic integrated circuits. For efficient device applications, spinel ferrites are highly required in thin-film form, and it is necessary to improve their magnetic and dielectric behavior by optimizing the processing parameters during deposition. The lithium ferrite thin films were deposited on Pt/Si substrates using the pulsed laser deposition (PLD) technique. As controlling the film thickness is the easiest way to tailor the strain, we deposited thin films of different thicknesses (160 nm, 200 nm, 240 nm) at an oxygen partial pressure of 0.001 mbar. The formation of a single phase with spinel structure (space group P4132) is confirmed by the XRD pattern and Rietveld analysis. The optical bandgap decreases with increasing thickness. FESEM confirmed the formation of uniform grains with well-separated grain boundaries. Further, the film growth and roughness were analyzed by AFM: the root-mean-square (RMS) surface roughness decreased from 13.52 nm (160 nm) to 9.34 nm (240 nm). The room-temperature magnetization was measured with a maximum field of 10 kOe; the saturation magnetization is enhanced monotonically with increasing thickness, and the magnetic resonance linewidth is in the range of 450-780 Oe. The dielectric response was measured in the frequency range of 10⁴-10⁶ Hz and in the temperature range of 303-473 K. With increasing frequency, the dielectric constant and the loss tangent of all the samples decreased continuously, which is typical behavior for conventional dielectric materials. The real part of the dielectric constant and the dielectric loss increase with increasing thickness. The contributions of grains and grain boundaries were also analyzed by employing an equivalent circuit model. The highest dielectric constant is obtained for the film with a thickness of 240 nm at 10⁴ Hz. The obtained results demonstrate that the desired response can be obtained by tailoring the film thickness for microwave magnetic devices.
Keywords: PLD, optical response, thin films, magnetic response, dielectric response
Procedia PDF Downloads 103
25868 In-service High School Teachers’ Experiences On Blended Teaching Approach Of Mathematics
Authors: Lukholo Raxangana
Abstract:
Fourth Industrial Revolution (4IR)-era teaching offers in-service mathematics teachers opportunities to use blended approaches to engage learners while teaching mathematics. This study explores in-service high school teachers' experiences with a blended teaching approach to mathematics. This qualitative case study involved eight in-service teachers from four selected schools in the Sedibeng West District of Gauteng Province. The study used the community of inquiry model as its analytical framework. Data were collected through semi-structured interviews and focus-group discussions to explore in-service teachers' experiences of the influence of blended teaching (BT) on learning mathematics. The study's findings concern the impact of load-shedding, the benefits of BT, and in-service teachers' perceptions of, and the hindrances to, BT. Based on these findings, the study recommends that further research focus on developing data-free BT tools to assist during load-shedding, regardless of location.
Keywords: blended teaching, teachers, in-service, mathematics
Procedia PDF Downloads 61
25867 Auditory Brainstem Response in Wave VI for the Detection of Learning Disabilities
Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba
Abstract:
The use of brainstem auditory evoked potentials (BAEP) is a common way to study people's auditory function and, by studying the behaviour of wave VI, to learn about the functionality of the brain neuronal groups that intervene in the learning process. The latest advances in neuroscience have revealed the existence of distinct brain activity in the learning process that can be highlighted through the use of innocuous, low-cost, and easily accessible techniques such as, among others, the BAEP, which can help us detect possible neurodevelopmental difficulties early for their subsequent assessment and treatment. To date, and to the authors' best knowledge, only the latency data obtained by observing waves I to V, and mainly in the left ear, have been taken into account. This work shows that it is essential to take both ears into account; with these additional data, some cases could be diagnosed more precisely that had previously been classified as 'normal' despite showing signs of some alteration that had motivated the new consultation with the specialist.
Keywords: ear, neurodevelopment, auditory evoked potentials, intervals of normality, learning disabilities
Procedia PDF Downloads 169
25866 Quantum Cryptography: Classical Cryptography Algorithms’ Vulnerability State as Quantum Computing Advances
Authors: Tydra Preyear, Victor Clincy
Abstract:
Quantum computing presents many computational advantages over classical computing methods due to its utilization of quantum mechanics. The capability of this computing infrastructure poses threats to standard cryptographic systems such as RSA and AES, which are designed for classical computing environments. This paper discusses the impact of quantum computing on cryptography, focusing on the evolution from classical cryptographic concepts to quantum and post-quantum cryptographic concepts. Standard cryptography is essential for securing data through encryption and decryption methods, and these methods face vulnerability problems due to the advancement of quantum computing. To counter these vulnerabilities, the proposed methods are quantum cryptography and post-quantum cryptography. Quantum cryptography uses principles such as the uncertainty principle and photon polarization to provide secure data transmission. In addition, the concept of quantum key distribution is introduced to ensure more secure communication channels by distributing cryptographic keys. Post-quantum cryptography, in turn, is emerging to improve cryptographic algorithms so that they are secure against attacks by both classical and quantum computers. Throughout this exploration, the paper highlights the critical role of advancing cryptographic methods in keeping data integrity and privacy safe from quantum computing threats. Future research directions include more effective cryptographic methods enabled by advancing technology.
Keywords: quantum computing, quantum cryptography, cryptography, data integrity and privacy
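To make the quantum key distribution idea concrete, here is a toy classical simulation of BB84-style basis sifting in Python (an illustrative sketch only: there is no real quantum channel, and the error-rate check that detects an eavesdropper is omitted):

import numpy as np

rng = np.random.default_rng(42)
n = 32  # raw transmission length

# Alice encodes random bits in random bases; Bob measures in random
# bases; afterwards they publicly compare bases and keep the matches.
alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal
bob_bases   = rng.integers(0, 2, n)

# Matching bases reproduce the bit exactly; mismatched bases yield a
# uniformly random outcome, so those positions are discarded.
bob_bits = np.where(alice_bases == bob_bases,
                    alice_bits, rng.integers(0, 2, n))

sifted = alice_bases == bob_bases
key_alice, key_bob = alice_bits[sifted], bob_bits[sifted]
assert np.array_equal(key_alice, key_bob)   # no eavesdropper simulated
print('sifted key:', ''.join(map(str, key_alice)))

In a real protocol, an eavesdropper measuring in the wrong basis would disturb the photons, and Alice and Bob would detect this as an elevated error rate in a sacrificed portion of the sifted key.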
Procedia PDF Downloads 32
25865 Intelligent Electric Vehicle Charging System (IEVCS)
Authors: Prateek Saxena, Sanjeev Singh, Julius Roy
Abstract:
The security of the power distribution grid remains paramount to utility professionals even as they enhance it and make it more efficient. One of the most serious threats to the system is transformer maintenance, as the load is ever increasing with the addition of elements like electric vehicles. In this paper, intelligent transformer monitoring and grid management are proposed. The system is engineered to use the evolving data from smart meters for grid analytics and diagnostics for preventive maintenance. A two-tier architecture for hardware and software integration forms a robust system for the smart grid, and the proposal also presents interoperable meter standards for easy integration. Distribution transformer analytics based on real-time data benefits utilities by preventing outages, protecting against revenue loss, improving the return on assets, and reducing overall maintenance costs through predictive monitoring.
Keywords: electric vehicle charging, transformer monitoring, data analytics, intelligent grid
Procedia PDF Downloads 794
25864 Self-Organizing Maps for Credit Card Fraud Detection
Authors: ChunYi Peng, Wei Hsuan CHeng, Shyh Kuang Ueng
Abstract:
This study focuses on the application of self-organizing map (SOM) technology to the analysis of credit card transaction data, aiming to enhance the accuracy and efficiency of fraud detection. The SOM, as an artificial neural network, is particularly suited to pattern recognition and data classification, making it highly effective for the complex and variable nature of credit card transaction data. By analyzing transaction characteristics with a SOM, the research identifies abnormal transaction patterns that could indicate potentially fraudulent activities. Moreover, this study has developed a specialized visualization tool to intuitively present the relationships between SOM analysis outcomes and transaction data, aiding financial institution personnel in quickly identifying and responding to potential fraud, thereby reducing financial losses. Additionally, the research explores the integration of SOM technology with composite intelligent system technologies (including finite state machines, fuzzy logic, and decision trees) to further improve fraud detection accuracy. This multimodal approach provides a comprehensive perspective for identifying and understanding various types of fraud within credit card transactions. In summary, by integrating SOM technology with visualization tools and composite intelligent system technologies, this research offers a more effective method of fraud detection for the financial industry, not only enhancing detection accuracy but also deepening the overall understanding of fraudulent activities.
Keywords: self-organizing map technology, fraud detection, information visualization, data analysis, composite intelligent system technologies, decision support technologies
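A minimal from-scratch Python sketch of how a SOM can flag unusual transactions via quantization error (the feature set, grid size, training schedule, and the 99th-percentile threshold are illustrative assumptions, not the study's configuration):

import numpy as np

def train_som(data, grid=(10, 10), n_iter=2000, lr0=0.5, sigma0=3.0):
    # Classic online SOM: pull the best-matching unit (BMU) and its
    # grid neighbours toward each sample, shrinking the learning rate
    # and neighbourhood radius over time.
    rng = np.random.default_rng(0)
    h, w = grid
    W = rng.random((h, w, data.shape[1]))
    yy, xx = np.mgrid[0:h, 0:w]
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bi, bj = np.unravel_index(np.linalg.norm(W - x, axis=2).argmin(), (h, w))
        lr = lr0 * np.exp(-t / n_iter)
        sig = sigma0 * np.exp(-t / n_iter)
        g = np.exp(-((yy - bi) ** 2 + (xx - bj) ** 2) / (2 * sig ** 2))
        W += lr * g[..., None] * (x - W)
    return W

rng = np.random.default_rng(1)
tx = rng.normal(0, 1, (500, 3))   # stand-in: amount, hour, merchant score
W = train_som(tx)
# Outlier score: distance from each transaction to its BMU; a large
# quantization error flags an unusual transaction pattern.
err = np.array([np.linalg.norm(W - t, axis=2).min() for t in tx])
print('flagged as unusual:', int((err > np.quantile(err, 0.99)).sum()))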
Procedia PDF Downloads 64
25863 Robust Recognition of Locomotion Patterns via Data-Driven Machine Learning in the Cloud Environment
Authors: Shinoy Vengaramkode Bhaskaran, Kaushik Sathupadi, Sandesh Achar
Abstract:
Human locomotion recognition is important in a variety of sectors, such as robotics, security, healthcare, fitness tracking, and cloud computing. With the increasing pervasiveness of peripheral devices, particularly Inertial Measurement Unit (IMU) sensors, researchers have attempted to exploit these advancements in order to precisely and efficiently identify and categorize human activities. This research paper introduces a state-of-the-art methodology for the recognition of human locomotion patterns in a cloud environment, based on a publicly available benchmark dataset. The investigation implements a denoising and windowing strategy to deal with the unprocessed data. Next, feature extraction is adopted to abstract the main cues from the data, and the SelectKBest strategy is used to select the optimal features. Furthermore, state-of-the-art ML classifiers, including logistic regression, random forest, gradient boosting, and SVM, are investigated to evaluate the performance of the system and accomplish precise locomotion classification. Finally, a detailed comparative analysis of the results is presented to reveal the performance of the recognition models.
Keywords: artificial intelligence, cloud computing, IoT, human locomotion, gradient boosting, random forest, neural networks, body-worn sensors
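A minimal Python sketch of the described windowing, feature-extraction, SelectKBest, and classification pipeline (the window length, time-domain statistics, value of k, and the synthetic IMU stream are all illustrative assumptions):

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

def window_features(signal, labels, win=128, step=64):
    # Sliding windows over an (n_samples, n_channels) IMU stream with
    # simple time-domain statistics per channel as features.
    X, y = [], []
    for s in range(0, len(signal) - win, step):
        seg = signal[s:s + win]
        X.append(np.hstack([seg.mean(0), seg.std(0), seg.min(0), seg.max(0)]))
        y.append(np.bincount(labels[s:s + win]).argmax())   # majority label
    return np.array(X), np.array(y)

rng = np.random.default_rng(0)
acts = np.repeat(rng.integers(0, 4, 40), 250)                 # 4 locomotion modes
stream = rng.normal(0, 1, (10_000, 6)) + 0.3 * acts[:, None]  # 6-axis IMU stand-in
X, y = window_features(stream, acts)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = Pipeline([('scale', StandardScaler()),
                ('select', SelectKBest(f_classif, k=12)),
                ('gb', GradientBoostingClassifier())])
clf.fit(Xtr, ytr)
print('held-out accuracy:', round(clf.score(Xte, yte), 3))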
Procedia PDF Downloads 17
25862 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score
Authors: Jianfeng Hu
Abstract:
Personal authentication based on electroencephalography (EEG) signals is an important field of biometric technology, and more and more researchers have used EEG signals as a data source for biometrics; however, biometrics based on EEG signals also have some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures, sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE), were deployed as the feature set. In a silhouette calculation, the distance from each data point in a cluster to all other points within the same cluster, and to all data points in the closest other cluster, is determined. Silhouettes thus provide a measure of how well a data point was classified when it was assigned to a cluster and of the separation between clusters. This renders silhouettes potentially well suited to assessing cluster quality in personal authentication methods. In this study, silhouette scores were used to assess the cluster quality of the k-means clustering algorithm and to compare the performance of each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. The results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p<0.01); and (3) there is no significant difference in authentication performance among the feature sets (except feature PE). Conclusion: the combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.
Keywords: personal authentication, K-means clustering, electroencephalogram, EEG, silhouettes
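A minimal Python sketch of scoring k-means partitions of entropy features with the silhouette index (the synthetic per-subject clusters and candidate values of k are illustrative assumptions):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
# Stand-in for per-epoch entropy features (SE, FE, AE, PE): three
# synthetic 'subjects' whose feature distributions differ in mean.
features = np.vstack([rng.normal(m, 0.3, (60, 4)) for m in (0.0, 1.5, 3.0)])

for k in (2, 3, 4, 5):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
    print(k, round(silhouette_score(features, labels), 3))
# The k with the highest silhouette score marks the best-separated
# partition, i.e. how cleanly the subjects are distinguished.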
Procedia PDF Downloads 289
25861 Developing an Active Leisure Wear Range: A Dilemma for Khanna Enterprises
Authors: Jagriti Mishra, Vasundhara Chaudhary
Abstract:
Introduction: The case highlights various issues and challenges faced by Khanna Enterprises while conceptualizing and executing the launch of active leisure wear in the domestic market, and elaborates the different steps involved in range planning and production. Although Khanna Enterprises was an established company that dealt in the production of knitted and woven garments, it took the risk of launching a new concept: active leisure wear for millennials. Methodology: The case is based on primary and secondary research, with data collected through surveys, in-depth interviews, and various reports, forecasts, and journals. Findings: The research, through primary and secondary data and the execution of the active leisure wear range, substantiated its acceptance not only by millennials but also by Generation X. There was demand for bigger sizes as well as for more muted colours. Conclusion: The sales data paved the way for future product development in tune with the strengths of Khanna Enterprises.
Keywords: millennials, range planning, production, active leisure wear
Procedia PDF Downloads 212
25860 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals
Authors: Bahareh Ansari
Abstract:
Background: The Open Government Data (OGD) movement of the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. But current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual's cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users' performance and experience in working with a visualization tool. This list can be used to evaluate OGD visualization practices and to inform future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of the publications of top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented with a search of the references of the identified articles, and with searches for the 'open data visualization' and 'visualization evaluation' keywords in the IEEE Xplore and ACM digital libraries. Articles were included if they provided empirical evidence from controlled user experiments, or reviewed such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are shown to positively affect the visualization outcomes. Findings: The keyword search yielded 760 studies, of which 30 were included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design; aesthetic (artistic) style; storytelling; decorative elements that do not provide extra information (including text, images, and embellishments on the graphs); and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, have been less frequent in the included articles and show consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and ultimately take part in government policy-making procedures.
Keywords: best practices, data visualization, literature review, open government data
Procedia PDF Downloads 111
25859 Phytoextraction of Copper and Zinc by Willow Varieties in a Pot Experiment
Authors: Muhammad Mohsin, Mir Md Abdus Salam, Pertti Pulkkinen, Ari Pappinen
Abstract:
Soil and water contamination by heavy metals is a major environmental challenge. Phytoextraction is an emerging, environmentally friendly, and cost-efficient technology in which plants are used to remove pollutants from soil and water. We aimed to assess the copper (Cu) and zinc (Zn) removal efficiency of two willow varieties, Klara (S. viminalis x S. schwerinii x S. dasyclados) and Karin ((S. schwerinii x S. viminalis) x (S. viminalis x S. burjatica)), under different soil treatments (control/unpolluted, polluted, lime with polluted, and wood ash with polluted). In a 180-day pot experiment, these willow varieties were grown in highly polluted soil collected from the Pyhasalmi mining area in Finland. Lime and wood ash were added to the polluted soil to improve the soil pH and to observe their effects on metal accumulation in the plant biomass. An inductively coupled plasma optical emission spectrometer (ELAN 6000 ICP-OES, Perkin-Elmer Corporation) was used to assess the heavy metal concentrations in the plant biomass. The results show that both willow varieties have the capability to accumulate considerable amounts of Cu and Zn, varying from 36.95 to 314.80 mg kg⁻¹ and 260.66 to 858.70 mg kg⁻¹, respectively. The application of lime and wood ash substantially affected plant height, dry biomass, and the deposition of Cu and Zn in the total plant biomass. In addition, the lime application appeared to increase Cu and Zn concentrations in the shoots and leaves of both willow varieties when planted in polluted soil, whereas the wood ash application was found to be more efficient in mobilizing the metals to the roots of both varieties. The study recommends willow plantations for rehabilitating Cu- and Zn-polluted soils.
Keywords: heavy metals, lime, phytoextraction, wood ash, willow
Procedia PDF Downloads 240
25858 Reduced Power Consumption by Randomization for DSI3
Authors: David Levy
Abstract:
The newly released Distributed System Interface 3 (DSI3) Bus Standard specification defines 3 modulation levels from which 16 valid symbols are coded. This structure creates power consumption that varies with the transmitted data by a factor of more than 2 between minimum and maximum. The power generation unit therefore has to be built to sustain the worst-case maximum consumption at all times. This paper proposes a method to reduce both the average and the worst-case current consumption. The transmitter randomizes the data using several pseudo-random sequences, estimates the energy consumption of the generated frames, and selects for transmission the one that consumes the least. The transmitter also prepends the index of the chosen pseudo-random sequence, which is not randomized, to allow the receiver to recover the original data using the correct sequence. We show that, in the case where the frame occupies most of the DSI3 synchronization period, average power consumption is reduced by up to 13% and worst-case power consumption by 17.7%.
Keywords: DSI3, energy, power consumption, randomization
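A minimal Python sketch of the select-the-cheapest-randomization idea (the per-symbol current figures, XOR scrambling, sequence count, and frame length are illustrative assumptions and are not taken from the DSI3 specification):

import numpy as np

rng = np.random.default_rng(0)
N_SEQ = 8
# Hypothetical per-symbol current draw for the 16 DSI3 symbols
# (illustrative values only; the real profile is PHY-specific).
symbol_cost = np.linspace(1.0, 2.2, 16)
scramblers = rng.integers(0, 16, (N_SEQ, 64))  # pseudo-random sequences

def encode_min_energy(frame):
    # XOR the 4-bit symbols with each candidate sequence, estimate the
    # energy of every randomized frame, and pick the cheapest one; the
    # chosen index is prepended unscrambled for the receiver.
    best = min(range(N_SEQ),
               key=lambda i: symbol_cost[frame ^ scramblers[i, :len(frame)]].sum())
    return best, frame ^ scramblers[best, :len(frame)]

def decode(idx, payload):
    return payload ^ scramblers[idx, :len(payload)]   # XOR is its own inverse

frame = rng.integers(0, 16, 32)              # 32 four-bit symbols
idx, tx = encode_min_energy(frame)
assert np.array_equal(decode(idx, tx), frame)
print('sequence', idx, 'saves',
      round(float(symbol_cost[frame].sum() - symbol_cost[tx].sum()), 2), 'units')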
Procedia PDF Downloads 539
25857 Present-Day Transformations and Trends in Rooftop Agriculture and Food Security
Authors: Kiara Lawrence, Nadine Ponnusamy, Clive Greenstone
Abstract:
One of the major challenges facing society today is food security. The risks to food security have increased significantly due to the evolving urban landscape, globalization, and a rising population. The cultivation of food is essential, particularly during times of crisis such as a recession, and has long been a necessity for urban populations. In contemporary society, many urban residents are confronted with new challenges, including high levels of unemployment, which compel individuals to adopt alternative survival strategies such as growing their own food. Recently, rooftop agriculture has made significant contributions to urban and national food security and has been utilized as a tool to mitigate the frequent and damaging disasters that many cities encounter. Rooftop farms have the potential to transform unused spaces into green, productive vegetable plots, while also providing urban residents with the opportunity to enjoy the benefits of gardening. This study therefore investigates the evolving themes around rooftop agriculture and food security globally. A bibliometric review was carried out on Scopus and Web of Science using the keywords "rooftop agriculture" OR "rooftop farming" OR "rooftop garden" AND "food security" between 2004 and 2024 to ensure broad coverage of the topic. VOSviewer software was then utilized to analyze the extracted data and create network visualization maps based on keyword occurrences, co-authorship, and country analysis. Only 37 relevant documents fell within the study parameters. Preliminary results indicate that much research has focused on urban agriculture, food supply, green roofs, sustainability, and climate change. By analysing these aspects of rooftop agriculture and food security, the study can identify gaps in the literature and inform future applications to assist in food security.
Keywords: food security, rooftop agriculture, rooftop farming, rooftop garden
Procedia PDF Downloads 24
25856 A Survey of Digital Health Companies: Opportunities and Business Model Challenges
Authors: Iris Xiaohong Quan
Abstract:
The global digital health market reached 175 billion U.S. dollars in 2019 and is expected to grow at about 25% CAGR to over 650 billion USD by 2025. Different terms such as digital health, e-health, mHealth, and telehealth have been used in the field, which can sometimes cause confusion. The term digital health was originally introduced to refer specifically to the use of interactive media, tools, platforms, applications, and solutions that are connected to the Internet to address the health concerns of providers as well as consumers. While mHealth emphasizes the use of mobile phones in healthcare, telehealth means using technology to remotely deliver clinical health services to patients. According to the FDA, "the broad scope of digital health includes categories such as mobile health (mHealth), health information technology (IT), wearable devices, telehealth and telemedicine, and personalized medicine." Some researchers believe that digital health is nothing other than the cultural transformation healthcare has been going through in the 21st century because of digital health technologies that provide data to both patients and medical professionals. As digital health is burgeoning but research in the area is still inadequate, our paper aims to clear up the definitional confusion and provide an overall picture of digital health companies. We further investigate how business models are designed and differentiated in the emerging digital health sector. Both quantitative and qualitative methods are adopted in the research. For the quantitative analysis, our research data came from two databases, Crunchbase and CBInsights, which are well-recognized information sources for researchers, entrepreneurs, managers, and investors. We searched a few keywords in the Crunchbase database based on companies' self-descriptions: digital health, e-health, and telehealth. A search for "digital health" returned 941 unique results, "e-health" returned 167 companies, and "telehealth" 427. We also searched the CBInsights database for similar information. After merging the results, removing duplicates, and cleaning up the database, we arrived at a list of 1464 digital health companies. A qualitative method is used to complement the quantitative analysis: we conduct an in-depth case analysis of three successful unicorn digital health companies to understand how business models evolve and to discuss the challenges faced in this sector. Our research returned some interesting findings. For instance, we found that 86% of the digital health startups were founded in the decade since 2010; 75% of the digital health companies have fewer than 50 employees, and almost 50% fewer than 10. This shows that digital health companies are relatively young and small in scale. On the business model analysis, while traditional healthcare businesses emphasize the so-called "3Ps" (patient, physician, and payer), digital health companies extend these to "5Ps" by adding patents, which result from technology requirements (such as the development of artificial intelligence models), and platform, an effective value-creation approach that brings the stakeholders together. Our case analysis details this 5P framework and contributes to the extant knowledge on business models in the healthcare industry.
Keywords: digital health, business models, entrepreneurship opportunities, healthcare
Procedia PDF Downloads 187
25855 Comparison of Mechanical Property of UNS C12200 Joints Brazed by (Cu&Ag) Based Filler Metals
Authors: Ali Elhatmi, Mustafa Elshbo, Hussin Alosta
Abstract:
In this study, the copper tubes used in medical applications were brazed with copper-, zinc-, and silver-based alloys, using BCuP2, RBCuZnAl, and BAg2 filler metals. A sample of the medical tubing was chemically analyzed, and the result matches the British standard. Tensile and hardness tests were carried out on the brazed joints; the results show that BCuP2 produces the hardest joint, while the filler metal RBCuZnAl gives the highest tensile strength.
Keywords: welding, brazing, copper tubes, joints
Procedia PDF Downloads 233
25854 Ensemble-Based SVM Classification Approach for miRNA Prediction
Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam
Abstract:
In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed for miRNA prediction. Three problems commonly associated with previous approaches are alleviated. These problems arise from imposing assumptions on the secondary structure of pre-miRNA, from the imbalance between the numbers of laboratory-verified miRNAs and pseudo-hairpins, and from using training data sets that do not cover the variety of samples across different species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo, and Mirident, weighted by their respective features, without any structural assumptions. An additional SVM layer is used to aggregate the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers: improved metric values of 88.88% F-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, and 87.2% sensitivity are achieved, and the area under the ROC curve is 0.91.
Keywords: miRNAs, SVM classification, ensemble algorithm, assumption problem, imbalanced data
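A minimal Python sketch of the stacking idea, with an additional SVM layer aggregating base SVM outputs (the base kernels and synthetic features stand in for Triplet-SVM, Virgo, and Mirident and their feature sets; all values are illustrative assumptions):

import numpy as np
from sklearn.ensemble import StackingClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 30))    # stand-in for hairpin-derived features
y = (X[:, :3].sum(axis=1) + 0.5 * rng.normal(size=400) > 0).astype(int)

# Three base SVMs stand in for Triplet-SVM, Virgo, and Mirident (which
# in the paper each use their own feature view of the same hairpin).
base = [('svm_rbf',  SVC(kernel='rbf', probability=True)),
        ('svm_poly', SVC(kernel='poly', degree=3, probability=True)),
        ('svm_lin',  SVC(kernel='linear', probability=True))]
# The additional SVM layer aggregates the base outputs, mirroring the
# second-level SVM combiner described in the abstract.
stack = StackingClassifier(estimators=base, final_estimator=SVC(kernel='rbf'),
                           stack_method='predict_proba', cv=5)
stack.fit(X[:300], y[:300])
print('held-out accuracy:', round(stack.score(X[300:], y[300:]), 3))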
Procedia PDF Downloads 354