Search results for: field data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30395

28775 Molecular Genetic Purity Test Using SSR Markers in Pigeon Pea

Authors: Rakesh C. Mathad, G. Y. Lokesh, Basavegowda

Abstract:

In agriculture, using quality seeds of improved varieties is very important to ensure higher productivity and, thereby, food security and sustainability. To ensure good productivity, seeds should have the characters described by the breeder. To verify whether the characters described by the breeder are expressed in a variety, i.e., its genuineness or genetic purity, a field grow-out test (GOT) is done. In pigeon pea, which is a long-duration crop, conducting a GOT can take a very long time and is also expensive. Since in pigeon pea the flower character is the most distinguishing character from contaminants, conducting a field grow-out test requires 120-130 days, or until flower emergence, which increases the cost of storage and seed production. This also delays the distribution of seed inventory to the pigeon pea growing areas. With this in view, during 2014-15, with the financial support of the Govt. of Karnataka, India, a project to develop a molecular genetic test for the newly developed pigeon pea variety cv. TS3R was commissioned at the Seed Unit, UAS, Raichur. A molecular test was developed with the help of SSR markers to distinguish the pure variety from possible off-types in the newly released pigeon pea variety TS3R. In the investigation, 44 primer pairs were screened to identify the specific marker associated with this variety. Pigeon pea cv. TS3R could be clearly identified using the primer CCM 293, based on the banding pattern resolved by PCR and gel electrophoresis. Moreover, the markers AHSSR 46, CCM 82 and CCM 57 can be used to test other popular varieties in the region, namely Asha, GRG-811 and Maruti, respectively. Further, to develop this into a lab test, the seed sample size was standardized to 200 seeds and a grow-out matrix was developed. This matrix was used to sample 12-day-old leaves for DNA extraction. The lab test results were validated against actual field GOT results, and the variation was found to be within the acceptable limit of 1%.
This molecular method can now be employed to test genetic purity in pigeon pea cv. TS3R, reducing the time required and offering a cheaper alternative to the field GOT.

Keywords: genuineness, grow-out matrix, molecular genetic purity, SSR markers

Procedia PDF Downloads 276
28774 Traffic Safety and Risk Assessment Model by Analysis of Questionnaire Survey: A Case Study of S. G. Highway, Ahmedabad, India

Authors: Abhijitsinh Gohil, Kaushal Wadhvaniya, Kuldipsinh Jadeja

Abstract:

Road safety is a multi-sectoral and multi-dimensional issue. An effective model can assess the risk associated with highway safety. A questionnaire survey is essential to identify the events or activities causing unsafe conditions for traffic on an urban highway. A questionnaire of standard questions covering vehicular, human and infrastructure characteristics can be prepared, and responses can be collected in the field from road users of different age groups. Each question, or event, carries a specific risk weightage reflecting its contribution to inappropriate and unsafe traffic flow. The probability of occurrence of an event can be calculated from the data collected from road users. Finally, the risk score of each event is calculated from its risk factor and probability of occurrence, and the sum of the risk scores of all events gives the total risk score of a particular road. Standards for risk scores can be established and the total risk score compared against them. Thus, a road can be categorized based on the risk associated with it and the traffic safety on it. With this model, one can assess the need for traffic safety improvement on a given road, and qualitative data can be analysed.
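The scoring scheme described above, where each event's risk score is its risk weightage multiplied by its probability of occurrence and the road's total risk score is the sum over all events, can be sketched as follows. The event names, weightages and response counts are illustrative assumptions, not values from the study:

```python
# Hypothetical sketch of the questionnaire-based risk model: each event's
# risk score = risk weightage x probability of occurrence, and the road's
# total risk score is the sum over all events.
events = {
    # event: (risk_weightage, respondents_reporting_it, total_respondents)
    "wrong-side driving": (9, 120, 400),
    "on-street parking":  (6, 200, 400),
    "jaywalking":         (7,  80, 400),
}

def total_risk_score(events):
    score = 0.0
    for weight, reported, total in events.values():
        probability = reported / total     # probability of occurrence
        score += weight * probability      # risk score of this event
    return score

print(round(total_risk_score(events), 2))  # 9*0.3 + 6*0.5 + 7*0.2 = 7.1
```

The resulting total would then be compared against predefined risk-score standards to categorize the road.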

Keywords: probability of occurrence, questionnaire, risk factor, risk score

Procedia PDF Downloads 333
28773 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by applying the block maxima method of the generalized extreme value (GEV) distribution to temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw and simulated) and two models of the GEV distribution (stationary and non-stationary), a return level analysis was carried out. In the stationary model, the return values are constant over time for the raw data, while for the simulated data they show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This shows that although temperatures in the tropics are projected to increase in the future, there is a maximum temperature beyond which there is no exceedance. The results of this paper are vital for agricultural and environmental research.
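A minimal sketch of the stationary block-maxima analysis described above is given below, using simulated annual maxima in place of the CDC series; the location and scale values are assumptions for illustration:

```python
import numpy as np
from scipy.stats import genextreme

# Simulated annual maximum temperatures (deg C) standing in for the CDC
# series; loc/scale are illustrative assumptions.
rng = np.random.default_rng(0)
annual_maxima = rng.gumbel(loc=33.0, scale=1.2, size=50)

# Fit a stationary GEV to the block maxima (scipy parameterizes the
# shape as c = -xi).
c, loc, scale = genextreme.fit(annual_maxima)

def return_level(m):
    """m-year return level: the level exceeded on average once in m years."""
    return genextreme.ppf(1.0 - 1.0 / m, c, loc=loc, scale=scale)

for m in (10, 50, 100):
    print(m, round(float(return_level(m)), 2))
```

In a stationary fit the return levels are constant in time, as the abstract notes; a non-stationary variant would let `loc` (and possibly `scale`) depend on the year.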

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 469
28772 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years because different regions have different properties that matter for cache utilization. Stack data and non-stack data may interfere with each other’s locality in the data cache. An important property of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into a stack cache and a non-stack cache, keeping stack and non-stack data separate. We observe that the overall hit rate of the non-unified design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially since over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved with 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when a small, fixed-size stack cache is added at level 1 to a unified cache architecture. The results show that adding a 1KB stack cache improves the overall hit rate of the unified design by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
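The split-cache idea can be illustrated with a toy simulation: a small direct-mapped stack cache beside a larger non-stack cache, driven by a synthetic address trace whose stack references are highly local. The cache sizes, the trace and the stack-detection rule are assumptions, not the SimpleScalar configuration used in the paper:

```python
import random

class DirectMappedCache:
    """Minimal direct-mapped cache model that counts hits and misses."""
    def __init__(self, size_bytes, line_bytes=32):
        self.line_bytes = line_bytes
        self.n_lines = size_bytes // line_bytes
        self.tags = [None] * self.n_lines
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        block = addr // self.line_bytes
        index = block % self.n_lines
        if self.tags[index] == block:
            self.hits += 1
        else:
            self.tags[index] = block   # miss: fill the line

    def hit_rate(self):
        return self.hits / self.accesses if self.accesses else 0.0

# Synthetic trace: ~60% stack references with tight locality near the
# stack top, the rest scattered over a 1 MB heap/global region.
random.seed(1)
STACK_BASE = 0x7FFF0000
trace = []
for _ in range(10000):
    if random.random() < 0.6:
        trace.append((STACK_BASE - random.randrange(256), True))
    else:
        trace.append((random.randrange(1 << 20), False))

# Non-unified design: a small 2 KB, 1-way stack cache beside a 16 KB
# non-stack cache (sizes are illustrative).
stack_cache = DirectMappedCache(2048)
data_cache = DirectMappedCache(16384)
for addr, is_stack in trace:
    (stack_cache if is_stack else data_cache).access(addr)

print(round(stack_cache.hit_rate(), 3), round(data_cache.hit_rate(), 3))
```

Because the stack working set is tiny and reused constantly, even a 2 KB direct-mapped stack cache reaches a near-perfect hit rate, which mirrors the paper's observation.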

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 298
28771 Applications of Artificial Neural Networks in Civil Engineering

Authors: Naci Büyükkaracığan

Abstract:

An artificial neural network (ANN) is a computational model based on the nervous system of the human brain and its working principles. Artificial neural networks have been the subject of an active field of research that has matured greatly over the past 55 years, and ANNs are now used in many fields. It has been observed, however, that artificial neural networks give particularly good results in optimization and control systems, and many of the areas forming the subject of civil engineering applications require optimization and control. In this study, the basic principles and technical aspects of the artificial intelligence systems most widely used in solving civil engineering problems are first examined. Finally, a literature review of applications in the field of civil engineering is presented, covering the artificial intelligence techniques used in each study and their results.

Keywords: artificial neural networks, civil engineering, fuzzy logic, statistics

Procedia PDF Downloads 400
28770 AI Applications in Accounting: Transforming Finance with Technology

Authors: Alireza Karimi

Abstract:

Artificial Intelligence (AI) is reshaping various industries, and accounting is no exception. With the ability to process vast amounts of data quickly and accurately, AI is revolutionizing how financial professionals manage, analyze, and report financial information. In this article, we will explore the diverse applications of AI in accounting and its profound impact on the field.

Automation of Repetitive Tasks: One of the most significant contributions of AI in accounting is automating repetitive tasks. AI-powered software can handle data entry, invoice processing, and reconciliation with minimal human intervention. This not only saves time but also reduces the risk of errors, leading to more accurate financial records.

Pattern Recognition and Anomaly Detection: AI algorithms excel at pattern recognition. In accounting, this capability is leveraged to identify unusual patterns in financial data that might indicate fraud or errors. AI can swiftly detect discrepancies, enabling auditors and accountants to focus on resolving issues rather than hunting for them.

Real-Time Financial Insights: AI-driven tools, using natural language processing and computer vision, can process documents faster than ever. This enables organizations to have real-time insights into their financial status, empowering decision-makers with up-to-date information for strategic planning.

Fraud Detection and Prevention: AI is a powerful tool in the fight against financial fraud. It can analyze vast transaction datasets, flagging suspicious activities and reducing the likelihood of financial misconduct going unnoticed. This proactive approach safeguards a company's financial integrity.

Enhanced Data Analysis and Forecasting: Machine learning, a subset of AI, is used for data analysis and forecasting. By examining historical financial data, AI models can provide forecasts and insights, aiding businesses in making informed financial decisions and optimizing their financial strategies.

Artificial Intelligence is fundamentally transforming the accounting profession. From automating mundane tasks to enhancing data analysis and fraud detection, AI is making financial processes more efficient, accurate, and insightful. As AI continues to evolve, its role in accounting will only become more significant, offering accountants and finance professionals powerful tools to navigate the complexities of modern finance. Embracing AI in accounting is not just a trend; it's a necessity for staying competitive in the evolving financial landscape.

Keywords: artificial intelligence, accounting automation, financial analysis, fraud detection, machine learning in finance

Procedia PDF Downloads 57
28769 The Backlift Technique among South African Cricket Players

Authors: Habib Noorbhai

Abstract:

This study primarily aimed to investigate the batting backlift technique (BBT) among semi-professional, professional and current international cricket players. A key question was whether the lateral batting backlift technique (LBBT) is more common at the highest levels of the game. The study sample (n = 130) comprised South African semi-professional players (SP) (n = 69), professional players (P) (n = 49) and South African international players (SAI) (n = 12). Biomechanical and video analyses were performed on all participant groups, and classifiers were used to identify the batting backlift technique type (BBTT) employed by each batsman. All statistics and wagon wheels (the scoring areas of the batsmen on a cricket field) were sourced online. This study found that an LBBT is more common at the highest levels of cricket batsmanship, with the percentage of batsmen using the LBBT increasing with level of play: SP = 37.7%; P = 38.8%; SAI = 75%; p = 0.001. SAI batsmen who used the LBBT were also more proficient at scoring runs in various areas around the cricket field, according to the wagon wheel analysis. Cricket coaches should pay attention to the direction of the backlift with players, especially when correlating the backlift to various scoring areas on the cricket field. Further in-depth research is required to fully investigate changes in batting backlift techniques among cricket players over the long term.

Keywords: cricket batting, biomechanical analysis, backlift, performance

Procedia PDF Downloads 258
28768 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. With the exponential growth of data, ensuring that data is accurate and available is a major concern. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique which, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 498
28767 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. Submitting queries for multiple keywords manually takes considerable time and effort; hence, we developed a user interface program that searches for multiple keywords at the same time and collects the desired data automatically. The collected raw data is then processed using mathematical and statistical methods to eliminate unwanted data and convert it into usable data.
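The statistical cleaning step, eliminating unwanted result counts before use, can be sketched as below; the raw counts and the log-scale z-score rule are illustrative assumptions, not the authors' exact procedure:

```python
import math
import statistics

# Hypothetical raw result counts collected for a batch of keyword queries;
# in the paper these come from automated multi-keyword search-engine queries.
raw_counts = {
    "field data": 30395,
    "field data analysis": 10210,
    "field data collection": 9870,
    "field data xyzzy": 3,        # noise, e.g. a malformed query
    "data field": 28800,
}

def filter_outliers(counts, z_max=1.5):
    """Drop counts whose log-scale z-score exceeds z_max; hit counts are
    heavily skewed, so the filter works on log10(count + 1)."""
    logs = {k: math.log10(v + 1) for k, v in counts.items()}
    mean = statistics.mean(logs.values())
    stdev = statistics.pstdev(logs.values())
    return {k: counts[k] for k, v in logs.items()
            if stdev == 0 or abs(v - mean) / stdev <= z_max}

usable = filter_outliers(raw_counts)
print(sorted(usable))  # the malformed-query count is eliminated
```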

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 421
28766 Art Beyond Borders: Virtual School Field Trips

Authors: Audrey Hudson

Abstract:

In 2020, educational field trips went virtual for all students. At the Art Gallery of Ontario (AGO) in Canada, our solution was to create a virtual school program that addressed three pillars of access (economic, geographic and cultural) with art at the center. Now, at the close of three years, we have reached 1.6 million students, far more than we have ever welcomed for onsite school visits. In 2022, we partnered with the Museum of Modern Art (MoMA), the Hong Kong University Museum and the National Gallery of Singapore, which has pushed the boundaries of art education into the expanse of the global community. Looking forward to our fourth year of the program, we are using technology to expand our program of art, access and learning to a global audience. In 2023/24, we intend to connect across more borders to expand the pedagogical benefits of art education for a global community. We invite you to hear how you can join this journey.

Keywords: technology, museums, art education, pedagogy

Procedia PDF Downloads 56
28765 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve it; steganography, however, hides that data in a cover file so that the presence of the communication itself is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that the combined techniques perform better than the individual techniques. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, RAW agencies, etc., where highly confidential data is transferred. Finally, a comparison of the two techniques is also given in tabular form.
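The authors implemented their system in MATLAB; as an illustration of the image-steganography half of the pipeline, here is a hedged Python sketch of least-significant-bit (LSB) embedding on raw pixel bytes. In the paper the payload would first be encrypted with DES or RSA; a plain message stands in for the ciphertext here:

```python
def embed(pixels, payload):
    """Hide payload bytes in the least-significant bits of pixel bytes."""
    bits = [(byte >> (7 - i)) & 1 for byte in payload for i in range(8)]
    if len(bits) > len(pixels):
        raise ValueError("cover too small for payload")
    stego = bytearray(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit    # overwrite only the LSB
    return bytes(stego)

def extract(pixels, n_bytes):
    """Read n_bytes back out of the LSBs (MSB-first per byte)."""
    out = bytearray()
    for b in range(n_bytes):
        value = 0
        for i in range(8):
            value = (value << 1) | (pixels[8 * b + i] & 1)
        out.append(value)
    return bytes(out)

cover = bytes(range(256)) * 4          # stand-in for grayscale pixel data
secret = b"pay day"                    # would be DES/RSA ciphertext in the paper
stego = embed(cover, secret)
recovered = extract(stego, len(secret))
print(recovered)  # b'pay day'; each pixel value changed by at most 1
```

Because only the least-significant bit of each pixel is touched, the visual change to the cover image is imperceptible, which is the property the combined crypto-plus-steganography scheme relies on.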

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 282
28764 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collecting, storing, and analysing of consumers’ data in order to use the data generated for profits, revenue, etc. Data monetisation enables e-commerce companies to gain better business opportunities, offer innovative products and services, gain a competitive edge over others, and generate millions in revenue. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of the data of e-commerce consumers. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of The Constitution of India. The Supreme Court of India recognized the right to privacy as a fundamental right in the landmark judgment of Justice K.S. Puttaswamy (Retd) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals’ right to privacy by using the data they collect and store for economic gain and monetisation. The researcher has mainly focused on e-commerce companies, such as online shopping websites, to analyse the legal issue of data monetisation. In the age of the Internet of Things, people have shifted to online shopping as it is convenient, easy, flexible, comfortable, time-saving, etc. But at the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or generating more data from the data stored with them. This violates individuals’ right to privacy because consumers often know nothing about what happens to the data they provide online. Many times, data is also collected without individuals’ consent.
The data, whether structured or unstructured, is then used by analytics for monetisation. Indian legislation such as The Information Technology Act, 2000, does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill could make a significant impact on data monetisation. This paper also aims to study the European Union General Data Protection Regulation and how such legislation could be helpful in the Indian scenario concerning e-commerce businesses and data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 110
28763 Purchasing Decision-Making in Supply Chain Management: A Bibliometric Analysis

Authors: Ahlem Dhahri, Waleed Omri, Audrey Becuwe, Abdelwahed Omri

Abstract:

In industrial processes, decision-making ranges across different scales, from process control to supply chain management. The purchasing decision-making process in the supply chain is presently gaining attention as a critical contributor to a company's strategic success. Given the scarcity of thorough summaries in prior studies, this bibliometric analysis adopts a meticulous approach to building quantitative knowledge on the constantly evolving subject of purchasing decision-making in supply chain management. Through bibliometric analysis, we examine a sample of 358 peer-reviewed articles from the Scopus database. The VOSviewer and Gephi software packages were employed to analyze, combine, and visualize the data. Data analytic techniques, including citation networks, PageRank analysis, co-citation, and publication trends, were used to identify influential works and outline the discipline's intellectual structure. The outcomes of this descriptive analysis highlight the most prominent articles, authors, journals, and countries based on their citations and publications. The findings illustrate an increase in the number of publications, exhibiting a slightly growing trend in this field. Co-citation analysis, coupled with content analysis of the most cited articles, identified five research themes: integrating sustainability into the supplier selection process; supplier selection under disruption risks, with assessment and mitigation strategies; fuzzy MCDM approaches for supplier evaluation and selection; purchasing decisions in vendor problems; and decision-making techniques in supplier selection and order lot-sizing problems. With the help of a graphic timeline, this exhaustive map of the field provides a visual representation of the evolution of publications, demonstrating a gradual shift of research interest from vendor selection problems toward integrating sustainability into the supplier selection process.
These clusters offer insights into the wide variety of purchasing methods and conceptual frameworks that have emerged; however, they have not been validated empirically. The findings suggest that future research should provide greater depth of practical and empirical analysis to enrich the theories. These outcomes provide a powerful road map for further study in this area.
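The PageRank analysis mentioned above can be illustrated on a toy citation network; the graph below is invented for illustration and is not the 358-article Scopus sample:

```python
# Toy citation network: paper -> papers it cites (invented example).
cites = {
    "A": ["C"],
    "B": ["A", "C"],
    "C": [],              # highly cited, cites nothing in the sample
    "D": ["A", "B", "C"],
}

def pagerank(graph, damping=0.85, iters=100):
    """Plain power-iteration PageRank over an adjacency dict."""
    nodes = list(graph)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - damping) / n for v in nodes}
        for v, outs in graph.items():
            if outs:
                share = damping * rank[v] / len(outs)
                for w in outs:
                    new[w] += share
            else:                       # dangling node: spread uniformly
                for w in nodes:
                    new[w] += damping * rank[v] / n
        rank = new
    return rank

ranks = pagerank(cites)
most_influential = max(ranks, key=ranks.get)
print(most_influential)  # "C", which gathers citations from A, B and D
```

In the study this kind of computation, run over the full citation network, is what surfaces the most influential articles and authors.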

Keywords: bibliometric analysis, citation analysis, co-citation, Gephi, network analysis, purchasing, SCM, VOSviewer

Procedia PDF Downloads 80
28762 Mixed Traffic Speed–Flow Behavior under Influence of Road Side Friction and Non-Motorized Vehicles: A Comparative Study of Arterial Roads in India

Authors: Chetan R. Patel, G. J. Joshi

Abstract:

The present study was carried out on six-lane divided urban arterial roads in the Patna and Pune cities of India. The two roads differ distinctly in vehicle composition and roadside parking: the arterial road in Patna city carries 33% non-motorized modes, whereas the Pune arterial road is dominated by two-wheelers (65%), and roadside parking is observed in Patna city. Field studies using videographic techniques were carried out for traffic data collection. Data were extracted at one-minute intervals for vehicle composition, speed variation and flow rate on the selected arterial roads of the two cities. Speed-flow relationships were developed and capacity was determined. Equivalency factors in terms of dynamic car units were determined to represent each vehicle type as a single unit. The variation in capacity due to side friction, the presence of non-motorized traffic, and the effective utilization of lane width is compared in the concluding remarks.
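The dynamic-car-unit conversion works by weighting each vehicle class with an equivalency factor and summing; the factors and counts below are illustrative assumptions, not the field-calibrated values from the study:

```python
# Converting a one-minute mixed-traffic count into dynamic car units (DCU).
# Equivalency factors and counts are illustrative assumptions only.
dcu_factor = {"car": 1.0, "two_wheeler": 0.4, "auto_rickshaw": 1.2,
              "bus": 3.0, "non_motorized": 1.5}

one_minute_count = {"car": 12, "two_wheeler": 40, "auto_rickshaw": 6,
                    "bus": 2, "non_motorized": 10}

flow_dcu_per_min = sum(n * dcu_factor[v] for v, n in one_minute_count.items())
flow_dcu_per_hour = flow_dcu_per_min * 60.0
print(flow_dcu_per_hour)
```

Expressing heterogeneous traffic in a single unit this way is what makes the speed-flow curves and capacities of the two cities comparable.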

Keywords: arterial road, capacity, dynamic equivalency factor, effect of non motorized mode, side friction

Procedia PDF Downloads 345
28761 Climate Change Effects of Vehicular Carbon Monoxide Emission from Road Transportation in Part of Minna Metropolis, Niger State, Nigeria

Authors: H. M. Liman, Y. M. Suleiman, A. A. David

Abstract:

Poor air quality, often considered one of the greatest environmental threats facing the world today, is caused largely by the emission of carbon monoxide into the atmosphere; carbon monoxide is the principal air pollutant, and one prominent source of its emission is the transportation sector. Not much was known about the emission levels of carbon monoxide, the primary pollutant, from road transportation in the study area. Therefore, this study assessed the levels of carbon monoxide emission from road transportation in Minna, Niger State. An MSA Altair gas alert detector was used to take carbon monoxide readings in parts per million (ppm) during the peak and off-peak periods of vehicular movement at road intersections, and the Global Positioning System (GPS) coordinates of the intersections were recorded in the Universal Transverse Mercator (UTM) system; the collected readings were compiled into a database. Bar charts were plotted using the carbon monoxide emission levels recorded in the field against the scientifically established, internationally accepted safe limit of 8.7 ppm of carbon monoxide in the atmosphere. Further statistical analysis was carried out on the field data using the Statistical Package for the Social Sciences (SPSS) and Microsoft Excel to show the variance of the emission levels of each parameter in the study area. The results established that the levels of atmospheric carbon monoxide emitted by road transportation in the study area exceeded the internationally accepted safe limit of 8.7 parts per million. In addition, the variation in average emission levels between the four periods showed that the morning peak has the highest average emission level (24.5 ppm), followed by the evening peak (22.84 ppm), the morning off-peak (15.33 ppm) and, lowest of all, the evening off-peak (12.94 ppm).
Based on these results, recommendations for mitigating poor air quality by reducing carbon monoxide emissions from transportation include: introducing urban mass transit, which would reduce the number of vehicles on the roads, and hence their combined emissions, while also providing a cheaper means of transportation for the masses; and encouraging the use of vehicles powered by alternative energy sources such as solar, electricity and biofuel, which emit far less carbon monoxide than conventional diesel and petrol vehicles.
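The comparison of period averages against the 8.7 ppm limit reported above can be expressed as a short calculation; the four period means are taken from the abstract, everything else is a sketch:

```python
import statistics

SAFE_LIMIT_PPM = 8.7   # internationally accepted CO safe limit cited above

# Average CO emission level per observation period, in ppm
# (the four means reported in the abstract).
period_means = {"morning peak": 24.5, "evening peak": 22.84,
                "morning off-peak": 15.33, "evening off-peak": 12.94}

exceeding = {p: m for p, m in period_means.items() if m > SAFE_LIMIT_PPM}
worst = max(period_means, key=period_means.get)
overall_mean = statistics.mean(period_means.values())

print(worst, len(exceeding), round(overall_mean, 2))
```

Even the lowest period average sits well above the safe limit, which is the basis for the mitigation recommendations.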

Keywords: carbon monoxide, climate change emissions, road transportation, vehicular

Procedia PDF Downloads 373
28760 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is if the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on the data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
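The central observation, that a model trained on imperfect labels can exceed the accuracy of its training labels, can be reproduced in a small simulation; plain logistic regression stands in here for the paper's linear-kernel SVM, and the data and noise rate are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 2000
X = rng.normal(size=(n, 2))
y_true = (X[:, 0] + X[:, 1] > 0).astype(int)   # perfectly linear concept

# 'Weak' supervision: flip 20% of the labels, so the training labels are
# only ~80% accurate.
flip = rng.random(n) < 0.20
y_noisy = np.where(flip, 1 - y_true, y_true)

# Logistic regression by gradient descent, trained on the noisy labels.
w, b = np.zeros(2), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 1.0 * (X.T @ (p - y_noisy)) / n
    b -= 1.0 * np.mean(p - y_noisy)

pred = ((X @ w + b) > 0).astype(int)
label_accuracy = float(np.mean(y_noisy == y_true))
model_accuracy = float(np.mean(pred == y_true))
print(round(label_accuracy, 3), round(model_accuracy, 3))
```

Because the label noise is roughly symmetric, the learned decision boundary still tracks the true concept, so accuracy against the truth rises well above the ~80% accuracy of the training labels, mirroring the paper's finding.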

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 190
28759 Purity Monitor Studies in Medium Liquid Argon TPC

Authors: I. Badhrees

Abstract:

This paper describes results from two studies in particle physics. The first concerns the measurement of the cross section of the decay of the Z particle into two electrons; the second concerns the measurement of the cross section of the multi-photon absorption process using a laser beam in a liquid argon time projection chamber. The first part of the paper presents results based on the analysis of a data sample containing 8120 ee candidates, reconstructing the mass of the Z particle for each event, where each event has an ee pair with pT(e) > 20 GeV and |η(e)| < 2.5. Monte Carlo templates of the reconstructed Z particle were produced as a function of the Z mass scale, and the distribution of the reconstructed Z mass in the data was compared to these templates; the total cross section was calculated to be 1432 pb. The second part concerns the liquid argon time projection chamber (LAr TPC) and the results of the interaction of a UV laser (Nd:YAG, λ = 266 nm) with LAr, through the study of the multi-photon ionization process, as part of the R&D at the University of Bern. The main result of this study was the cross section of the multi-photon ionization process of LAr, σe = (1.24 ± 0.10stat ± 0.30sys) × 10⁻⁵⁶ cm⁴.

Keywords: ATLAS, CERN, KACST, LArTPC, particle physics

Procedia PDF Downloads 342
28758 Potential Field Functions for Motion Planning and Posture of the Standard 3-Trailer System

Authors: K. Raghuwaiya, S. Singh, B. Sharma, J. Vanualailai

Abstract:

This paper presents a set of artificial potential field functions that improves, in general, the motion planning and posture control of 3-trailer systems in an a priori known environment, with theoretically guaranteed point and posture stability, convergence and collision-avoidance properties. We design and inject two new concepts, ghost walls and the Distance Optimization Technique (DOT), to strengthen point and posture stability in the sense of Lyapunov for our dynamical model. This new combination of techniques emerges as a convenient mechanism for obtaining feasible orientations at the target positions with an overall reduction in the complexity of the navigation laws. The effectiveness of the proposed control laws was demonstrated via simulations of two traffic scenarios.
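A minimal single-point sketch of navigation by an artificial potential field, an attractive well at the target plus a repulsive term near one obstacle, is given below. The gains, geometry and integrator are assumptions, and the paper's ghost walls, DOT and full 3-trailer kinematics are beyond this illustration:

```python
import math

# One point robot, an attractive well at the target, and a repulsive term
# active within r_safe of a single obstacle.
target = (10.0, 10.0)
obstacle, r_safe, eta = (5.0, 4.0), 1.5, 5.0

def grad(p):
    """Gradient of U(p) = 0.5*|p-target|^2 + 0.5*eta*(1/d - 1/r_safe)^2."""
    gx, gy = p[0] - target[0], p[1] - target[1]
    dx, dy = p[0] - obstacle[0], p[1] - obstacle[1]
    d = math.hypot(dx, dy)
    if d < r_safe:                      # repulsion only inside r_safe
        k = eta * (1.0 / d - 1.0 / r_safe) / d**3
        gx -= k * dx
        gy -= k * dy
    return gx, gy

p, path = (0.0, 0.0), [(0.0, 0.0)]
for _ in range(1000):
    gx, gy = grad(p)
    p = (p[0] - 0.02 * gx, p[1] - 0.02 * gy)   # gradient descent step
    path.append(p)

print(round(p[0], 2), round(p[1], 2))  # settles at the target
```

The trajectory bends around the obstacle and converges to the target; concepts like the paper's ghost walls are additional potential terms layered onto this same descent scheme.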

Keywords: artificial potential fields, 3-trailer systems, motion planning, posture, parking, collision-free trajectories

Procedia PDF Downloads 373
28757 The High Precision of Magnetic Detection with Microwave Modulation in Solid Spin Assembly of NV Centres in Diamond

Authors: Zongmin Ma, Shaowen Zhang, Yueping Fu, Jun Tang, Yunbo Shi, Jun Liu

Abstract:

Solid-state quantum sensors are attracting wide interest because of their high sensitivity at room temperature. In particular, the spin properties of nitrogen-vacancy (NV) color centres in diamond make them outstanding sensors of magnetic fields, electric fields and temperature under ambient conditions. Much of the work on NV magnetic sensing has aimed at combining the smallest possible sensor volume with high sensitivity in NV-ensemble magnetometry, using micro-cavities, the light-trapping diamond waveguide (LTDW), and nano-cantilevers fabricated with MEMS (Micro-Electro-Mechanical System) techniques. Recently, frequency-modulated microwaves with continuous optical excitation have been proposed to achieve a high sensitivity of 6 μT/√Hz using individual NV centres at the nanoscale. In this research, we built an experiment to measure static magnetic fields by the frequency-modulated microwave method under continuous illumination with green pump light at 532 nm, using a bulk diamond sample with a high density of NV centers (1 ppm). The fluorescence from the confocal microscope was collected by an objective (NA = 0.7) and detected by a high-sensitivity photodetector (PD). We designed a microstrip antenna for uniform and efficient excitation, coupled well to the spin ensembles at 2.87 GHz, the zero-field splitting of the NV centers. The PD output was sent to a lock-in amplifier (LIA), while the modulation reference, generated from the microwave source through an IQ mixer, entered the lock-in amplifier to realize open-loop detection of the NV atomic magnetometer. We can plot ODMR spectra under continuous-wave (CW) microwaves. Owing to the high sensitivity of the lock-in amplifier, the minimum detectable voltage change can be measured, and the minimum detectable frequency shift follows from this minimum voltage and the slope of the lock-in signal.
The magnetic field sensitivity can be derived from η = δB√T, which corresponds to a 10 nT minimum detectable shift in the magnetic field. Further, frequency analysis of the noise in the system indicates a sensitivity below 10 nT/√Hz at 10 Hz.
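As a back-of-the-envelope sketch of this conversion (the numbers below are illustrative, not the paper's measured values), the minimum detectable field follows from the lock-in voltage noise floor, the slope of the lock-in signal versus microwave frequency, and the NV gyromagnetic ratio of roughly 28 GHz/T:

```python
import math

GAMMA_NV = 28.0e9  # NV gyromagnetic ratio, ~28 GHz per tesla

def min_detectable_field(v_noise, lockin_slope):
    """Smallest resolvable field shift, in tesla.

    v_noise      -- minimum detectable voltage change at the lock-in (V)
    lockin_slope -- slope of the lock-in signal vs. microwave frequency (V/Hz)
    """
    delta_f = v_noise / lockin_slope  # minimum detectable frequency shift (Hz)
    return delta_f / GAMMA_NV         # convert the frequency shift to field (T)

def sensitivity(delta_b, t_meas):
    """Field sensitivity eta = delta_B * sqrt(T), in T/sqrt(Hz)."""
    return delta_b * math.sqrt(t_meas)

# Illustrative numbers (not the paper's): a 1 uV noise floor and a
# 10 uV/Hz lock-in slope give a 0.1 Hz frequency resolution.
b_min = min_detectable_field(1e-6, 1e-5)
eta = sensitivity(b_min, 1.0)
```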

Keywords: nitrogen-vacancy (NV) centers, frequency-modulated microwaves, magnetic field sensitivity, noise density

Procedia PDF Downloads 432
28756 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads

Authors: Dražen Cvitanić, Biljana Maljković

Abstract:

This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed from continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not a reliable indicator of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speeds in the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All developed models have a higher coefficient of determination than models developed on spot speed data. Thus, it can be concluded that the method of measurement has a more significant impact on the quality of an operating speed model than the location of measurement.
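The model comparison can be sketched as follows, with synthetic speeds standing in for the field data and tangent length as a hypothetical predictor; the point is only that a noisier speed measure yields a lower coefficient of determination for the fitted model:

```python
import numpy as np

def r_squared(y, y_pred):
    """Coefficient of determination R^2 of a fitted model."""
    ss_res = np.sum((y - y_pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical data set: tangent length (m) as the predictor, with the
# continuous-data speed measure less noisy than the spot speed measure.
rng = np.random.default_rng(0)
tangent_len = rng.uniform(100.0, 800.0, 40)
v_continuous = 60.0 + 0.04 * tangent_len + rng.normal(0.0, 3.0, 40)
v_spot = 60.0 + 0.04 * tangent_len + rng.normal(0.0, 8.0, 40)

r2 = {}
for name, v in [("continuous", v_continuous), ("spot", v_spot)]:
    slope, intercept = np.polyfit(tangent_len, v, 1)  # simple linear model
    r2[name] = r_squared(v, slope * tangent_len + intercept)
```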

Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency

Procedia PDF Downloads 450
28755 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications, and they are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary or optimization algorithms, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
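A minimal sketch of the idea, with a plain k-means routine standing in for the ART2 network (the actual ART2 vigilance and resonance mechanism is more involved): cluster the complete rows, then impute each missing value from the centroid of the nearest cluster, measured on the observed attributes only.

```python
import numpy as np

def cluster_impute(X, k=2, n_iter=20, seed=0):
    """Impute missing values (NaN) from the centroid of the nearest cluster.

    Simplified stand-in for the ART2 clustering step: k-means on complete
    rows, then each incomplete row is assigned to the closest centroid
    using only its observed attributes.
    """
    X = X.astype(float).copy()
    complete = X[~np.isnan(X).any(axis=1)]
    rng = np.random.default_rng(seed)
    centroids = complete[rng.choice(len(complete), k, replace=False)]
    for _ in range(n_iter):  # plain k-means on the complete rows
        labels = np.argmin(((complete[:, None] - centroids) ** 2).sum(axis=2), axis=1)
        for j in range(k):
            if (labels == j).any():
                centroids[j] = complete[labels == j].mean(axis=0)
    for row in X:
        miss = np.isnan(row)
        if miss.any():
            # distance to each centroid on the observed attributes only
            d = ((row[~miss] - centroids[:, ~miss]) ** 2).sum(axis=1)
            row[miss] = centroids[np.argmin(d)][miss]
    return X

data = np.array([[1.0, 2.0], [1.2, 2.1], [8.0, 9.0], [8.2, np.nan]])
filled = cluster_impute(data, k=2)
```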

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 271
28754 Effect of One-Period of SEAS Exercises on Some Spinal Biomechanical and Postural Parameters in the Students with Idiopathic Scoliosis

Authors: Zandi Ahmad, Sokhanguei Yahya, Saboonchi Reza

Abstract:

Objective: The modern lifestyle of the twenty-first century and the resulting lack of movement in the spinal structure have made patients, physicians, and insurance companies in developed and developing countries more worried than ever about abnormalities of the spinal column, a major healthcare problem. The high prevalence of spinal column disorders in all age groups, from children to adults, and in all professions has led researchers to seek corrective options for those at risk. One such corrective method is the use of SEAS exercises. Materials and Methods: This study investigates the effect of one period of SEAS exercises on some spinal biomechanical and postural parameters in students with idiopathic scoliosis. Given the nature of the study, its objectives, and the data collection methods, the research is a quasi-experimental study. The research population comprised students with idiopathic scoliosis. A total of 30 students were selected by convenience sampling and divided into two groups: control and SEAS exercises. A scoliometer was used for data collection. Descriptive statistics were used to categorize the findings, the Kolmogorov-Smirnov test was used to confirm that the data were normally distributed, and a t-test was used to assess effectiveness. Hypothesis testing was done using SPSS 21. Conclusion: The results show that SEAS exercises have a significant effect on Adam's test. Therefore, SEAS exercises can be used to treat idiopathic scoliosis among students. Further studies with larger samples and longer treatment periods, as well as more follow-up investigations, appear essential to confirm these effects.
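The statistical procedure can be sketched in Python with SciPy in place of SPSS; the scoliometer readings below are simulated for illustration and are not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated scoliometer (Adam's test) readings in degrees of trunk rotation,
# pre- and post-intervention for the SEAS group (n = 15, as in the study).
pre = rng.normal(9.0, 2.0, 15)
post = pre - rng.normal(2.0, 0.8, 15)  # assumed improvement after the exercises

# 1. Kolmogorov-Smirnov check that the pre/post differences are ~normal
diff = pre - post
ks_stat, ks_p = stats.kstest((diff - diff.mean()) / diff.std(ddof=1), "norm")

# 2. t-test for effectiveness (paired here, since each student is measured twice)
t_stat, p_value = stats.ttest_rel(pre, post)
```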

Keywords: SEAS exercises, idiopathic scoliosis, Adam’s test, exercise

Procedia PDF Downloads 279
28753 Flux-Gate vs. Anisotropic Magneto Resistance Magnetic Sensors Characteristics in Closed-Loop Operation

Authors: Neoclis Hadjigeorgiou, Spyridon Angelopoulos, Evangelos V. Hristoforou, Paul P. Sotiriadis

Abstract:

The increasing demand for accurate and reliable magnetic measurements over the past decades has paved the way for the development of different types of magnetic sensing systems as well as more advanced measurement techniques. Anisotropic Magneto-Resistance (AMR) sensors have emerged as a promising solution for applications requiring high resolution, providing an ideal balance between performance and cost. However, certain issues of AMR sensors, such as non-linear response and measurement noise, are rarely discussed in the relevant literature. In this work, an analog closed-loop compensation system is proposed, developed, and tested as a means to eliminate the non-linearity of the AMR response, reduce the 1/f noise, and enhance the sensitivity of the magnetic sensor. Additional performance aspects, such as cross-axis and hysteresis effects, are also examined. The system was analyzed using an analytical model and a P-Spice model, considering both the sensor itself and the accompanying electronic circuitry. In addition, a commercial closed-loop Flux-Gate sensor (calibrated and certified) was used for comparison. Three experimental setups were constructed for this work, used for DC magnetic field, AC magnetic field, and noise density measurements, respectively. The DC magnetic field measurements were conducted in a laboratory environment employing a cubic Helmholtz coil setup in order to calibrate and characterize the system under consideration. A high-accuracy DC power supply provided the operating current to the Helmholtz coils, and the results were recorded by a multichannel voltmeter. The AC magnetic field measurements were conducted with the same cubic Helmholtz coil setup in order to examine the effective bandwidth of both the proposed system and the Flux-Gate sensor.
A voltage-controlled current source driven by a function generator was used for the Helmholtz coil excitation, and the response was observed on an oscilloscope. The third experimental apparatus incorporated an AC magnetic shielding enclosure composed of several layers of electrical steel that had been demagnetized prior to the experimental process. Each sensor was placed in it alone, and its response was captured by the oscilloscope. The preliminary experimental results indicate that the closed-loop AMR response presented a maximum deviation of 0.36% with respect to the ideal linear response, while the corresponding values for the open-loop AMR system and the Flux-Gate sensor were 2% and 0.01%, respectively. Moreover, the noise density of the proposed closed-loop AMR sensor system remained almost as low as that of the AMR sensor itself, yet considerably higher than that of the Flux-Gate sensor. All relevant numerical data are presented in the paper.
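A linearity figure of this kind can be computed as the maximum deviation of the measured response from its best-fit straight line, expressed as a percentage of full scale. The sketch below uses a hypothetical field sweep with a small quadratic error term, not the paper's measurements:

```python
import numpy as np

def max_linearity_deviation(b_applied, v_out):
    """Maximum deviation from the best-fit line, in percent of full scale."""
    slope, intercept = np.polyfit(b_applied, v_out, 1)
    residual = v_out - (slope * b_applied + intercept)
    full_scale = v_out.max() - v_out.min()
    return 100.0 * np.abs(residual).max() / full_scale

# Hypothetical sweep: sensor output with a small quadratic non-linearity
b = np.linspace(-100e-6, 100e-6, 21)  # applied field, tesla
v = 5000.0 * b + 2e6 * b ** 2         # output voltage with slight curvature
dev = max_linearity_deviation(b, v)   # percent deviation from linearity
```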

Keywords: AMR sensor, chopper, closed loop, electronic noise, magnetic noise, memory effects, flux-gate sensor, linearity improvement, sensitivity improvement

Procedia PDF Downloads 417
28752 Empirical Evidence to Beliefs and Perceptions About Mental Health Disorder and Substance Abuse: The Role of a Social Worker

Authors: Helena Baffoe

Abstract:

Context: In the United States, there have been significant advancements in programs aimed at improving the lives of individuals with mental health disorders and substance abuse problems. However, public attitudes and beliefs regarding these issues have not improved correspondingly. This study explores the perceptions and beliefs surrounding mental health disorders and substance abuse in the context of data analytics in the field of social work. Research Aim: The aim of this research is to provide empirical evidence on the beliefs and perceptions regarding mental health disorders and substance abuse. Specifically, the study seeks to answer the question of whether being diagnosed with a mental disorder implies a diagnosis of substance abuse. Additionally, the research aims to analyze the specific roles that social workers can play in addressing individuals with mental disorders. Methodology: This research adopts a data-driven methodology, using comprehensive data from the Substance Abuse and Mental Health Services Administration (SAMHSA). A causal connection between mental disorders and substance abuse is often assumed, a relationship that the current literature rarely examines critically. To address this gap, we applied logistic regression with an instrumental variable (IV) approach, mitigating potential endogeneity in the analysis and ensuring robust, unbiased results. This methodology allows for a rigorous examination of the relationship between mental disorders and substance abuse. Empirical Findings: The analysis of the data reveals that depressive, anxiety, and trauma/stressor mental disorders are the most common in the United States. However, the study does not find statistically significant evidence to support the notion that being diagnosed with these mental disorders necessarily implies a diagnosis of substance abuse.
This suggests that there is a misconception among the public regarding the relationship between mental health disorders and substance abuse. Theoretical Importance: The research contributes to the existing body of literature by providing empirical evidence to challenge prevailing beliefs and perceptions regarding mental health disorders and substance abuse. By using a novel methodological approach and analyzing new US data, the study sheds light on the cultural and social factors that influence these attitudes.
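The abstract does not give the exact IV specification; one common way to combine a logistic outcome model with an instrument is two-stage residual inclusion (2SRI), sketched here on simulated data with hypothetical variable names (z as the instrument, x as the endogenous mental-disorder score, y as the substance-abuse outcome):

```python
import numpy as np

def logit_fit(X, y, n_iter=25):
    """Logistic regression via Newton-Raphson; returns the coefficient vector."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        W = p * (1.0 - p)
        beta = beta + np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

# Simulated data: u is an unobserved confounder that makes x endogenous.
rng = np.random.default_rng(2)
n = 2000
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * z + u + rng.normal(size=n)
y = (1.5 * x + u + rng.logistic(size=n) > 0).astype(float)

# Stage 1: regress the endogenous variable on the instrument, keep residuals
Z = np.column_stack([np.ones(n), z])
gamma, *_ = np.linalg.lstsq(Z, x, rcond=None)
resid = x - Z @ gamma

# Stage 2: logistic regression including the first-stage residuals (2SRI)
X2 = np.column_stack([np.ones(n), x, resid])
beta = logit_fit(X2, y)  # beta[1] is the endogeneity-corrected effect of x
```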

Keywords: mental health disorder, substance abuse, empirical evidence, logistic regression with IV

Procedia PDF Downloads 59
28751 Positive Affect, Negative Affect, Organizational and Motivational Factor on the Acceptance of Big Data Technologies

Authors: Sook Ching Yee, Angela Siew Hoong Lee

Abstract:

Big data technologies have become a trend for exploiting business opportunities and providing valuable business insights through the analysis of big data. However, many organizations have yet to adopt big data technologies, especially small and medium enterprises (SMEs). This study uses the technology acceptance model (TAM) to examine several constructs in the TAM together with four additional constructs: positive affect, negative affect, organizational factor, and motivational factor. The conceptual model proposed in the study will be tested on the relationship and influence of positive affect, negative affect, organizational factor, and motivational factor on the intention to use big data technologies. Empirical research is used in this study by conducting a survey to collect data.

Keywords: big data technologies, motivational factor, negative affect, organizational factor, positive affect, technology acceptance model (TAM)

Procedia PDF Downloads 353
28750 Linguistic Features for Sentence Difficulty Prediction in Aspect-Based Sentiment Analysis

Authors: Adrian-Gabriel Chifu, Sebastien Fournier

Abstract:

One of the challenges of natural language understanding is dealing with the subjectivity of sentences, which may express opinions and emotions that add layers of complexity and nuance. Sentiment analysis is a field that aims to extract and analyze these subjective elements from text, and it can be applied at different levels of granularity, such as the document, paragraph, sentence, or aspect level. Aspect-based sentiment analysis is a well-studied topic with many available data sets and models. However, there is no clear definition of what makes a sentence difficult for aspect-based sentiment analysis. In this paper, we explore this question by conducting an experiment with three data sets: "Laptops", "Restaurants", and "MTSC" (Multi-Target-dependent Sentiment Classification), as well as a merged version of the three. We study the impact of domain diversity and syntactic diversity on difficulty. We use a combination of classifiers to identify the most difficult sentences and analyze their characteristics. We employ two ways of defining sentence difficulty. The first is binary and labels a sentence as difficult if the classifiers fail to correctly predict its sentiment polarity. The second is a six-level scale based on how many of the top five best-performing classifiers can correctly predict the sentiment polarity. We also define nine linguistic features that, combined, aim at estimating difficulty at the sentence level.
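The two difficulty definitions are easy to state in code. The majority-vote ensemble used for the binary label below is an assumption, since the abstract does not specify how the classifiers' predictions are combined:

```python
from typing import Sequence

def difficulty_level(predictions: Sequence[str], gold: str) -> int:
    """Six-level difficulty: number of the top-5 classifiers that got it wrong.

    0 -> all five correct (easiest) ... 5 -> none correct (hardest).
    """
    assert len(predictions) == 5
    return sum(1 for p in predictions if p != gold)

def is_difficult(predictions: Sequence[str], gold: str) -> bool:
    """Binary definition: difficult if the combined prediction fails.

    Here the combination is a simple majority vote (an assumption).
    """
    majority = max(set(predictions), key=predictions.count)
    return majority != gold

preds = ["positive", "negative", "positive", "neutral", "positive"]
level = difficulty_level(preds, "positive")  # two of five classifiers wrong
```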

Keywords: sentiment analysis, difficulty, classification, machine learning

Procedia PDF Downloads 74
28749 Therapeutic Application of Light and Electromagnetic Fields to Reduce Hyper-Inflammation Triggered by COVID-19

Authors: Blanche Aguida, Marootpong Pooam, Nathalie Jourdan, Margaret Ahmad

Abstract:

COVID-19-related morbidity is associated with exaggerated inflammation and cytokine production in the lungs, leading to acute respiratory failure. The cellular mechanisms underlying these so-called ‘cytokine storms’ are regulated through the Toll-like receptor 4 (TLR4) signaling pathway and by reactive oxygen species (ROS). Both light (photobiomodulation) and magnetic fields (e.g., pulsed electromagnetic field) stimulation are non-invasive therapies known to confer anti-inflammatory effects and regulate ROS signaling pathways. Here we show that daily exposure to two 10-minute intervals of moderate-intensity infra-red light significantly lowered the inflammatory response induced via the TLR4 receptor signaling pathway in human cell cultures. Anti-inflammatory effects were likewise achieved by electromagnetic field exposure of cells to daily 10-minute intervals of either pulsed electromagnetic fields (PEMF) or to low-level static magnetic fields. Because current illumination and electromagnetic field therapies have no known side effects and are already approved for some medical uses, we have here developed protocols for verification in clinical trials of COVID-19 infection. These treatments are affordable, simple to implement, and may help to resolve the acute respiratory distress of COVID-19 patients both in the home and in the hospital.

Keywords: COVID-19, electromagnetic field therapy, inflammation, photobiomodulation therapy

Procedia PDF Downloads 137
28748 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages running on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing that increases the number of map tasks as the data size grows. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes of a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and that its computing speed increased with the number of data nodes.
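The property that lets regression parallelize across map tasks is that the normal-equation statistics XᵀX and Xᵀy are sums over row chunks, so each task can compute its own partial sums and a reduce step combines them. A language-agnostic sketch in Python (Rhipe itself runs R code on Hadoop):

```python
import numpy as np

def partial_stats(X_chunk, y_chunk):
    """Map step: each chunk contributes its X'X and X'y partial sums."""
    return X_chunk.T @ X_chunk, X_chunk.T @ y_chunk

def distributed_lstsq(chunks):
    """Reduce step: sum the per-chunk statistics, solve the normal equations."""
    stats = [partial_stats(X, y) for X, y in chunks]
    XtX = sum(s[0] for s in stats)
    Xty = sum(s[1] for s in stats)
    return np.linalg.solve(XtX, Xty)

# Simulated regression data split into three "map tasks" of 200 rows each
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(600), rng.normal(size=(600, 2))])
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(0.0, 0.1, 600)
chunks = [(X[i:i + 200], y[i:i + 200]) for i in range(0, 600, 200)]
beta = distributed_lstsq(chunks)  # identical to fitting on the full data
```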

Keywords: big data, Hadoop, Parallel regression analysis, R, Rhipe

Procedia PDF Downloads 493
28747 Patent on Brain: Brain Waves Stimulation

Authors: Jalil Qoulizadeh, Hasan Sadeghi

Abstract:

Brain waves are electrical wave patterns produced in the human brain. Knowing these waves and activating them can have a positive effect on brain function and ultimately help create an ideal life. The brain is able to produce waves from 0.1 Hz to above 65 Hz, and the Beta One device is said to produce exactly this range, matching the waves produced by the brain. The device's operating principle is magnetic stimulation of the brain. The technology used in the design and production of this device strengthens and improves the frequencies of brain waves with a pre-defined algorithm, according to the type of function requested, so that the user can access the expected functions and perform better in daily activities. To evaluate the effect of the field created by the device on neurons and their stimulation, electroencephalography was conducted before and after stimulation, and the two baselines were compared by quantitative electroencephalography (qEEG) using a paired t-test in 39 subjects. The comparison confirms a significant effect of the field on the electrical activity recorded after 30 minutes of stimulation in all subjects. By inducing patterns that accord with the natural harmony of brain waves, the Beta One device aims to bring brain activity first to a normal state and then to an enhanced one. A further goal is the production of inexpensive neuroscience equipment (compared with existing rTMS equipment) for magnetic brain stimulation in clinics, homes, factories and companies, and professional sports clubs.

Keywords: stimulation, brain, waves, betaOne

Procedia PDF Downloads 74
28746 Security in Resource Constraints Network Light Weight Encryption for Z-MAC

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

A wireless sensor network is formed by a collection of nodes that systematically transmit data to their base stations. Bearing in mind the limited processing power of these nodes and the need for data consistency, this transmitted data can easily be compromised, and securing data transfer in real time is an ongoing concern. This paper presents a mechanism to securely transmit data over a chain of sensor nodes, without compromising the throughput of the network, by utilizing the battery resources available in the sensor node. Our methodology takes advantage of the Z-MAC protocol for its efficiency and provides a unique key through a sharing mechanism that uses neighbor nodes' MAC addresses. We present a lightweight data integrity layer embedded in the Z-MAC protocol and show that our protocol performs better than Z-MAC when different attack scenarios are introduced.
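A minimal sketch of the two ingredients, neighbor-based key derivation from MAC addresses and a lightweight integrity tag, using a generic hash construction (the paper's exact scheme is not specified in the abstract, so the functions and the shared network secret below are illustrative assumptions):

```python
import hashlib
import hmac

def derive_pairwise_key(mac_a: str, mac_b: str, network_secret: bytes) -> bytes:
    """Derive a per-link key from the two neighbors' MAC addresses.

    The pair is sorted so both neighbors derive the same key; the shared
    network secret prevents an outsider from recomputing it.
    """
    pair = "|".join(sorted([mac_a.lower(), mac_b.lower()])).encode()
    return hashlib.sha256(network_secret + pair).digest()

def seal(payload: bytes, key: bytes) -> bytes:
    """Append a truncated HMAC tag as a lightweight integrity layer."""
    return payload + hmac.new(key, payload, hashlib.sha256).digest()[:8]

def verify(frame: bytes, key: bytes) -> bool:
    """Check the integrity tag of a received frame."""
    payload, tag = frame[:-8], frame[-8:]
    return hmac.compare_digest(tag, hmac.new(key, payload, hashlib.sha256).digest()[:8])

# Both neighbors derive the same link key regardless of argument order
k1 = derive_pairwise_key("00:11:22:33:44:55", "66:77:88:99:aa:bb", b"net-secret")
k2 = derive_pairwise_key("66:77:88:99:aa:bb", "00:11:22:33:44:55", b"net-secret")
frame = seal(b"sensor-reading:23.5C", k1)
```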

Keywords: hybrid MAC protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 136