Search results for: missing data estimation
25174 Continuous Adaptive Robust Control for Non-Linear Uncertain Systems
Authors: Dong Sang Yoo
Abstract:
We consider nonlinear uncertain systems for which a priori information about the uncertainties is not available. For such systems, we assume that the upper bound of the uncertainties is represented as a Fredholm integral equation of the first kind, propose an adaptation law capable of estimating that upper bound, and design a continuous robust control which renders nonlinear uncertain systems ultimately bounded.
Keywords: adaptive control, estimation, Fredholm integral, uncertain system
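To make the assumed representation concrete, a Fredholm integral equation of the first kind has the generic form below (a generic statement only; the specific kernel and bounding function are the paper's own):

```latex
% Fredholm integral equation of the first kind: the known function g
% is expressed through the unknown density f via a known kernel K.
\[
  g(t) = \int_{a}^{b} K(t, s)\, f(s)\, \mathrm{d}s , \qquad a \le t \le b .
\]
% In the paper's setting, g plays the role of the uncertainty upper
% bound, which the adaptation law estimates online.
```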
Procedia PDF Downloads 485
25173 A Mutually Exclusive Task Generation Method Based on Data Augmentation
Authors: Haojie Wang, Xun Li, Rui Yin
Abstract:
In order to solve memorization overfitting in the model-agnostic meta-learning (MAML) algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutex task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution in the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to an exponential growth of computation, this paper also proposes a key data extraction method that extracts only part of the data to generate the mutex task. The experiments show that the method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
Keywords: mutex task generation, data augmentation, meta-learning, text classification
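One simple way to realize "one feature, multiple labels" is to permute the label space of a sampled task, so the same inputs no longer match their original labels. The sketch below is a hypothetical illustration of that idea, not the authors' code:

```python
import random

def make_mutex_task(task, num_classes, seed=None):
    """Build a mutually exclusive variant of a (features, labels) task by
    applying a random label permutation, so one feature pattern maps to a
    different label than in the original dataset."""
    rng = random.Random(seed)
    permutation = list(range(num_classes))
    rng.shuffle(permutation)
    features, labels = task
    mutex_labels = [permutation[y] for y in labels]
    return features, mutex_labels

# Example: (["x1", "x2", "x3"], [0, 1, 0]) may become (["x1", "x2", "x3"], [2, 0, 2]).
print(make_mutex_task((["x1", "x2", "x3"], [0, 1, 0]), num_classes=3, seed=1))
```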
Procedia PDF Downloads 144
25172 ESL Material Evaluation: The Missing Link in Nigerian Classrooms
Authors: Abdulkabir Abdullahi
Abstract:
The paper is a pre-use evaluation of grammar activities in three primary English course books (two international primary English course books and one popular Nigerian primary English course book). The titles are Cambridge Global English, Collins International Primary English, and Nigeria Primary English - Primary English. Grammar points and grammar activities in the three course books were identified, grouped, and evaluated. The grammar activity most common in the course books, the simple past tense, was chosen for evaluation, and the units which present simple past tense activities were selected to evaluate the extent to which the treatment of the simple past tense in each of the course books helps young Nigerian learners of English as a second language, aged 8-11, level A1 to A2, who lack basic grammatical knowledge, to learn grammar and communicate effectively. A bespoke checklist, devised by modifying existing checklists for the purpose of the evaluation, was used to assess the extent to which the grammar activities promote the communicative effectiveness of Nigerian learners of English as a second language. The results of the evaluation and the analysis of the data reveal that the treatment of grammar, especially of the simple past tense, is evidently insufficient. While the treatment of grammar (the simple past tense) in Cambridge Global English and Collins International Primary English is underpinned by state-of-the-art learning theories, language learning theories, second language learning principles, second language curriculum-syllabus design principles, and grammar learning and teaching theories, the grammar load is low, and the grammar tasks do not sufficiently promote creative grammar practice. Nigeria Primary English - Primary English, on the other hand, treats grammar (the simple past tense) in the old-fashioned direct way. The book does not favour the communicative language teaching approach, offers no opportunity for learners to notice and discover grammar rules for themselves, and lacks the potency to promote creative grammar practice. The research and its findings, therefore, underscore the need to improve grammar content and increase the grammar activity types that engage learners effectively and promote sufficient creative grammar practice in EFL and ESL material design and development.
Keywords: evaluation, activity, second language, activity-types, creative grammar practice
Procedia PDF Downloads 84
25171 Revolutionizing Traditional Farming Using Big Data/Cloud Computing: A Review on Vertical Farming
Authors: Milind Chaudhari, Suhail Balasinor
Abstract:
Due to massive deforestation and an ever-increasing population, the organic content of the soil is depleting at a much faster rate. Because of this, there is a real risk that total world food production will drop by 40% in the next two decades. Vertical farming can help aid food production by leveraging big data and cloud computing to ensure plants are grown naturally, providing the optimum nutrients and sunlight by analyzing millions of data points. This paper outlines the most important parameters in vertical farming and how a combination of big data and AI helps in calculating and analyzing these millions of data points. Finally, the paper outlines how different organizations are controlling the indoor environment by leveraging big data to enhance food quantity and quality.
Keywords: big data, IoT, vertical farming, indoor farming
Procedia PDF Downloads 176
25170 Developing Critical-Process Skills Integrated Assessment Instrument as Alternative Assessment on Electrolyte Solution Matter in Senior High School
Authors: Sri Rejeki Dwi Astuti, Suyanta
Abstract:
Demands on assessment in the learning process have changed with policy changes. Nowadays, assessment emphasizes not only knowledge but also skills and attitudes. In reality, however, there are many obstacles to measuring them. This paper aims to describe how to develop an integrated assessment instrument, as an alternative assessment, to measure critical thinking skills and science process skills on electrolyte solution matter, and to describe the instrument's characteristics, such as logical validity and construct validity. The instrument development followed McIntire's test development model. Data on the development process were acquired from the test development steps and analyzed qualitatively. The initial product was reviewed by three peer reviewers and six expert judges (two subject-matter experts, two evaluation experts, and two chemistry teachers) to establish logical validity. Logical validity was analyzed using Aiken's formula, and construct validity was estimated by exploratory factor analysis. Results showed that the integrated assessment instrument has an Aiken's value of 0.90 and that all items in the integrated assessment are valid according to construct validity.
Keywords: construct validity, critical thinking skills, integrated assessment instrument, logic validity, science process skills
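For reference, Aiken's formula, as conventionally stated for item content validity, is:

```latex
% Aiken's content-validity coefficient V for one item:
\[
  V = \frac{\sum_{i=1}^{n} s_i}{n\,(c - 1)}, \qquad s_i = r_i - l ,
\]
% where r_i is rater i's rating, l the lowest possible rating,
% n the number of raters, and c the number of rating categories.
% V ranges from 0 to 1; the instrument here reported V = 0.90.
```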
Procedia PDF Downloads 264
25169 An In-Depth Study on the Experience of Novice Teachers
Authors: Tsafi Timor
Abstract:
The research focuses on the unique journey that novice teachers experience in their first year of teaching, among graduates of re-training programs into teaching. The study explores experiences of success and failure and the factors that underpin positive experiences, as well as the journey (process) of this year, with reference to a comparison between novice teachers and new immigrants. Content analysis was conducted on texts written by the teachers detailing their first year of teaching. The findings indicate that experiences of success are characterized by personal satisfaction, a constant need for feedback, high motivation in challenging situations, and emotions. Failure experiences are characterized by frustration, helplessness, a sense of humiliation, a feeling of rejection, and lack of efficacy. Factors that promote and inhibit positive experiences relate to the personal, personality, professional, and organizational levels. Most teachers reported feeling like new immigrants and demonstrated different models of the process of the first year of teaching. Further research is recommended on the factors that promote and inhibit positive experiences, and on 'the missing link' of the relationship between teacher education programs and the practices in schools.
Keywords: first-year teaching, novice teachers, school practice, teacher education programs
Procedia PDF Downloads 292
25168 Flood Mapping Using Height above the Nearest Drainage Model: A Case Study in Fredericton, NB, Canada
Authors: Morteza Esfandiari, Shabnam Jabari, Heather MacGrath, David Coleman
Abstract:
Flooding is a severe issue in many places in the world, including the city of Fredericton, New Brunswick, Canada. The downtown area of Fredericton is close to the Saint John River, which is susceptible to flooding around May every year. Recently, the frequency of flooding seems to have increased, especially after the downtown area and surrounding urban/agricultural lands were flooded in two consecutive years, 2018 and 2019. In order to have a clear picture of flood extent and of damage to affected areas, it is necessary to use either flood inundation modelling or satellite data. Because of the contingent availability and weather dependency of optical satellites, and the limited existing data and high cost of hydrodynamic models, it is not always feasible to rely on these sources of data to generate quality flood maps after or during a catastrophe. Height Above the Nearest Drainage (HAND), a state-of-the-art topo-hydrological index, normalizes the height of a basin based on the relative elevation along the stream network and specifies the gravitational, or relative, drainage potential of an area. HAND is the relative height difference between the stream network and each cell on a Digital Terrain Model (DTM). The stream layer is produced through a multi-step, time-consuming process which does not always result in an optimal representation of the river centerline, depending on the topographic complexity of the region. HAND has been used in numerous case studies, with quite acceptable and sometimes unexpected results, because of natural and human-made features on the surface of the earth. Some of these features might disturb the generated model, and consequently the model might not predict the flow simulation accurately. We propose to include a previously existing stream layer generated by the province of New Brunswick and to benefit from culvert maps to improve the water flow simulation and, accordingly, the accuracy of the HAND model. By considering these parameters in our processing, we were able to increase the accuracy of the model from nearly 74% to almost 92%. The improved model can be used for generating highly accurate flood maps, which are necessary for future urban planning and flood damage estimation, without any need for satellite imagery or hydrodynamic computations.
Keywords: HAND, DTM, rapid floodplain, simplified conceptual models
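A minimal sketch of the core HAND idea follows. For simplicity it assumes drainage is approximated by the nearest stream cell in the plane; production HAND implementations instead trace flow directions along the DTM, so this is an illustration only:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def simple_hand(dtm, stream_mask):
    """Approximate HAND: subtract from each DTM cell the elevation of its
    nearest stream cell. stream_mask is True on stream cells. Real HAND
    follows flow directions; nearest-cell lookup is a simplification."""
    # indices of the nearest stream cell for every grid cell
    _, (rows, cols) = distance_transform_edt(~stream_mask, return_indices=True)
    nearest_stream_elev = dtm[rows, cols]
    return dtm - nearest_stream_elev

dtm = np.array([[5., 6., 7.], [4., 5., 6.], [3., 4., 5.]])
streams = np.array([[False, False, False],
                    [False, False, False],
                    [True, True, True]])
print(simple_hand(dtm, streams))  # relative heights above nearest drainage
```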
Procedia PDF Downloads 153
25167 Does Trade and Institutional Quality Play Any Significant Role on Environmental Quality in Sub-Saharan Africa?
Authors: Luqman Afolabi
Abstract:
This paper measures the impacts of trade and institutions on environmental quality in Sub-Saharan Africa (SSA). To examine the direction and magnitude of the effects, the study employs the pooled mean group (PMG) estimation technique on panel data obtained from the World Bank's World Development and Governance Indicators between 1996 and 2018. The empirical estimates validate the environmental Kuznets curve (EKC) hypothesis for the region, even though results on the environment-growth nexus have been inconclusive. Similarly, a positive coefficient is obtained for the impact of trade on the environment, while the impacts of the institutional indicators produce mixed results. A significant policy implication is that the governments of the SSA countries should pursue policies that tend to increase economic growth so that pollution may be reduced. Such policies may include the provision of incentives for sustainable growth-driven industries in the region. In addition, governance infrastructure should be improved in such a way that appropriate penalties are imposed on polluters, while advanced technologies that have the potential to reduce environmental degradation should be encouraged. Finally, it is imperative from these findings that the governments of the region promote their trade relations and the competitiveness of their local industries in order to keep pace with global markets.
Keywords: environmental quality, institutional quality, sustainable development goals, trade
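Testing the EKC in panel studies of this kind typically amounts to checking the signs of the income terms in a specification of the following general form (a generic statement; the paper's exact controls are its own):

```latex
% A typical EKC panel specification:
\[
  \ln E_{it} = \alpha_i + \beta_1 \ln Y_{it} + \beta_2 (\ln Y_{it})^2
             + \gamma\, T_{it} + \delta' Z_{it} + \varepsilon_{it},
\]
% where E is an environmental-degradation measure, Y per-capita income,
% T trade openness, and Z institutional indicators. The EKC holds when
% \beta_1 > 0 and \beta_2 < 0 (an inverted U in income).
```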
Procedia PDF Downloads 144
25166 Clinical Audit on the Introduction of Apremilast into Ireland
Authors: F. O’Dowd, G. Murphy, M. Roche, E. Shudell, F. Keane, M. O’Kane
Abstract:
Introduction: Apremilast (Otezla®) is an oral phosphodiesterase-4 (PDE4) inhibitor indicated for the treatment of adult patients with moderate to severe plaque psoriasis who have contraindications to, have failed, or are intolerant of standard systemic therapy and/or phototherapy, and of adult patients with active psoriatic arthritis. Apremilast influences intracellular regulation of inflammatory mediators. Two randomized, placebo-controlled trials evaluating apremilast in 1426 patients with moderate to severe plaque psoriasis (ESTEEM 1 and 2) demonstrated that the commonest adverse reactions (AEs) leading to discontinuation were nausea (1.6%), diarrhoea (1.0%), and headaches (0.8%). The overall proportion of subjects discontinuing due to adverse reactions was 6.1%. At week 16, these trials demonstrated that significantly more apremilast-treated patients (33.1%) achieved the primary end point, PASI-75, than placebo (5.3%). We began prescribing apremilast in July 2015. Aim: To evaluate the efficacy and tolerability of apremilast in an Irish teaching hospital psoriasis population. Methods: A proforma documenting clinical evaluation parameters, prior treatment experience, and AEs was completed prospectively on all patients commenced on apremilast from July 2015 to July 2017. Data were collected at weeks 0, 6, 12, 24, 36, and 52, with 20/71 patients having passed week 52. Efficacy was assessed using the Psoriasis Area and Severity Index (PASI) and the Dermatology Life Quality Index (DLQI). AEs documented included GI effects, infections, and changes in weight and mood. Retrospective chart review and telephone review were used for missing data. Results: A total of 71 adult subjects (38 male, 33 female; age range 23-57), with moderate to severe psoriasis, were evaluated. Prior treatment: 37/71 (52%) were systemic/biologic/phototherapy naïve; 14/71 (20%) had prior phototherapy alone; 20/71 (28%) had previous systemic/biologic exposure; 12/71 (17%) had both psoriasis and psoriatic arthritis. PASI responses: mean baseline PASI was 10.1 and DLQI was 15. Week 6: N=71, n=15 (21%) achieved PASI 75. Week 12: N=48, n=6 (13%) achieved PASI 100; n=16 (34.5%) achieved PASI 75. Week 24: N=40, n=10 (25%) achieved PASI 100; n=15 (37.5%) achieved PASI 75. Week 52: N=20, n=4 (20%) achieved PASI 100; n=16 (80%) achieved PASI 75. (N = number of patients having passed the time point indicated; n = number of patients, out of N, achieving PASI or DLQI responses at that time.) DLQI responses: week 24: N=40, n=30 (75%) achieved a DLQI score of 0; n=5 (12.5%) achieved a DLQI score of 1; n=1 (2.5%) achieved a DLQI score of 10 (due to lack of efficacy). Adverse events: the proportion of patients who discontinued treatment due to AEs was n=7 (9.8%). One patient experienced nausea alleviated by dose reduction; another developed significant dysgeusia for certain foods; both continued therapy. Two patients lost 2-3 kg. Conclusion: Initial Irish patient experience of apremilast appears comparable to that observed in trials, with good efficacy and tolerability.
Keywords: Apremilast, introduction, Ireland, clinical audit
Procedia PDF Downloads 149
25165 Big Data-Driven Smart Policing: Big Data-Based Patrol Car Dispatching in Abu Dhabi, UAE
Authors: Oualid Walid Ben Ali
Abstract:
Big Data has become one of the buzzwords of today. The recent explosion of digital data has led organizations, whether private or public, into a new era of more efficient decision making. At some point, business decided to use the concept to learn what makes their clients tick, with phrases like 'sales funnel' analysis, 'actionable insights', and 'positive business impact'. So, it stands to reason that Big Data was viewed through green (read: money) colored lenses. Somewhere along the line, however, someone realized that collecting and processing data does not have to serve business purposes only, but could also assist law enforcement, improve policing, or support road safety. This paper presents briefly how Big Data has been used in the field of policing in order to improve the decision-making process in the daily operation of the police. As an example, we present a big-data-driven system which is used to accurately dispatch patrol cars in a geographic environment. The system is also used to allocate, in real time, the nearest patrol car to the location of an incident. This system has been implemented and applied in the Emirate of Abu Dhabi in the UAE.
Keywords: big data, big data analytics, patrol car allocation, dispatching, GIS, intelligent, Abu Dhabi, police, UAE
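At its core, allocating the nearest car can be as simple as a great-circle nearest-neighbor query over the current car positions. The sketch below is hypothetical (the coordinates and car IDs are made up; the actual Abu Dhabi system runs on full GIS infrastructure):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def dispatch_nearest(cars, incident):
    """cars: {car_id: (lat, lon)}; incident: (lat, lon).
    Returns the id of the closest available patrol car."""
    return min(cars, key=lambda cid: haversine_km(*cars[cid], *incident))

cars = {"P-01": (24.4539, 54.3773), "P-02": (24.4667, 54.3667)}
print(dispatch_nearest(cars, (24.4600, 54.3700)))  # id of the nearest car
```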
Procedia PDF Downloads 491
25164 Channel Sounding and PAPR Reduction in OFDM for WiMAX Using Software Defined Radio
Authors: B. Siva Kumar Reddy, B. Lakshmi
Abstract:
WiMAX is a high-speed broadband wireless access technology that adopts OFDM/OFDMA techniques to supply higher data rates with high spectral efficiency. However, OFDM suffers from a high peak-to-average power ratio (PAPR) and a high sensitivity to synchronization errors. In this paper, the high-PAPR problem is solved by using phase modulation to obtain Constant Envelope Orthogonal Frequency Division Multiplexing (CE-OFDM). Synchronization failures are reduced by employing a frequency lock loop, a polyphase clock synchronizer, a Costas loop, and blind equalizers such as the Constant Modulus Algorithm (CMA) equalizer and the Sign Kurtosis Maximization Adaptive Algorithm (SKMAA) equalizer. The WiMAX physical layer is executed on a Software Defined Radio (SDR) prototype, utilizing the USRP N210 as the hardware and GNU Radio as the software platform. SNR estimation is performed on the signal received through the USRP N210. To characterize wireless propagation in specific environments, a sliding correlator wireless channel sounding system is designed using the SDR testbed.
Keywords: BER, CMA equalizer, Kurtosis equalizer, GNU Radio, OFDM/OFDMA, USRP N210
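To make the PAPR issue concrete, the following sketch (with assumed parameters, not the authors' GNU Radio flowgraph) estimates the PAPR of a plain OFDM block against a constant-envelope phase-modulated version, whose PAPR is 0 dB by construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers, n_symbols = 64, 1000

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

# Plain OFDM: IFFT of random QPSK subcarriers -> large envelope fluctuations.
qpsk = (rng.choice([-1, 1], (n_symbols, n_subcarriers))
        + 1j * rng.choice([-1, 1], (n_symbols, n_subcarriers))) / np.sqrt(2)
ofdm = np.fft.ifft(qpsk, axis=1)
print("OFDM PAPR:    %.1f dB" % papr_db(ofdm))

# CE-OFDM: the real-valued OFDM waveform phase-modulates the carrier, so
# |x(t)| = 1 everywhere and the PAPR collapses to 0 dB.
ce_ofdm = np.exp(1j * 2 * np.pi * 0.5 * ofdm.real)
print("CE-OFDM PAPR: %.1f dB" % papr_db(ce_ofdm))
```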
Procedia PDF Downloads 350
25163 Communication Barriers in Disaster Risk Management
Authors: Pooja Pandey
Abstract:
Communication plays an integral part in the management of any disaster; whether natural or human-induced, both require effective and strategic delivery of information. The way information is conveyed carries the most weight while dealing with a disaster. Hence, although the integration of communication strategies into disaster risk management (DRM) is extensively acknowledged, this integration and planning are missing from practical handbooks. Researchers continue to explore integrated DRM and have established substantial gaps between research and the implementation of strategies (gaps between science and policy). For this reason, this paper reviews the communication barriers that obstruct effective management of a disaster. Communication between first responders (government agencies, police, medical services) and the public (people directly affected by the disaster) is most critical, and it often lacks proper delivery during a disaster. These challenges can only be resolved if the foundation of the problem is properly dealt with, which means resolving the issues within the organizations themselves. Through this study, it was found necessary to bridge the communication gap between the organizations themselves, as most of the hindrances occur during the mitigation, preparedness, response, and recovery phases of a disaster. The study concludes with a review of the communication barriers, within and at the organizational, technological, and social levels, that impact effective DRM. In the end, some suggestions are made to strengthen the knowledge for future improvement in communication between responders and their organizations.
Keywords: communication, organization, barriers, first responders, disaster risk management
Procedia PDF Downloads 302
25162 Mining Multicity Urban Data for Sustainable Population Relocation
Authors: Xu Du, Aparna S. Varde
Abstract:
In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends, as land use change patterns, from a variety of data sources. The results are treated as part of urban big data, together with other information such as population change and economic conditions. Multiple data mining methods are deployed on this data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl, urban sustainability, and their related parameters. Experiments so far reveal that the data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model catering to specific questions posed by targeted users. It contributes towards achieving sustainability as a whole.
Keywords: data mining, environmental modeling, sustainability, urban planning
Procedia PDF Downloads 310
25161 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification
Authors: Hung-Sheng Lin, Cheng-Hsuan Li
Abstract:
Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest-neighbor technique. For each sample, there are two corresponding nearest proportions of samples: the self-class nearest proportion and the other-class nearest proportion. The term 'nearest proportion' used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample-size situations; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, and NWFE, and of their kernel versions, KPCA, GDA, and KNWFE.
Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest proportion feature extraction
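KDNP itself is the authors' method, but the kernel-extension pattern it follows can be illustrated with one of the baselines named above, kernel PCA. The sketch below uses made-up data standing in for hyperspectral pixels:

```python
import numpy as np
from sklearn.decomposition import KernelPCA, PCA

rng = np.random.default_rng(1)
# Toy "hyperspectral" samples: 200 pixels with 50 spectral bands.
X = rng.normal(size=(200, 50))

# Linear PCA finds directions of largest variance in the input space ...
X_pca = PCA(n_components=5).fit_transform(X)
# ... while kernel PCA finds directions of largest variance in the feature
# space induced by the kernel, here an RBF kernel.
X_kpca = KernelPCA(n_components=5, kernel="rbf", gamma=1e-2).fit_transform(X)
print(X_pca.shape, X_kpca.shape)  # (200, 5) (200, 5)
```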
Procedia PDF Downloads 346
25160 Evaluation of Orthodontic Patients’ Dental Visits and Problems During Covid-19 Pandemic in Sari Dental School in 2021
Authors: Mobina Bagherianlemraski, Parastoo Namdar
Abstract:
Background: The ongoing coronavirus disease has affected most countries. The virus has high transmissibility. Due to the closure of most dental clinics, millions of orthodontic patients missed their appointments during the COVID-19 pandemic. Methods: A questionnaire was developed and sent to patients receiving orthodontic treatment at a public or private clinic. Results: A total of 200 responses were analyzed, from 153 women (76.5%) and 47 men (23.5%). The mean and standard deviation of their age was 18.92±7.23 years, with an age range of 8 to 40 years. One hundred eighty-nine patients (94.5%) had fixed appliances, and 11 patients (5.5%) had removable appliances. Of all participants, 35% (70) missed their appointments. The most and least common reasons for stopping appointments were concern about the spread of COVID-19, with 28 cases (40%), and the closure of the clinic, with 15 cases (21.4%). Of the 53 patients who contacted their orthodontists, 86.8% (46) communicated via office phone and 5.7% (3) through social media. Conclusion: This study determined that the coronavirus pandemic and quarantine have had an important impact on orthodontic treatments. The greatest concern of orthodontic patients was an increase in treatment duration. Patients who used fixed appliances reported missing dental appointments more than others. Therefore, during the COVID-19 pandemic, orthodontists should prepare patients to solve problems linked to their orthodontic appliances when possible.
Keywords: orthodontic patients, coronavirus pandemic, appointments, COVID-19
Procedia PDF Downloads 139
25159 Model Order Reduction for Frequency Response and Effect of Order of Method for Matching Condition
Authors: Aref Ghafouri, Mohammad javad Mollakazemi, Farhad Asadi
Abstract:
In this paper, a model order reduction method is used to approximate linear and nonlinear aspects of experimental data. The method can be used to obtain an offline reduced model that approximates experimental data, reproduces and follows the data and the order of the system, and matches the experimental data at some frequency ratios. In this study, the method is compared across different experimental data, and the influence of the chosen order of the model reduction on obtaining a best and sufficient matching condition for following the data is investigated for the imaginary and real parts of the frequency response curve. Finally, the effect of the reduction order, an important parameter for nonlinear experimental data, is explained further.
Keywords: frequency response, order of model reduction, frequency matching condition, nonlinear experimental data
Procedia PDF Downloads 404
25158 Economic Valuation of Emissions from Mobile Sources in the Urban Environment of Bogotá
Authors: Dayron Camilo Bermudez Mendoza
Abstract:
Road transportation is a significant source of externalities, notably in terms of environmental degradation and the emission of pollutants. These emissions adversely affect public health, attributable to criteria pollutants like particulate matter (PM2.5 and PM10) and carbon monoxide (CO), and also contribute to climate change through the release of greenhouse gases, such as carbon dioxide (CO2). It is, therefore, crucial to quantify the emissions from mobile sources and develop a methodological framework for their economic valuation, aiding in the assessment of associated costs and informing policy decisions. The forthcoming congress will shed light on the externalities of transportation in Bogotá, showcasing methodologies and findings from the construction of emission inventories and their spatial analysis within the city. This research focuses on the economic valuation of emissions from mobile sources in Bogotá, employing methods such as hedonic pricing and contingent valuation. Conducted within the urban confines of Bogotá, the study leverages demographic, transportation, and emission data sourced from the Mobility Survey, official emission inventories, and tailored estimates and measurements. The use of the hedonic pricing and contingent valuation methodologies facilitates the estimation of the influence of transportation emissions on real estate values and gauges the willingness of Bogotá's residents to invest in reducing these emissions. The findings are anticipated to be instrumental in the formulation and execution of public policies aimed at emission reduction and air quality enhancement. In compiling the emission inventory, innovative data sources were identified to determine activity factors, including information from automotive diagnostic centers and used-vehicle sales websites. The COPERT model was utilized to ascertain emission factors, requiring diverse inputs such as data from the national transit registry (RUNT), OpenStreetMap road network details, climatological data from the IDEAM portal, and the Google API for speed analysis. Spatial disaggregation employed GIS tools and publicly available official spatial data. The development of the valuation methodology involved an exhaustive systematic review, utilizing platforms like the EVRI (Environmental Valuation Reference Inventory) portal and other relevant sources. The contingent valuation method was implemented via surveys in various public settings across the city, using a referendum-style approach for a sample of 400 residents. For the hedonic price valuation, an extensive database was developed, integrating data from several official sources and basing the analyses on per-square-meter property values in each city block. The results are expected to be presented and published at the upcoming conference, integrating knowledge across disciplines and culminating in a master's thesis.
Keywords: economic valuation, transport economics, pollutant emissions, urban transportation, sustainable mobility
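The hedonic pricing step implicitly estimates a specification of the following general form (a generic statement of the method; the paper's exact covariates are its own):

```latex
% Hedonic price function: property value as a function of its attributes,
% including local air quality / emissions exposure.
\[
  \ln P_j = \alpha + \beta\, E_j + \gamma' S_j + \delta' N_j + \varepsilon_j ,
\]
% where P_j is the per-square-meter price of property (block) j, E_j the
% local emissions or pollution measure, S_j structural attributes, and
% N_j neighborhood attributes. The marginal implicit price of emissions
% is \partial \ln P_j / \partial E_j = \beta.
```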
Procedia PDF Downloads 59
25157 An Empirical Study of the Impacts of Big Data on Firm Performance
Authors: Thuan Nguyen
Abstract:
In the present time, data is to a data-driven, knowledge-based economy what oil was to the industrial age hundreds of years ago. Data is everywhere in vast volumes! Big data analytics is expected to help firms not only efficiently improve performance but also completely transform how they run their business. However, employing the emergent technology successfully is not easy, and assessing the role of big data in improving firm performance is even harder. There has been a lack of studies examining the impacts of big data analytics on organizational performance. This study aimed to fill that gap. The present study suggested using firms' intellectual capital as a proxy for big data in evaluating its impact on organizational performance. The study employed the Value Added Intellectual Coefficient method to measure firm intellectual capital via its three main components: human capital efficiency, structural capital efficiency, and capital employed efficiency, and then used the structural equation modeling technique to model the data and test the models. Financial fundamental and market data of 100 randomly selected publicly listed firms were collected. The results of the tests showed that only human capital efficiency had a significant positive impact on firm profitability, which highlights the prominent human role in the impact of big data technology.
Keywords: big data, big data analytics, intellectual capital, organizational performance, value added intellectual coefficient
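In the standard formulation of the Value Added Intellectual Coefficient method, the three components mentioned are computed as:

```latex
% Value Added Intellectual Coefficient (VAIC):
\begin{align*}
  VA   &= \text{Output} - \text{Input} && \text{(value added)}\\
  HCE  &= VA / HC                      && \text{(human capital efficiency)}\\
  SCE  &= (VA - HC) / VA               && \text{(structural capital efficiency)}\\
  CEE  &= VA / CE                      && \text{(capital employed efficiency)}\\
  VAIC &= HCE + SCE + CEE
\end{align*}
% HC = total salaries and wages; CE = book value of net assets.
```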
Procedia PDF Downloads 246
25156 Automated Test Data Generation for Some Types of Algorithm
Authors: Hitesh Tahbildar
Abstract:
The cost of test data generation for a program is computationally very high. In the general case, no algorithm to generate test data for all types of algorithms has been found. The cost of generating test data differs for different types of algorithm. To date, work has emphasized generating test data for different types of programming constructs rather than for different types of algorithms. Test data generation methods have been implemented to find heuristics for different types of algorithms. Several types of algorithms, including divide and conquer, backtracking, the greedy approach, and dynamic programming, have been tested to find the minimum cost of test data generation. Our experimental results suggest that some of these algorithm types can serve as a necessary condition for selecting heuristics, while programming constructs are a sufficient condition for selecting our heuristics. Finally, we recommend the different heuristics for test data generation to be selected for different types of algorithms.
Keywords: longest path, saturation point, lmax, kL, kS
Procedia PDF Downloads 408
25155 Design and Evaluation of a Prototype for Non-Invasive Screening of Diabetes – Skin Impedance Technique
Authors: Pavana Basavakumar, Devadas Bhat
Abstract:
Diabetes is a disease which often goes undiagnosed until its secondary effects are noticed. Early detection of the disease is necessary to avoid serious consequences which could lead to the death of the patient. Conventional invasive tests for screening of diabetes are mostly painful, time-consuming, and expensive. There is also a risk of infection involved; therefore, it is essential to develop non-invasive methods to screen for diabetes and estimate the level of blood glucose. Extensive research is going on with this perspective, involving various techniques that explore optical, electrical, chemical, and thermal properties of the human body that directly or indirectly depend on the blood glucose concentration. Thus, non-invasive blood glucose monitoring has grown into a vast field of research. In this project, an attempt was made to devise a prototype for screening of diabetes by measuring the electrical impedance of the skin and building a model to predict a patient's condition based on the measured impedance. The prototype developed passes a negligible constant current (0.5 mA) across a subject's index finger through tetrapolar silver electrodes and measures the output voltage across a wide range of frequencies (10 kHz – 4 MHz). The measured voltage is proportional to the impedance of the skin. The impedance was acquired in real time for further analysis. The study was conducted on over 75 subjects, with permission from the institutional ethics committee; along with impedance, subjects' blood glucose values were also noted, using the conventional method. Nonlinear regression analysis was performed on the features extracted from the impedance data to obtain a model that predicts blood glucose values for a given set of features. When the predicted data were depicted on Clarke's Error Grid, only 58% of the values predicted were clinically acceptable. Since the objective of the project was to screen for diabetes and not actual estimation of blood glucose, the data were classified into three classes, 'NORMAL FASTING', 'NORMAL POSTPRANDIAL', and 'HIGH', using a linear Support Vector Machine (SVM). The classification accuracy obtained was 91.4%. The developed prototype was economical, fast, and pain-free. Thus, it can be used for mass screening of diabetes.
Keywords: Clarke’s error grid, electrical impedance of skin, linear SVM, nonlinear regression, non-invasive blood glucose monitoring, screening device for diabetes
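A minimal sketch of the classification stage follows. The feature values and class boundaries are hypothetical stand-ins; the actual impedance-derived features are the authors' own:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Each row: impedance-derived features for one subject (hypothetical values).
X = np.array([[1.20, 0.8], [1.15, 0.9], [0.95, 1.4],
              [0.90, 1.5], [0.60, 2.1], [0.55, 2.3]])
# 0 = NORMAL FASTING, 1 = NORMAL POSTPRANDIAL, 2 = HIGH
y = np.array([0, 0, 1, 1, 2, 2])

# Linear SVM, as used in the study; scaling keeps features comparable.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X, y)
print(clf.predict([[1.0, 1.2]]))  # predicted glucose class for a new subject
```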
Procedia PDF Downloads 326
25154 National Digital Soil Mapping Initiatives in Europe: A Review and Some Examples
Authors: Dominique Arrouays, Songchao Chen, Anne C. Richer-De-Forges
Abstract:
Soils are at the crossing of many issues, such as food and water security, sustainable energy, climate change mitigation and adaptation, biodiversity protection, and human health and well-being. They deliver many ecosystem services that are essential to life on Earth. Therefore, there is a growing demand for soil information on national and global scales. Unfortunately, many countries do not have detailed soil maps, and, where they exist, these maps are generally based on more or less complex and often non-harmonized soil classifications. An estimate of their uncertainty is also often missing. Thus, they are not easy to understand and are often not properly used by end-users. Therefore, there is an urgent need to provide end-users with spatially exhaustive grids of essential soil properties, together with an estimate of their uncertainty. One way to achieve this is digital soil mapping (DSM). The concept of DSM relies on the hypothesis that soils and their properties are not randomly distributed but depend on the main soil-forming factors: climate, organisms, relief, parent material, time (age), and position in space. All these forming factors can be approximated using exhaustive spatial products such as climatic grids, remote sensing products or vegetation maps, digital elevation models, geological or lithological maps, spatial coordinates of soil information, etc. Thus, DSM generally relies on models calibrated with existing observed soil data (point observations or maps) and so-called 'ancillary covariates' that come from other available spatial products. The model is then generalized on grids where soil parameters are unknown in order to predict them, and the prediction performance is validated using various methods. With the growing demand for soil information at national and global scales and the increase in available spatial covariates, national and continental DSM initiatives are continuously increasing. This short review illustrates the main national and continental advances in Europe, the diversity of the approaches and databases that are used, the validation techniques, and the main scientific and other issues. Examples from several countries illustrate the variety of products delivered during the last ten years. The scientific production on this topic is continuously increasing, and new models and approaches are developed at an incredible speed. Most digital soil mapping (DSM) products rely mainly on machine learning (ML) prediction models and/or the use of pedotransfer functions (PTF), in which calibration data come from soil analyses performed in labs or from existing conventional maps. However, some scientific issues remain to be solved, as well as political and legal ones related, for instance, to data sharing and to different laws in different countries. Other issues relate to communication with end-users and to education, especially on the use of uncertainty. Overall, the progress is very important, and the willingness of institutes and countries to join their efforts is increasing. Harmonization issues remain, mainly due to differences in classifications or in laboratory standards between countries. However, numerous initiatives are ongoing at the EU level and also at the global level.
All this progress is scientifically stimulating and promises to provide tools to improve and monitor soil quality at the country, EU, and global levels.
Keywords: digital soil mapping, global soil mapping, national and European initiatives, global soil mapping products, mini-review
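A minimal sketch of the ML step common to most DSM products described here, a regression of a soil property on spatial covariates with generalization to unsampled grid cells (all values below are synthetic):

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
# Covariates at soil observation points, e.g. elevation, slope,
# mean annual temperature, NDVI (synthetic stand-ins).
X_obs = rng.normal(size=(300, 4))
# Synthetic soil organic carbon with a known dependence on two covariates.
soc = 2.0 + 0.8 * X_obs[:, 0] - 0.5 * X_obs[:, 2] + rng.normal(0, 0.3, 300)

# Calibrate the model on observed points ...
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_obs, soc)

# ... then generalize to grid cells where the soil property is unknown.
X_grid = rng.normal(size=(5, 4))
print(model.predict(X_grid))  # predicted soil property per grid cell
```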
Procedia PDF Downloads 184
25153 Issues on Optimizing the Structural Parameters of the Induction Converter
Authors: Marinka K. Baghdasaryan, Siranush M. Muradyan, Avgen A. Gasparyan
Abstract:
Analytical expressions for the current and angular errors, as well as the frequency characteristics, of an induction converter are obtained, describing their relation to its structural parameters and to the core and winding characteristics. Based on an estimation of the dependences obtained, a mathematical problem of parametric optimization is formulated which can successfully be used for investigating and diagnosing an induction converter.
Keywords: induction converters, magnetic circuit material, current and angular errors, frequency response, mathematical formulation, structural parameters
Procedia PDF Downloads 345
25152 The Perspective on Data Collection Instruments for Younger Learners
Authors: Hatice Kübra Koç
Abstract:
For academia, collecting reliable and valid data is one of the most significant issues for researchers. However, the procedure is not the same for all target groups: when collecting data from teenagers, young adults, or adults, researchers can use common data collection tools such as questionnaires, interviews, and semi-structured interviews; yet, for young and very young learners, such reliable and valid data collection tools cannot be easily designed or applied by researchers. In this study, firstly, common data collection tools are examined for the 'very young' and 'young learners' participant groups, since the quality and efficiency of an academic study rest mainly on a valid and correct data collection and data analysis procedure. Secondly, two different data collection instruments for very young and young learners are presented, and their efficacy is discussed. Finally, a suggested data collection tool, a performance-based questionnaire, specifically developed for the 'very young' and 'young learners' participant groups in the field of teaching English to young learners as a foreign language, is presented in this current study. The design procedure and suggested items/factors for the suggested data collection tool are revealed at the end of the study to help researchers who study young and very young learners.
Keywords: data collection instruments, performance-based questionnaire, young learners, very young learners
Procedia PDF Downloads 94
25151 Optimization Model for Identification of Assembly Alternatives of Large-Scale, Make-to-Order Products
Authors: Henrik Prinzhorn, Peter Nyhuis, Johannes Wagner, Peter Burggräf, Torben Schmitz, Christina Reuter
Abstract:
Assembling large-scale products, such as airplanes, locomotives, or wind turbines, involves frequent process interruptions induced by, e.g., delayed material deliveries or missing availability of resources. This has a negative impact on the logistical performance of a producer of XXL products. In industrial practice, in case of interruptions, the identification, evaluation, and eventual selection of an alternative order of assembly activities (an 'assembly alternative') poses an enormous challenge, especially if an optimized logistical decision should be reached. Therefore, in this paper, an innovative optimization model for the identification of assembly alternatives that addresses the given problem is presented. It describes make-to-order, large-scale product assembly processes as a resource-constrained project scheduling (RCPS) problem which follows the restrictions given in practice. For the evaluation of the assembly alternative, a cost-based definition of the logistical objectives (delivery reliability, inventory, makespan, and workload) is presented.
Keywords: assembly scheduling, large-scale products, make-to-order, optimization, rescheduling
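In its classical textbook form, the underlying RCPS problem can be stated as follows; the paper extends this core with its cost-based logistical objectives:

```latex
% Classical RCPSP: minimize the makespan subject to precedence
% and renewable-resource constraints.
\begin{align*}
  \min\ & S_{n+1} && \text{(finish time of the dummy end activity)}\\
  \text{s.t. } & S_j \ge S_i + d_i && \forall (i,j) \in \text{precedence relations},\\
  & \sum_{i \in A(t)} r_{ik} \le R_k && \forall \text{ resources } k,\ \forall t,
\end{align*}
% where S_i and d_i are the start time and duration of activity i,
% A(t) the set of activities in progress at time t, r_{ik} activity i's
% demand for resource k, and R_k the capacity of resource k.
```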
Procedia PDF Downloads 459
25150 Generating Swarm Satellite Data Using Long Short-Term Memory and Generative Adversarial Networks for the Detection of Seismic Precursors
Authors: Yaxin Bi
Abstract:
Accurate prediction and understanding of the evolution mechanisms of earthquakes remain challenging in the fields of geology, geophysics, and seismology. This study leverages Long Short-Term Memory (LSTM) networks and Generative Adversarial Networks (GANs), a generative model tailored to time-series data, for generating synthetic time series data based on Swarm satellite data, which will be used for detecting seismic anomalies. LSTMs demonstrated commendable predictive performance in generating synthetic data across multiple countries. In contrast, the GAN models struggled to generate synthetic data, often producing non-informative values, although they were able to capture the data distribution of the time series. These findings highlight both the promise and challenges associated with applying deep learning techniques to generate synthetic data, underscoring the potential of deep learning in generating synthetic electromagnetic satellite data.
Keywords: LSTM, GAN, earthquake, synthetic data, generative AI, seismic precursors
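A minimal sketch of the LSTM side of such a pipeline, a next-step sequence predictor rolled forward to generate a synthetic continuation. The data here are synthetic stand-ins; the actual Swarm preprocessing is the authors' own:

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
window = 48  # look-back window of satellite samples

# Synthetic stand-in for a Swarm magnetic-field time series.
series = np.sin(np.linspace(0, 60, 2000)) + 0.1 * rng.normal(size=2000)
X = np.stack([series[i:i + window] for i in range(len(series) - window)])[..., None]
y = series[window:]

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),  # predict the next sample
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=2, batch_size=64, verbose=0)

# Feed predictions back in to generate a synthetic continuation.
ctx, preds = X[-1], []
for _ in range(10):
    nxt = model.predict(ctx[None], verbose=0)[0, 0]
    preds.append(float(nxt))
    ctx = np.concatenate([ctx[1:], [[nxt]]])
print(preds)
```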
Procedia PDF Downloads 34
25149 Dynamic Externalities and Regional Productivity Growth: Evidence from Manufacturing Industries of India and China
Authors: Veerpal Kaur
Abstract:
The present paper investigates the role of dynamic externalities of agglomeration in the regional productivity growth of the manufacturing sector in India and China. Taking 2-digit-level manufacturing sector data for the states and provinces of India and China, respectively, for the period 1998-99 to 2011-12, this paper examines the effect of dynamic externalities, namely Marshall-Arrow-Romer (MAR) specialization externalities, Jacobs's diversity externalities, and Porter's competition externalities, on the regional total factor productivity growth (TFPG) of the manufacturing sector in both economies. Regressions have been carried out on pooled data for all 2-digit manufacturing industries for India and China separately. The panel estimation is based on a fixed-effects-by-sector model. The results of the econometric exercise show that labour-intensive industries in Indian regional manufacturing benefit from diversity externalities, while capital-intensive industries gain more from specialization in terms of TFPG. In China, diversity externalities and competition externalities hold better prospects for regional TFPG in both labour-intensive and capital-intensive industries. But if we look at the results for coastal and non-coastal regions separately, specialization tends to assert a positive effect on TFPG in coastal regions, whereas it has a negative effect on the TFPG of non-coastal regions. Competition externalities have a negative effect on the TFPG of non-coastal regions and a positive effect on the TFPG of coastal regions. Diversity externalities make a positive contribution to TFPG in both coastal and non-coastal regions. So the results of the study postulate that the importance of dynamic externalities should not be examined by pooling all industries and all regions together. This could hold differential implications for region-specific and industry-specific policy formulation. Other important variables explaining regional-level TFPG in both India and China are the availability of infrastructure, the level of competitiveness, foreign direct investment, exports, and the geographical location of the region (especially in China).
Keywords: China, dynamic externalities, India, manufacturing, productivity
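The three externalities are conventionally proxied with the measures introduced by Glaeser et al., which studies of this kind commonly adopt (the paper's exact definitions may differ):

```latex
% Conventional proxies for the three dynamic externalities in
% region r and industry i:
\begin{align*}
  MAR_{ri} &= \frac{e_{ri}/e_r}{e_i/e}
    && \text{(specialization: location quotient)}\\
  JAC_{r}  &= 1 \Big/ \sum_{j} \left(\frac{e_{rj}}{e_r}\right)^{2}
    && \text{(diversity: inverse Hirschman-Herfindahl index)}\\
  POR_{ri} &= \frac{f_{ri}/e_{ri}}{f_i/e_i}
    && \text{(competition: relative firms per worker)}
\end{align*}
% e = employment, f = number of firms; dropped subscripts denote totals.
```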
Procedia PDF Downloads 123
25148 Evaluating the Prominence of Chemical Phenomena in Chemistry Courses
Authors: Vanessa R. Ralph, Leah J. Scharlott, Megan Y. Deshaye, Ryan L. Stowe
Abstract:
Given the traditions of chemistry teaching, one may not question whether chemical phenomena play a prominent role. Yet the role of chemical phenomena in an introductory chemistry course may define the extent to which the course is introductory, is chemistry, and is equitable. Picture, for example, the classic Ideal Gas Law problem. If one envisions a prompt wherein students are tasked with calculating a missing variable, then one envisions a prompt that relies on chemical phenomena as a context rather than as a model to understand the natural world. Consider instead a prompt wherein students are tasked with applying molecular models of gases to explain why the vapor pressure of a gaseous solution of water differs from that of carbon dioxide. Here, the chemical phenomenon is not only the context but also the subject of the prompt. Deliveries of general and organic chemistry were identified as ranging widely in the integration of chemical phenomena. The more incorporated the phenomena, the more equitable the assessment task was for students of varying access to pre-college math and science preparation. How chemical phenomena are integrated may very well define whether courses are chemistry, are introductory, and are equitable. Educators of chemistry are invited to discuss the role of chemical phenomena in their courses and to consider the long-lasting impacts of replicating tradition for tradition's sake.
Keywords: equitable educational practices, chemistry curriculum, content organization, assessment design
Procedia PDF Downloads 198
25147 Molecular Diversity of Forensically Relevant Insects from the Cadavers of Lahore
Authors: Sundus Mona, Atif Adnan, Babar Ali, Fareeha Arshad, Allah Rakha
Abstract:
Molecular diversity is the variation in the abundance of species. Forensic entomology is a neglected field in Pakistan. Insects collected from a crime scene should be handled by forensic entomologists, who are currently virtually non-existent in Pakistan. Correct identification of insect specimens, along with knowledge of their biodiversity, can aid in solving many problems related to complicated forensic cases. Inadequate morphological identification and insufficient thermal biological studies limit entomological utility in forensic medicine. Recently, molecular identification of entomological evidence has gained attention globally. DNA barcoding is the latest established method for species identification. Only proper identification can provide a precise estimation of the postmortem interval. Arthropods are known to be the first visitors scavenging on decomposing dead matter. The objective of the proposed study was to identify species by molecular techniques and analyze their phylogenetic importance with barcoded necrophagous insect species of early succession on human cadavers. Based upon this identification, the study outcome will be the utilization of established DNA barcodes to identify carrion-feeding insect species for concordant estimation of the postmortem interval. A molecular identification method involving sequencing of a 658 bp 'barcode' fragment of the mitochondrial cytochrome oxidase subunit 1 (CO1) gene from collected specimens of unknown dipteran species from cadavers of Lahore was evaluated. Nucleotide sequence divergences were calculated using MEGA 7 and Arlequin, and a neighbor-joining phylogenetic tree was generated. Three species were identified, Chrysomya megacephala, Chrysomya saffranea, and Chrysomya rufifacies, with low genetic diversity. The fixation index was 0.83992, which suggests a need for further studies to identify and classify forensically relevant insects in Pakistan. There is an urgent demand for further research, especially when immature forms of arthropods are recovered from the crime scene.
Keywords: molecular diversity, DNA barcoding, species identification, forensically relevant insects
Procedia PDF Downloads 150
25146 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand
Authors: Esma Birisci, Ronald McGarvey
Abstract:
One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) need to be considered as two contradictory objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food at a minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
Keywords: environmental studies, food waste, production planning, uncertain and correlated demand
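The kernel-density step for correlated demands can be sketched as follows; the demand history below is synthetic, and the menu items and figures are made up:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(7)
# Historical daily demand for two correlated items, e.g. burgers and fries.
burgers = rng.normal(400, 40, 250)
fries = 0.9 * burgers + rng.normal(0, 15, 250)  # usually demanded together
history = np.vstack([burgers, fries])           # shape (2, n_days)

# A multivariate KDE preserves the correlation across items ...
kde = gaussian_kde(history)
# ... so sampled demand scenarios for the optimizer stay realistic.
scenarios = kde.resample(1000)
print(np.corrcoef(scenarios)[0, 1])  # close to the historical correlation
```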
Procedia PDF Downloads 374
25145 Generation of Quasi-Measurement Data for On-Line Process Data Analysis
Authors: Hyun-Woo Cho
Abstract:
To ensure the safety of a manufacturing process, one should quickly identify an assignable cause of a fault on an on-line basis. To this end, many statistical techniques, including linear and nonlinear methods, have been frequently utilized. However, such methods suffer from a major problem of small sample size, which is mostly attributable to the characteristics of the empirical models used as reference models. This work presents a new method to overcome the insufficiency of measurement data in monitoring and diagnosis tasks. Quasi-measurement data are generated from existing data based on two indices, similarity and importance. The performance of the method is demonstrated using a real data set. The results show that the presented method is able to handle the insufficiency problem successfully. In addition, it is shown to be quite efficient in terms of computational speed and memory usage, and thus on-line implementation of the method is straightforward for monitoring and diagnosis purposes.
Keywords: data analysis, diagnosis, monitoring, process data, quality control
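The abstract does not give the exact construction, so the following is purely an assumed illustration of the flavor of the idea, not the paper's algorithm: new quasi-measurements are synthesized by interpolating existing samples toward their most similar neighbors (a similarity-based stand-in for the paper's similarity/importance indices):

```python
import numpy as np

def quasi_measurements(X, n_new, alpha=0.5, seed=0):
    """Generate synthetic process samples by interpolating each randomly
    chosen sample toward its nearest neighbor. This nearest-neighbor rule
    is a hypothetical stand-in for the paper's similarity/importance indices."""
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), n_new)
    new = []
    for i in idx:
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                  # exclude the sample itself
        j = int(np.argmin(d))          # most similar existing sample
        lam = rng.uniform(0, alpha)
        new.append(X[i] + lam * (X[j] - X[i]))
    return np.array(new)

X = np.random.default_rng(1).normal(size=(30, 5))  # 30 samples, 5 variables
print(quasi_measurements(X, 10).shape)  # (10, 5) quasi-measurement data
```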
Procedia PDF Downloads 483