Search results for: common platform for automated programming
6942 Create a Model of Production and Marketing Strategies in Alignment with Business Strategy Using QFD Approach
Authors: Hamed Saremi, Shahla Saremi
Abstract:
In today's competitive world, organizations are expected to surpass their competitors and make the best use of available resources. The need to improve current performance is therefore felt more than ever, and doing so requires identifying optimal organizational strategies and considering all strategies simultaneously. In this study, to enhance competitive advantage in line with customer requirements, the House of Quality (QFD) approach is used to align business, production, and marketing strategies, and a zero-one linear programming model is formulated. First, to align production and marketing strategies with the business strategy, the independent weights of these strategies are calculated. Then, using the QFD approach, the aligned weights of the optimal strategies in each production and marketing field are obtained. Finally, the aligned marketing strategies are selected with the purpose of allocating budget and specialist human resources to marketing functions, leading to increased competitive advantage and benefit. Keywords: marketing strategy, business strategy, strategy alignment, house of quality deployment, production strategy
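The zero-one selection step the abstract describes can be illustrated with a brute-force sketch. The strategy weights, costs, and budget below are invented placeholders (a real model would use the QFD-derived aligned weights), and exhaustive enumeration only suits small strategy sets; a solver would be used at scale.

```python
from itertools import product

# Hypothetical QFD-aligned weights and budget requirements for five
# candidate marketing strategies (illustrative numbers only).
weights = [0.32, 0.27, 0.18, 0.15, 0.08]
costs = [40, 35, 20, 25, 10]   # budget units consumed per strategy
budget = 70

best_value, best_choice = 0.0, None
# Zero-one decision variables: x[i] = 1 means strategy i is funded.
for x in product((0, 1), repeat=len(weights)):
    cost = sum(c * xi for c, xi in zip(costs, x))
    value = sum(w * xi for w, xi in zip(weights, x))
    if cost <= budget and value > best_value:
        best_value, best_choice = value, x

print(best_choice, round(best_value, 2))  # → (1, 0, 1, 0, 1) 0.58
```

The same 0-1 structure scales to the paper's budget and human-resource allocation constraints by adding further inequality checks inside the loop or handing the model to a MIP solver.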
Procedia PDF Downloads 605
6941 Development of a Pain Detector Using Microwave Radiometry Method
Authors: Nanditha Rajamani, Anirudhaa R. Rao, Divya Sriram
Abstract:
One of the greatest difficulties in treating patients with pain is the highly subjective nature of pain sensation. The measurement of pain intensity depends primarily on the patient's report, often with little physical evidence to provide objective corroboration. This is further complicated by the fact that the few existing technologies are expensive (e.g., functional Magnetic Resonance Imaging, fMRI). The need is thus clear and urgent for a reliable, non-invasive, non-painful, objective, readily adoptable, and cost-efficient diagnostic platform that supplements the current regime with additional information to assist doctors in diagnosing these patients. Our idea of developing a pain detector was thus conceived to take the detection and diagnosis of chronic and acute pain a step further. Keywords: pain sensor, microwave radiometry, pain sensation, fMRI
Procedia PDF Downloads 456
6940 DSPIC30F6010A Control for 12/8 Switched Reluctance Motor
Authors: Yang Zhou, Chen Hao, Ma Xiaoping
Abstract:
This paper briefly introduces the microcontroller unit and then details the control regulation of the SRM. It first describes the main driving-state control of the motor and the importance of the motor position sensor. For different speeds, the controller chooses among several control styles, such as voltage chopper control, angle position control, and current chopper control, each of which has its own advantages and disadvantages. Combining the strengths of these three distinct methods, the main control chip intelligently selects the best-performing control depending on the load and speed demand. The exact flow diagram is then shown in the paper. Finally, an experimental platform is established to verify the correctness of the proposed theory. Keywords: switched reluctance motor, dsPIC microcontroller, current chopper
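The speed-dependent choice among the three control styles might be sketched as a simple dispatch function; the speed thresholds and default values here are illustrative assumptions, not figures from the paper.

```python
def select_control_mode(speed_rpm, base_speed_rpm=1500, low_speed_rpm=300):
    """Pick an SRM control style from the shaft speed.

    Thresholds are hypothetical: current chopping at low speed,
    voltage chopping (PWM) at medium speed, and angle position
    control above base speed, mirroring the mode-selection logic
    the abstract describes.
    """
    if speed_rpm < low_speed_rpm:
        return "current chopper control"
    if speed_rpm < base_speed_rpm:
        return "voltage chopper control"
    return "angle position control"

print(select_control_mode(100))    # low speed
print(select_control_mode(800))    # medium speed
print(select_control_mode(2500))   # above base speed
```

On a real dsPIC the same decision would run inside the control interrupt, with the measured load also feeding into the threshold choice.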
Procedia PDF Downloads 425
6939 Importance of New Policies of Process Management for Internet of Things Based on Forensic Investigation
Authors: Venkata Venugopal Rao Gudlur
Abstract:
The proposed policies, referred to as "SOP", for Internet of Things (IoT) based forensic investigation into process management are the latest revolution, saving time and providing quick solutions for investigators. The forensic investigation process has been developed over many years, and from time to time it has supplied the required information without any policies governing the investigation processes. This research reveals that current IoT-based forensic investigation into process management is more connected to devices, which is the latest revolution, and to policies. All future development in real-time information gathering and monitoring will evolve with smart sensor-based technologies connected directly to the IoT. This paper presents a conceptual framework for process management. Smart devices are leading the way in terms of the automated forensic models and frameworks established by different scholars. These models and frameworks were mostly focused on offering a roadmap for performing forensic operations with no policies in place. These initiatives would bring a tremendous benefit to process management and to IoT forensic investigators proposing policies. The proposed policies may make the forensic investigation process more secure and reduce data losses and vulnerabilities. Keywords: Internet of Things, Process Management, Forensic Investigation, M2M Framework
Procedia PDF Downloads 102
6938 Optimization Model for Support Decision for Maximizing Production of Mixed Fresh Fruit Farms
Authors: Andrés I. Ávila, Patricia Aros, César San Martín, Elizabeth Kehr, Yovana Leal
Abstract:
Planning models for fresh products are a very useful tool for improving net profits. To obtain an efficient supply chain model, several functions should be considered to achieve a complete simulation of the various operational units. We consider a linear programming model that helps farmers decide what area should be planted with each of three kinds of export fruit, taking their future investment into account. We include area, investment, water, minimal productivity unit, and harvest restrictions in a monthly based model that computes the average income over five years. Field conditions such as area, water availability, and initial investment are also required inputs. Using Chilean costs and the dollar-peso exchange rate, we can simulate several scenarios to understand the possible risks associated with this market. The tool also helps support decisions by the government and individual farmers. Keywords: mixed integer problem, fresh fruit production, support decision model, agricultural and biosystems engineering
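A minimal sketch of the planting-decision idea, assuming invented income and water coefficients (not the study's Chilean data) and whole-hectare areas enumerated by brute force in place of a linear-programming solver:

```python
# Hypothetical per-hectare income (currency units) and water use
# (thousand m3) for three export fruits; not the study's data.
income_per_ha = [5200, 4300, 6100]
water_per_ha = [3.0, 2.0, 4.5]
total_area, total_water = 20, 60.0   # hectares, thousand m3

best = (0, (0, 0, 0))
# Enumerate whole-hectare plans; a real model would hand the same
# constraints to an LP/MIP solver.
for a0 in range(total_area + 1):
    for a1 in range(total_area - a0 + 1):
        for a2 in range(total_area - a0 - a1 + 1):
            plan = (a0, a1, a2)
            water = sum(w * a for w, a in zip(water_per_ha, plan))
            if water > total_water:
                continue
            income = sum(p * a for p, a in zip(income_per_ha, plan))
            if income > best[0]:
                best = (income, plan)

print(best)  # → (104000, (20, 0, 0))
```

The paper's monthly five-year model adds investment and harvest restrictions as further constraints of the same linear form.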
Procedia PDF Downloads 438
6937 A Study on the Impact of Artificial Intelligence on Human Society and the Necessity for Setting up the Boundaries on AI Intrusion
Authors: Swarna Pundir, Prabuddha Hans
Abstract:
As AI has already stepped into the daily life of human society, one cannot be ignorant of the data it collects and uses to provide a quality of service that depends on the individual's choices. It also helps in decision-making versus choice selection, with calculations based on the history of our search criteria. Over the past decade or so, the impact Artificial Intelligence (AI) has had on society is undoubtedly large. AI has changed the way we shop, the way we entertain and challenge ourselves, and the way information is handled, and it has automated some sections of our lives. We have answered what AI is, but not why one may see it as useful. AI is useful because it is capable of learning and predicting outcomes using Machine Learning (ML) and Deep Learning (DL) with the help of Artificial Neural Networks (ANN). AI can also be a system that acts like humans. One of the major impacts is joblessness through automation via AI, seen mostly in manufacturing sectors, especially in routine manual and blue-collar occupations and among those without a college degree. AI raises serious concerns regarding reduced employment, ethics in making moral decisions, individuals' privacy, human judgement, natural emotions, biased decisions, and discrimination. So the question is: if an error occurs, who will be responsible, or will it just be waved off as a "machine error" with no one taking responsibility for any wrongdoing? It is essential to form rules for using AI wherever both machines and humans are involved. Procedia PDF Downloads 98
6936 Outcome of Comparison between Partial Thickness Skin Graft Harvesting from Scalp and Lower Limb for Scalp Defect: A Clinical Trial Study
Authors: Mahdi Eskandarlou, Mehrdad Taghipour
Abstract:
Background: Partial-thickness skin grafting is the cornerstone of scalp defect repair. Routine donor sites include the abdomen, thighs, and buttocks. Given the potential side effects of harvesting from these sites and the potential advantages of harvesting from the scalp (broad surface, rapid healing, and better cosmetic results), this study compares the outcomes of graft harvesting from the scalp and the lower limb. Methods: This clinical trial was conducted among 40 partial-thickness graft candidates (20 in the case group and 20 in the control group) with scalp defects presenting to the plastic surgery clinic at Besat Hospital between 2018 and 2019. Sampling was done by simple randomization using a random digit table. Data gathering was performed using a designated checklist. The donor site was the scalp in the case group and the lower limb in the control group. The resultant data were analyzed using the chi-squared test and t-test in SPSS version 21 (SPSS Statistics for Windows, Version 21.0. Armonk, NY: IBM Corp). Results: Of the 40 patients participating in this study, 28 (70%) were male and 12 (30%) were female, with a mean age of 63.62 ± 09.73 years. Hypertension and diabetes mellitus were the most common comorbidities, with basal cell carcinoma (BCC) and trauma being the most common etiologies of the defects. There was a statistically significant difference between the two groups regarding the etiology of the defect (P=0.02). The most common anatomic location of the defect was temporal in the case group and parietal in the control group. Most of the defects were deep to the galea. The mean diameter of the defect was 24.28 ± 45.37 mm across all patients. The difference in defect diameter between the groups was statistically significant, while no such difference was seen in graft diameter. Graft take was completely successful in both groups according to the evaluations. The level of postoperative pain was lower in the case group than in the control group according to the VAS scale, and satisfaction was higher per the Likert scale. Conclusion: The scalp can safely be used as a donor site for skin grafts for scalp defects and is associated with better results and lower complication rates compared to other donor sites. Keywords: donor site, leg, partial-thickness graft, scalp
Procedia PDF Downloads 150
6935 Combined Fuzzy and Predictive Controller for Unity Power Factor Converter
Authors: Abdelhalim Kessal
Abstract:
This paper presents the design of a combined control scheme for a single-phase power factor correction (PFC) converter. The proposed control strategy has two parts: the first handles the outer loop (regulated DC output voltage), and the second governs the input current of the converter in order to achieve a sinusoidal waveform in phase with the grid voltage. Two kinds of regulators are used: a fuzzy controller for the outer loop and a predictive controller for the inner loop. The controllers are verified and discussed through simulation on the MATLAB/Simulink platform, and experimental confirmation is also provided. The results show high dynamic performance under various parameter changes. Keywords: boost converter, harmonic distortion, fuzzy, predictive, unity power factor
Procedia PDF Downloads 492
6934 Thermal Radiation and Noise Safety Assessment of an Offshore Platform Flare Stack as Sudden Emergency Relief Takes Place
Authors: Lai Xuejiang, Huang Li, Yang Yi
Abstract:
To study the potential hazards of a sudden emergency relief of the flare stack, thermal radiation and noise calculations of the flare stack were carried out using the Flaresim 2.0 program. Thermal radiation and noise should be analyzed when sudden emergency relief takes place. According to the Flaresim simulation results, the thermal radiation and noise meet the requirements. Keywords: flare stack, thermal radiation, safety assessment, noise
Procedia PDF Downloads 355
6933 Modeling Reflection and Transmission of Elastodiffusive Waves at a Semiconductor Interface
Authors: Amit Sharma, J. N. Sharma
Abstract:
This paper studies the reflection and transmission characteristics of acoustic waves at the interface of a semiconductor half-space and an elastic solid. The amplitude ratios (reflection and transmission coefficients) of the reflected and transmitted waves relative to the incident wave have been examined as functions of the incident angle for the case of a quasi-longitudinal wave. The special cases of normal and grazing incidence have also been derived with the help of the Gauss elimination method. The mathematical model, consisting of the governing partial differential equations of motion and of charge-carrier diffusion in n-type semiconductors and elastic solids, has been solved both analytically and numerically. The numerical computation of the reflection and transmission coefficients has been carried out in MATLAB for a silicon (Si) semiconductor and a copper elastic solid, and the simulated results have been plotted graphically for Si semiconductors. The study may be useful in semiconductors, geology, and seismology, in addition to surface acoustic wave (SAW) devices. Keywords: quasilongitudinal, reflection and transmission, semiconductors, acoustics
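The Gauss elimination step used for the amplitude-ratio systems can be illustrated with a small self-contained solver; the 3x3 system below is an arbitrary stand-in for the boundary-condition equations, not the paper's actual system.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(M[i][k]))  # pivot row
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            for j in range(k, n + 1):
                M[i][j] -= f * M[k][j]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):  # back substitution
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

# Arbitrary 3x3 stand-in for the boundary-condition system.
A = [[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
print([round(v, 6) for v in gauss_solve(A, b)])  # → [2.0, 3.0, -1.0]
```

For the complex-valued systems arising from the actual boundary conditions, the same elimination applies with complex arithmetic.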
Procedia PDF Downloads 391
6932 Credit Risk Assessment Using Rule Based Classifiers: A Comparative Study
Authors: Salima Smiti, Ines Gasmi, Makram Soui
Abstract:
Credit risk is the most important issue for financial institutions. Its assessment is used to predict defaulting customers and to classify customers as good or bad payers. To this end, numerous techniques have been applied for credit risk assessment. However, many evaluation techniques are black-box models, such as neural networks and SVMs, which generate applicant classes without any explanation. In this paper, we propose to assess credit risk using a rule-based classification method whose output is a set of rules that describe and explain the decision. To this end, we compare seven classification algorithms (JRip, Decision Table, OneR, ZeroR, Fuzzy Rule, PART, and Genetic Programming (GP)), where the goal is to find the best rules satisfying several criteria: accuracy, sensitivity, and specificity. The obtained results confirm the efficiency of the GP algorithm on the German and Australian datasets compared to the other rule-based techniques for predicting credit risk. Keywords: credit risk assessment, classification algorithms, data mining, rule extraction
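Among the compared algorithms, OneR is simple enough to sketch in a few lines: it keeps the single attribute whose one-attribute rule set makes the fewest training errors. The toy applicant data below are invented for illustration, not drawn from the German or Australian datasets.

```python
from collections import Counter, defaultdict

def one_r(rows, labels):
    """Return (errors, attribute index, value->class rule) for the
    single attribute whose rule set misclassifies the fewest rows."""
    best = None
    for a in range(len(rows[0])):
        buckets = defaultdict(Counter)   # attribute value -> class counts
        for row, y in zip(rows, labels):
            buckets[row[a]][y] += 1
        # Each attribute value predicts its majority class.
        rule = {v: c.most_common(1)[0][0] for v, c in buckets.items()}
        errors = sum(sum(c.values()) - c[rule[v]] for v, c in buckets.items())
        if best is None or errors < best[0]:
            best = (errors, a, rule)
    return best

# Invented applicant data: (income level, has collateral) -> payer class.
rows = [("high", "yes"), ("high", "no"), ("low", "no"),
        ("low", "yes"), ("high", "yes"), ("low", "no")]
labels = ["good", "good", "bad", "good", "good", "bad"]
errors, attr, rule = one_r(rows, labels)
print(attr, rule, errors)  # → 0 {'high': 'good', 'low': 'bad'} 1
```

The richer rule learners in the comparison (JRip, PART, GP) generalize this idea to multi-condition rules, trading simplicity for accuracy.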
Procedia PDF Downloads 181
6931 CT Image-Based Dense Facial Soft Tissue Thickness Measurement by Open-Source Tools in a Chinese Population
Authors: Ye Xue, Zhenhua Deng
Abstract:
Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks located manually on the face and skull. However, automated measurement over dense points on 3D facial and skull models using open-source software has become a viable option thanks to the development of computer-assisted imaging technologies. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Establishing a comprehensive, detailed, and densely calculated FSTT database is therefore crucial to enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants originally born in and residing in northern China and 80 in southern China. The age of the participants ranged from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Samples were also divided into three categories based on BMI. The 3D Slicer software was used to segment bone and soft tissue based on different Hounsfield unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were performed using MeshLab: converting the face models into hollowed, cropped surface models and automatically measuring the Hausdorff distance (referred to as FSTT) between the skull and face models. Hausdorff point clouds were colorized based on depth value and exported as PLY files. A histogram of the depth distributions could be viewed and subdivided into smaller increments. All PLY files were visualized with the Hausdorff distance value of each vertex. Basic descriptive statistics (mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analyzed with respect to sex, age, BMI, and birthplace. Statistical methods employed included multiple regression analysis, ANOVA, and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the results of the PCA. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in regions such as the forehead, orbital, mandibular, and zygoma regions. Specifically, there are distribution variances in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead compared to southern males, while females show fewer distribution differences between north and south, except for the zygoma region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the distribution of FSTT in Chinese individuals and suggests that open-source tools perform well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation. Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool
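The per-vertex face-to-skull distance that MeshLab colorizes can be illustrated with a tiny nearest-point computation; the directed Hausdorff value is the maximum of those per-vertex distances. The point clouds below are toy stand-ins for real face and skull meshes, which would carry thousands of vertices.

```python
import math

def nearest_distances(src, dst):
    """Distance from each vertex in src to its nearest vertex in dst —
    the per-vertex quantity colorized as depth in the PLY files."""
    return [min(math.dist(p, q) for q in dst) for p in src]

def directed_hausdorff(src, dst):
    """Directed Hausdorff distance: the largest of those distances."""
    return max(nearest_distances(src, dst))

# Toy stand-ins for face and skull vertex clouds.
face = [(0.0, 0.0, 5.0), (1.0, 0.0, 6.0), (0.0, 1.0, 7.0)]
skull = [(0.0, 0.0, 0.0), (1.0, 0.0, 1.0), (0.0, 1.0, 2.0)]
print(directed_hausdorff(face, skull))  # → 5.0
```

MeshLab's filter samples the surfaces rather than only their vertices, but the vertex-to-nearest-point idea is the same.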
Procedia PDF Downloads 58
6930 Cultivating Individuality and Equality in Education: A Literature Review on Respecting Dimensions of Diversity within the Classroom
Authors: Melissa C. Ingram
Abstract:
This literature review sought to explore the dimensions of diversity that can affect classroom learning. This review is significant as it can aid educators in reaching more of their diverse student population and creating supportive classrooms for teachers and students. For this study, peer-reviewed articles were found and compiled using Google Scholar. Key terms used in the search include student individuality, classroom equality, student development, teacher development, and teacher individuality. Relevant educational standards such as Common Core and Partnership for the 21st Century were also included as part of this review. Student and teacher individuality and equality is discussed as well as methods to grow both within educational settings. Embracing student and teacher individuality was found to be key as it may affect how each person interacts with given information. One method to grow individuality and equality in educational settings included drafting and employing revised teaching standards which include various Common Core and U.S. State standards. Another was to use educational theories such as constructivism, cognitive learning, and Experiential Learning Theory. However, barriers to growing individuality, such as not acknowledging differences in a population’s dimensions of diversity, still exist. Studies found preserving the dimensions of diversity owned by both teachers and students yielded more positive and beneficial classroom experiences.Keywords: classroom equality, student development, student individuality, teacher development, teacher individuality
Procedia PDF Downloads 188
6929 Electrical Cardiac Remodeling in Elite Athletes: A Comparative Study between Triathletes and Cyclists
Authors: Lingxia Li, Frédéric Schnell, Thibault Lachard, Anne-Charlotte Dupont, Shuzhe Ding, Solène Le Douairon Lahaye
Abstract:
Background: Repetitive participation in triathlon training results in significant myocardial changes. However, whether the cardiac remodeling in triathletes is related to the specificities of the sport (consisting of three sports) raises questions. Methods: Elite triathletes and cyclists registered on the French ministerial lists of high-level athletes were involved. The basic information and routine electrocardiogram records were obtained. Electrocardiograms were evaluated according to clinical criteria. Results: Of the 105 athletes included in the study, 42 were from the short-distance triathlon (40%), and 63 were from the road cycling (60%). The average age was 22.1±4.2 years. The P wave amplitude was significantly lower in triathletes than in cyclists (p=0.005), and no significant statistical difference was found in heart rate, RR interval, PR or PQ interval, QRS complex, QRS axe, QT interval, and QTc (p>0.05). All the measured parameters were within normal ranges. The most common electrical manifestations were early repolarization (60.95%) and incomplete right bundle branch block (43.81%); there was no statistical difference between the groups (p>0.05). Conclusions: Prolonged intensive endurance exercise training induces physiological cardiac remodeling in both triathletes and cyclists. The most common electrocardiogram manifestations were early repolarization and incomplete right bundle branch block.Keywords: cardiac screening, electrocardiogram, triathlon, cycling, elite athletes
Procedia PDF Downloads 7
6928 Using LMS as an E-Learning Platform in Higher Education
Authors: Mohammed Alhawiti
Abstract:
Assessment of learning management systems has received less attention than it deserves. This paper investigates the evaluation of learning management systems (LMS) within an educational setting, both as an online learning system and as a helpful tool for a multidisciplinary learning environment. The study proposes a theoretical e-learning evaluation model covering multiple dimensions: system, service, and content quality, the learner's perspective, and the attitudes of the instructor. A survey was conducted among 105 e-learners; the sample consisted of students at both undergraduate and master's levels. Content validity and reliability of the instrument were tested. The findings suggest the suitability of the proposed model for evaluating learner satisfaction with the LMS. The results of this study would be valuable for both instructors and users of e-learning systems. Keywords: e-learning, LMS, higher education, management systems
Procedia PDF Downloads 405
6927 An Intelligent Troubleshooting System and Performance Evaluator for Computer Network
Authors: Iliya Musa Adamu
Abstract:
This paper seeks to develop an expert system that troubleshoots computer networks and evaluates network performance so as to reduce the workload on technicians and increase the efficiency and effectiveness of the solutions proffered for computer network problems. The platform of the system was developed using ASP.NET, while the code is implemented in Visual Basic and integrated with SQL Server 2005. The knowledge base is represented using production rules, and the search method used in developing the network troubleshooting expert system is a forward-chaining rule-based system. This software tool offers the advantage of providing an immediate solution to most computer network problems encountered by computer users. Keywords: expert system, forward chaining rule based system, network, troubleshooting
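The forward-chaining search such a system uses can be sketched as a naive rule engine that keeps firing production rules until no new conclusion appears; the troubleshooting rules below are invented examples, not the system's actual knowledge base.

```python
def forward_chain(facts, rules):
    """Naive forward-chaining inference: repeatedly fire any rule whose
    conditions are all satisfied until no new fact is derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conclusion not in facts and set(conditions) <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical network-troubleshooting production rules.
rules = [
    (["ping gateway fails", "cable plugged in"], "suspect switch port"),
    (["suspect switch port", "port LED off"], "replace patch cable"),
]
observed = ["ping gateway fails", "cable plugged in", "port LED off"]
derived = forward_chain(observed, rules)
print(sorted(derived))
```

A production system stores many such rules; chaining lets an intermediate diagnosis ("suspect switch port") trigger a concrete remedy in a later pass.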
Procedia PDF Downloads 647
6926 Application of Transportation Linear Programming Algorithms to Cost Reduction in Nigeria Soft Drinks Industry
Authors: Salami Akeem Olanrewaju
Abstract:
Transportation models or problems are primarily concerned with the optimal (best possible) way in which a product produced at different factories or plants (called supply origins) can be transported to a number of warehouses or customers (called demand destinations). The objective in a transportation problem is to fully satisfy the destination requirements within the operating production capacity constraints at the minimum possible cost. The objective of this study is to determine ways of minimizing transportation cost in order to maximize profit. Data were gathered from the records of the Distribution Department of 7-Up Bottling Company Plc, Ilorin, Kwara State, Nigeria. The data were analyzed using SPSS (Statistical Package for the Social Sciences) while applying three methods of solving a transportation problem. The three methods produced the same results; therefore, any of the methods can be adopted by the company in transporting its final products to the wholesale dealers in order to minimize total production cost. Keywords: cost minimization, resources utilization, distribution system, allocation problem
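One of the classical methods for building an initial feasible solution to such a problem, the north-west corner method, can be sketched as follows; the supply, demand, and cost figures are illustrative, not the company's data.

```python
def northwest_corner(supply, demand):
    """Build an initial feasible allocation for a balanced
    transportation problem using the north-west corner method."""
    supply, demand = supply[:], demand[:]   # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])     # ship as much as possible
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1                          # plant exhausted: move down
        else:
            j += 1                          # depot satisfied: move right
    return alloc

# Hypothetical balanced instance: 2 plants, 3 depots.
supply = [30, 40]
demand = [20, 30, 20]
cost = [[4, 6, 8], [5, 3, 7]]
alloc = northwest_corner(supply, demand)
total = sum(cost[i][j] * alloc[i][j] for i in range(2) for j in range(3))
print(alloc, total)  # → [[20, 10, 0], [0, 20, 20]] 340
```

This initial plan would then be improved toward optimality (e.g., by the stepping-stone or MODI method), which is how the different solution methods can converge on the same minimum cost.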
Procedia PDF Downloads 257
6925 Model of Production and Marketing Strategies in Alignment with Business Strategy using QFD Approach
Authors: Hamed Saremi, Suzan Taghavy, Shahla Saremi
Abstract:
In today's competitive world, organizations are expected to surpass their competitors and make the best use of available resources. The need to improve current performance is therefore felt more than ever, and doing so requires identifying optimal organizational strategies and considering all strategies simultaneously. In this study, to enhance competitive advantage in line with customer requirements, the House of Quality (QFD) approach is used to align business, production, and marketing strategies, and a zero-one linear programming model is formulated. First, to align production and marketing strategies with the business strategy, the independent weights of these strategies are calculated. Then, using the QFD approach, the aligned weights of the optimal strategies in each production and marketing field are obtained. Finally, the aligned marketing strategies are selected with the purpose of allocating budget and specialist human resources to marketing functions, leading to increased competitive advantage and benefit. Keywords: strategy alignment, house of quality deployment, production strategy, marketing strategy, business strategy
Procedia PDF Downloads 435
6924 Simulink Library for Reference Current Generation in Active DC Traction Substations
Authors: Mihaela Popescu, Alexandru Bitoleanu
Abstract:
This paper focuses on the reference current calculation in the compensation mode of active DC traction substations. The so-called p-q theory of instantaneous reactive power is used as the theoretical foundation. The goal of total compensation is considered for operation under both sinusoidal and nonsinusoidal voltage conditions, through the two objectives of unity power factor and perfect harmonic cancellation. Four blocks of reference current generation implement the conceived algorithms and are included in a specific Simulink library, which is useful in a DSP dSPACE-based platform working under MATLAB/Simulink. The simulation results validate the correctness of the implementation and the fulfillment of the compensation tasks. Keywords: active power filter, DC traction, p-q theory, Simulink library
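The p-q theory quantities underlying such reference-current blocks can be written out directly. The sample voltages and currents below are arbitrary, and the sign convention for q shown here is one common textbook choice, not necessarily the one the library adopts.

```python
# Instantaneous p-q theory quantities in the alpha-beta frame
# (illustrative sample values, single time instant).
v_alpha, v_beta = 311.0, 0.0     # instantaneous voltages [V]
i_alpha, i_beta = 10.0, -2.0     # instantaneous currents [A]

p = v_alpha * i_alpha + v_beta * i_beta   # instantaneous real power
q = v_beta * i_alpha - v_alpha * i_beta   # instantaneous imaginary power

# For unity power factor, the compensator supplies q (and any
# oscillating part of p), so the source carries only the mean real power.
print(p, q)  # → 3110.0 622.0
```

In a real filter these expressions run sample by sample on the DSP, with a low-pass stage separating the mean and oscillating parts of p.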
Procedia PDF Downloads 674
6923 Miniaturizing the Volumetric Titration of Free Nitric Acid in U(VI) Solutions: On the Lookout for a More Sustainable Process in Radioanalytical Chemistry through Titration-On-A-Chip
Authors: Jose Neri, Fabrice Canto, Alastair Magnaldo, Laurent Guillerme, Vincent Dugas
Abstract:
A miniaturized and automated approach for the volumetric titration of free nitric acid in U(VI) solutions is presented. Free acidity measurement refers to quantifying the acidity of solutions containing hydrolysable heavy metal ions such as U(VI), U(IV), or Pu(IV) without taking into account the acidity contribution from the hydrolysis of such metal ions. It is, in fact, an operation with an essential role in the control of the nuclear fuel recycling process. The main objectives behind the technical optimization of the current 'beaker' method were to reduce the amount of radioactive substance handled by laboratory personnel, to ease instrument adjustment within a glove-box environment, and to allow high-throughput analysis for more cost-effective operations. The measurement technique is based on the concept of Taylor-Aris dispersion, creating a linear concentration gradient inside a 200 μm x 5 cm circular cylindrical micro-channel in less than a second. The proposed analytical methodology relies on actinide complexation using a pH 5.6 sodium oxalate solution and subsequent alkalimetric titration of the nitric acid with sodium hydroxide. The titration process is followed with a CCD camera for fluorescence detection; the neutralization boundary can be visualized in a detection range of 500 nm to 600 nm thanks to the addition of a pH-sensitive fluorophore. The operating principle of the developed device allows the active generation of linear concentration gradients using a single cylindrical micro-channel. This feature simplifies the fabrication and ease of use of the micro-device, as it does not need a complex micro-channel network or passive mixers to generate the chemical gradient. Moreover, since the linear gradient is determined by the input pressure of the liquid reagents, it can be generated in under a second, a more time-efficient process than in other source-sink passive diffusion devices. The resulting linear gradient generator device was therefore adapted to perform, for the first time, a volumetric titration on a chip where the amount of reagents used is fixed by the total volume of the micro-channel, avoiding the substantial waste generation of other flow-based titration techniques. The associated analytical method is automated, and its linearity has been proven for the free acidity determination of U(VI) samples containing up to 0.5 M of actinide ion and nitric acid in a concentration range of 0.5 M to 3 M. In addition to automation, the developed methodology and technique greatly improve on the standard off-line oxalate complexation and alkalimetric titration method by reducing the required sample volume a thousand-fold, the nuclear waste per analysis forty-fold, and the analysis time eight-fold. The developed device therefore represents a great step towards an easy-to-handle nuclear-related application, which in the short term could be used to improve laboratory safety as much as to reduce the environmental impact of the radioanalytical chain. Keywords: free acidity, lab-on-a-chip, linear concentration gradient, Taylor-Aris dispersion, volumetric titration
Procedia PDF Downloads 387
6922 Volunteered Geographic Information Coupled with Wildfire Fire Progression Maps: A Spatial and Temporal Tool for Incident Storytelling
Authors: Cassandra Hansen, Paul Doherty, Chris Ferner, German Whitley, Holly Torpey
Abstract:
Wildfire is a natural and inevitable occurrence, yet changing climatic conditions have increased the severity, frequency, and risk to human populations in the wildland/urban interface (WUI) of the Western United States. Rapid dissemination of accurate wildfire information is critical to both the Incident Management Team (IMT) and the affected community. With the advent of increasingly sophisticated information systems, GIS can now be used as a web platform for sharing geographic information in new and innovative ways, such as virtual story map applications. Crowdsourced information can be extraordinarily useful when coupled with authoritative information. Information abounds in the form of social media, emergency alerts, radio, and news outlets, yet many of these resources lack a spatial component when first distributed. In this study, we describe how twenty-eight volunteer GIS professionals across nine Geographic Area Coordination Centers (GACC) sourced, curated, and distributed Volunteered Geographic Information (VGI) from authoritative social media accounts focused on disseminating information about wildfires and public safety. The combination of fire progression maps with VGI incident information helps answer three critical questions about an incident, such as: where the first started. How and why the fire behaved in an extreme manner and how we can learn from the fire incident's story to respond and prepare for future fires in this area. By adding a spatial component to that shared information, this team has been able to visualize shared information about wildfire starts in an interactive map that answers three critical questions in a more intuitive way. Additionally, long-term social and technical impacts on communities are examined in relation to situational awareness of the disaster through map layers and agency links, the number of views in a particular region of a disaster, community involvement and sharing of this critical resource. 
Combined with a GIS platform and disaster VGI applications, this workflow and information become invaluable to communities within the WUI, bringing spatial awareness for disaster preparedness, response, mitigation, and recovery. This study highlights progression maps as the ultimate storytelling mechanism through incident case studies and demonstrates how VGI and sophisticated applied cartographic methodology make this an indispensable resource for authoritative information sharing.Keywords: storytelling, wildfire progression maps, volunteered geographic information, spatial and temporal
Procedia PDF Downloads 176
6921 The Museum of Museums: A Mobile Augmented Reality Application
Authors: Qian Jin
Abstract:
Museums have been using interactive technology to spark visitor interest and improve understanding. These technologies can play a crucial role in helping visitors understand more about an exhibition site by using multimedia to provide information. Google Arts and Culture and Smartify are two very successful digital heritage products. They used mobile augmented reality to visualise museums' 3D models and heritage images but did not include 3D models of the collections or audio information. In this research, a service-oriented mobile augmented reality application was developed for users to access collections from multiple museums (including the V and A, the British Museum, and the British Library). Third-party APIs (Application Programming Interfaces) are queried to collect metadata (including images, 3D models, videos, and text) for the three museums' collections. The acquired content is then visualized in AR environments. This product will help users who cannot visit the museum in person for various reasons (inconvenient transportation, physical disability, scheduling constraints).Keywords: digital heritage, augmented reality, museum, flutter, ARCore
Procedia PDF Downloads 78
6920 Teaching Students Empathy: Justifying Diverse and Inclusive Texts
Authors: Jennifer Wallbrown
Abstract:
It’s not uncommon in the US to see news article headlines about public school teachers being scrutinized for what they are teaching, or to see the general public weighing in on whether certain controversial subjects, such as LGBTQ+ or multicultural literature, should be addressed in the classroom. Even though this subject has been written about and discussed for years, it remains a relevant topic in education because implementing more diverse texts continues to be a struggle. Although it is valid for teachers to fear controversy when they attempt to create a more diverse or inclusive curriculum, it is a fight worth fighting because of the benefits students gain from being exposed to a wide range of texts. This paper is different from others of its kind because it addresses many of the counterarguments often made against implementing LGBTQ+ or multicultural literature in secondary classrooms. It not only encourages educators to include more diverse texts but also gives them the tools to address common concerns and to be sound in their reasoning for choosing these texts. This can be of interest to educators who are not English teachers because a truly diverse and inclusive curriculum would include other subjects as well, including history, art, and more. By the end of my proposed paper, readers will feel encouraged to choose more diverse and inclusive texts for their classrooms. They can also be confident that, if met with opposition or controversy, as is sometimes common when implementing new texts, they have sound arguments and reasoning for why they chose to include these texts. This reasoning is that, based on the research, studies have found benefits to students studying texts about those different from themselves: doing so teaches them empathy and helps fight prejudice.Keywords: education, diverse, inclusive, multicultural, lgbtq+, pedagogy
Procedia PDF Downloads 160
6919 Adaptation Model for Building Agile Pronunciation Dictionaries Using Phonemic Distance Measurements
Authors: Akella Amarendra Babu, Rama Devi Yellasiri, Natukula Sainath
Abstract:
While human beings can easily learn and adopt pronunciation variations, machines need training before being put into use. Humans also keep a minimal vocabulary, with its pronunciation variations stored at the front of memory for ready reference, while machines keep the entire pronunciation dictionary for ready reference. Supervised methods used for the preparation of pronunciation dictionaries require large amounts of manual effort, cost, and time, and are not suitable for real-time use. This paper presents an unsupervised adaptation model for building agile and dynamic pronunciation dictionaries online. These methods mimic the human approach of learning new pronunciations in real time. A new algorithm for measuring sound distances, called Dynamic Phone Warping, is presented and tested. Performance of the system is measured using an adaptation model, and precision is found to be better than 86 percent.Keywords: pronunciation variations, dynamic programming, machine learning, natural language processing
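The abstract does not spell out how Dynamic Phone Warping computes its distances, but as a dynamic-programming sound-distance measure it is plausibly in the family of edit-distance alignment over phone sequences. The following is a minimal sketch in that spirit; the phone labels and substitution costs are illustrative assumptions, not the authors' values.

```python
# Edit-distance-style dynamic program over phone sequences, a hedged sketch
# of a phonemic distance measure. Costs below are assumed for illustration.

SUB_COST = {
    ("d", "t"): 0.3,    # voiced/voiceless pair: cheap substitution (assumed)
    ("ey", "ae"): 0.4,  # similar vowels (assumed)
}
INS_DEL_COST = 1.0

def sub_cost(a, b):
    if a == b:
        return 0.0
    return SUB_COST.get((a, b)) or SUB_COST.get((b, a)) or 1.0

def phone_distance(ref, hyp):
    """Alignment cost between two phone sequences via dynamic programming."""
    n, m = len(ref), len(hyp)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * INS_DEL_COST
    for j in range(1, m + 1):
        d[0][j] = j * INS_DEL_COST
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i][j] = min(
                d[i - 1][j] + INS_DEL_COST,   # deletion
                d[i][j - 1] + INS_DEL_COST,   # insertion
                d[i - 1][j - 1] + sub_cost(ref[i - 1], hyp[j - 1]),
            )
    return d[n][m]

# Two pronunciations of "data": /d ey t ah/ vs. /d ae t ah/
print(phone_distance(["d", "ey", "t", "ah"], ["d", "ae", "t", "ah"]))
```

A low alignment cost would signal that the new pronunciation is a variant of an existing entry rather than a new word, which is the kind of decision an online dictionary-adaptation step needs to make.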
Procedia PDF Downloads 176
6918 Competence-Based Human Resources Selection and Training: Making Decisions
Authors: O. Starineca, I. Voronchuk
Abstract:
Human Resources (HR) selection and training have various implementation possibilities depending on an organization’s abilities and peculiarities. We propose to base HR selection and training decisions on a competence-based approach. HR selection and training of employees are topical as there is room for improvement in this field; therefore, the aim of the research is to propose rational decision-making approaches for an organization’s HR selection and training choices. Our proposals are based on the training-development and competence-based selection approaches created in previous research, i.e., the Analytic Hierarchy Process (AHP) and Linear Programming. A literature review on non-formal education, competence-based selection, and AHP forms our theoretical background. Some educational service providers in Latvia offer employee training, e.g., motivation, computer skills, accounting, law, ethics, stress management, etc., that is topical for Public Administration. A competence-based approach is a rational basis for decision-making in both HR selection and HR training.Keywords: competence-based selection, human resource, training, decision-making
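The AHP step the abstract relies on turns pairwise importance judgments about selection criteria into a weight vector. A minimal sketch of that computation, using the geometric-mean method, is shown below; the criteria names and the comparison matrix are invented for illustration and are not from the authors' study.

```python
import math

# AHP weight derivation via the geometric-mean method. The criteria and the
# pairwise judgments are illustrative assumptions, not the paper's data.

criteria = ["communication", "computer skills", "accounting"]
# pairwise[i][j] = how many times more important criterion i is than j
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]

def ahp_weights(matrix):
    """Normalize the row geometric means into a priority (weight) vector."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

weights = ahp_weights(pairwise)
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
```

The resulting weights can then feed a linear-programming model, e.g. to choose among candidate training courses under a budget constraint.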
Procedia PDF Downloads 337
6917 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process based on big data and machine-learning technology that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Further to that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculation of weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
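The study itself used RapidMiner Studio on actual Nonhyeon-dong sales records, but the underlying idea of fitting a weighted-influence prediction model from previous examples can be sketched with an ordinary least-squares fit of price against a single building condition. The sample points below are invented purely for illustration.

```python
# Hedged sketch of learning a price predictor from previous examples:
# one-feature ordinary least squares. Figures are invented; the study used
# RapidMiner Studio on real building sales data.

# (floor area in m^2, sale price) -- illustrative, not real market data
samples = [(60, 600), (80, 790), (100, 1010), (120, 1180)]

def ols_fit(points):
    """Return (slope, intercept) minimizing squared prediction error."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in points)
    sxx = sum((x - mean_x) ** 2 for x, _ in points)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

slope, intercept = ols_fit(samples)
print(f"price ~= {slope:.2f} * area + {intercept:.2f}")
print(f"prediction for a new 90 m^2 example: {slope * 90 + intercept:.1f}")
```

In the full model each building condition would get its own learned weight, and the magnitudes of those weights are what identify the crucial price-driving factors.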
Procedia PDF Downloads 374
6916 Multi-Criteria Decision Making Network Optimization for Green Supply Chains
Authors: Bandar A. Alkhayyal
Abstract:
Modern supply chains are typically linear, transforming virgin raw materials into products for end consumers, who then discard them after use to landfills or incinerators. Nowadays, there are major efforts underway to create a circular economy to reduce non-renewable resource use and waste. One important aspect of these efforts is the development of Green Supply Chain (GSC) systems, which enable a reverse flow of used products from consumers back to manufacturers, where they can be refurbished or remanufactured, to both economic and environmental benefit. This paper develops novel multi-objective optimization models to inform GSC system design at multiple levels: (1) strategic planning of facility location and transportation logistics; (2) tactical planning of optimal pricing; and (3) policy planning to account for potential valuation of GSC emissions. First, physical linear programming was applied to evaluate GSC facility placement by determining the quantities of end-of-life products for transport from candidate collection centers to remanufacturing facilities while satisfying cost and capacity criteria. Second, disassembly and remanufacturing processes have received little attention in the industrial engineering and process cost modeling literature. The increasing scale of remanufacturing operations, worth nearly $50 billion annually in the United States alone, has made GSC pricing an important subject of research. A non-linear physical programming model for optimization of pricing policy for remanufactured products, maximizing total profit and minimizing product recovery costs, was examined and solved. Finally, a deterministic equilibrium model was used to determine the effects of internalizing a cost of GSC greenhouse gas (GHG) emissions into optimization models.
Changes in optimal facility use, transportation logistics, and pricing/profit margins were all investigated against a variable cost of carbon, using a case-study system created from actual data for sites in the Boston area. As carbon costs increase, the optimal GSC system undergoes several distinct shifts in topology as it seeks new cost-minimal configurations. A comprehensive study of the quantitative evaluation and performance of the model was done using orthogonal arrays. Results were compared to top-down estimates from economic input-output life cycle assessment (EIO-LCA) models, to contrast remanufacturing GHG emission quantities with those from original equipment manufacturing operations. Introducing a carbon cost of $40/t CO2e increases modeled remanufacturing costs by 2.7% but also increases original equipment costs by 2.3%. The assembled work advances the theoretical modeling of optimal GSC systems and presents a rare case study of remanufactured appliances.Keywords: circular economy, extended producer responsibility, greenhouse gas emissions, industrial ecology, low carbon logistics, green supply chains
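The strategic-planning step described above decides how end-of-life product volumes flow from collection centers to remanufacturing facilities at minimum transport cost under capacity limits. A toy version of that decision can be sketched by brute-force enumeration; all volumes, costs, and capacities below are assumed for illustration, and the paper's actual method is physical linear programming on real Boston-area data.

```python
from itertools import product

# Toy facility-assignment sketch: each collection center ships its whole
# volume to one remanufacturing facility; we enumerate all assignments and
# keep the cheapest one that respects facility capacity. All numbers assumed.

volumes = {"center_A": 40, "center_B": 25, "center_C": 35}   # units collected
capacity = {"facility_1": 70, "facility_2": 60}              # units processable
cost = {   # transport cost per unit (assumed)
    ("center_A", "facility_1"): 2.0, ("center_A", "facility_2"): 3.5,
    ("center_B", "facility_1"): 4.0, ("center_B", "facility_2"): 1.5,
    ("center_C", "facility_1"): 2.5, ("center_C", "facility_2"): 2.0,
}

def best_assignment():
    """Exhaustive search for the cost-minimal capacity-feasible assignment."""
    centers, facilities = sorted(volumes), sorted(capacity)
    best = None
    for choice in product(facilities, repeat=len(centers)):
        load = {f: 0 for f in facilities}
        for c, f in zip(centers, choice):
            load[f] += volumes[c]
        if any(load[f] > capacity[f] for f in facilities):
            continue  # violates a facility capacity constraint
        total = sum(volumes[c] * cost[(c, f)] for c, f in zip(centers, choice))
        if best is None or total < best[0]:
            best = (total, dict(zip(centers, choice)))
    return best

total_cost, plan = best_assignment()
print(plan, total_cost)
```

A real instance would use an LP solver instead of enumeration and allow split shipments, but the structure of the decision (flows, capacities, per-unit costs) is the same.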
Procedia PDF Downloads 160
6915 Contextual Sentiment Analysis with Untrained Annotators
Authors: Lucas A. Silva, Carla R. Aguiar
Abstract:
This work presents a proposal to perform contextual sentiment analysis using a supervised learning algorithm while disregarding the extensive training of annotators. To achieve this goal, a web platform was developed to perform the entire procedure outlined in this paper. The main contribution of the pipeline described in this article is to simplify and automate the annotation process through a system that analyzes the congruence between annotations. This ensured satisfactory results even without using annotators specialized in the context of the research, avoiding the generation of biased training data for the classifiers. For this, a case study was conducted on an entrepreneurship blog. The experimental results were consistent with those reported in the literature for annotation using a formalized process with experts.Keywords: sentiment analysis, untrained annotators, naive bayes, entrepreneurship, contextualized classifier
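The keywords name Naive Bayes as the classifier, so the supervised step can be sketched as a multinomial Naive Bayes model with Laplace smoothing. The four tiny training sentences below stand in for the crowd-annotated blog comments and are invented for illustration.

```python
import math
from collections import Counter

# Minimal multinomial Naive Bayes sentiment sketch. The training texts are
# illustrative stand-ins for the congruence-filtered crowd annotations.

train = [
    ("great idea inspiring post", "pos"),
    ("love this helpful advice", "pos"),
    ("boring post waste of time", "neg"),
    ("bad advice very confusing", "neg"),
]

def fit(docs):
    counts = {"pos": Counter(), "neg": Counter()}
    priors = Counter()
    for text, label in docs:
        priors[label] += 1
        counts[label].update(text.split())
    vocab = set(w for c in counts.values() for w in c)
    return counts, priors, vocab

def predict(text, counts, priors, vocab):
    """Pick the label with the highest smoothed log-likelihood."""
    best_label, best_score = None, -math.inf
    total_docs = sum(priors.values())
    for label, word_counts in counts.items():
        score = math.log(priors[label] / total_docs)
        denom = sum(word_counts.values()) + len(vocab)  # Laplace smoothing
        for w in text.split():
            score += math.log((word_counts[w] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = fit(train)
print(predict("helpful inspiring advice", *model))
```

With congruence-filtered labels in place of expert ones, the training loop is unchanged; only the provenance of the labels differs.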
Procedia PDF Downloads 396
6914 Design and Development of an Algorithm for Prioritizing the Test Cases Using Neural Network as Classifier
Authors: Amit Verma, Simranjeet Kaur, Sandeep Kaur
Abstract:
Test Case Prioritization (TCP) has gained widespread acceptance as it often results in good-quality software free from defects. Due to the increasing rate of faults in software, traditional prioritization techniques result in increased cost and time. The main challenge in TCP is the difficulty of manually validating the priorities of different test cases due to the large size of test suites, and little emphasis has been placed on automating the TCP process. The objective of this paper is to detect the priorities of different test cases using an artificial neural network, which helps predict the correct priorities with the help of the back-propagation algorithm. In our proposed work, one such method is implemented in which priorities are assigned to different test cases based on their frequency. After the priorities are assigned, the ANN predicts whether the correct priority has been assigned to each test case, and it generates an interrupt when a wrong priority is assigned. Classifiers are used to classify the test cases into the different priority levels. The proposed algorithm is very effective as it reduces complexity with robust efficiency and automates the process of prioritizing the test cases.Keywords: test case prioritization, classification, artificial neural networks, TF-IDF
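The frequency-based assignment step, before the ANN verification that the paper adds on top, can be sketched in a few lines: rank test cases by how often they have historically exposed faults and run the most fault-prone first. The test-case names and counts below are assumptions for illustration; the ANN/back-propagation check is deliberately omitted here.

```python
# Frequency-based priority assignment sketch (the pre-ANN step only).
# Test case names and historical fault counts are illustrative assumptions.

# test case -> number of historical faults it has exposed
fault_counts = {"tc_login": 9, "tc_search": 2, "tc_checkout": 14, "tc_profile": 5}

def prioritize(counts):
    """Rank test cases so more fault-prone cases get smaller (earlier) ranks."""
    ranked = sorted(counts, key=counts.get, reverse=True)
    return {tc: rank for rank, tc in enumerate(ranked, start=1)}

priorities = prioritize(fault_counts)
print(priorities)
```

In the paper's full pipeline, these assigned priorities become the labels an ANN classifier is trained to reproduce, so that mislabeled cases can be flagged automatically.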
Procedia PDF Downloads 397
6913 A Study of Effect of Yoga on Choice Visual Reaction Time of Soccer Players
Authors: Vikram Singh, Parmod Kumar Sethi
Abstract:
The objective of the study was to examine the effectiveness of a common yoga protocol on the reaction time (choice visual reaction time, measured in milliseconds/seconds) of male football players in the age group of 16 to 21 years. The 40 boys were initially measured on years of experience and level of participation, then randomly assigned into two groups, i.e., control and experimental. CVRT for both groups was measured on day 1 and again after 45 days of the intervention (the common yoga protocol), which the experimental group completed after their regular fitness and soccer skill training. One-way ANOVA (univariate analysis) and an independent t-test, using the SPSS 23 statistical package, were applied to analyze the results. The experimental yoga protocol group showed a significant reduction in CVRT, whereas no significant difference in reaction times was observed for the control group after 45 days. The effect size was more than 52% for CVRT, indicating that the effect of treatment was large. Power of the study was also found to be high (> .80). There was a significant difference after 45 days of the yoga protocol in the choice visual reaction time of the experimental group: t(21.93) = 6.410, p = .000 (two-tailed). The null hypothesis (that there would be no difference in the reaction times of the control and experimental groups) was rejected at p < .05, and the alternate hypothesis was accepted.Keywords: reaction time, yoga protocol, t-test, soccer players
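The comparison reported above was run in SPSS, but the statistic itself, an independent-samples t with unequal-variance (Welch) degrees of freedom, as the fractional df of 21.93 suggests, is simple to compute directly. The reaction-time values below are invented for illustration and do not reproduce the study's data or its reported t of 6.410.

```python
import math

# Welch's t statistic for two independent samples, a hedged sketch of the
# SPSS comparison. Reaction times (ms) below are illustrative assumptions.

experimental = [310, 298, 305, 290, 300, 295]  # post-yoga CVRT (assumed)
control = [355, 349, 360, 352, 348, 358]       # no intervention (assumed)

def welch_t(a, b):
    """Welch's t: mean difference over the unpooled standard error."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variance
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

print(f"t = {welch_t(experimental, control):.2f}")
```

A large-magnitude t here (negative, since the experimental group's times are lower) corresponds to the study's finding that the yoga group's CVRT dropped significantly.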
Procedia PDF Downloads 236