Search results for: Data quality
26942 Acute Respiratory Distress Syndrome (ARDS) Developed Clinical Pathway: Suggested Protocol
Authors: Maha Salah, Hanaa Hashem, Mahmoud M. Alsagheir, Mohammed Salah
Abstract:
Acute respiratory distress syndrome (ARDS) represents a complex clinical syndrome and carries a high risk of mortality. The severity of the clinical course, the uncertainty of the outcome, and the reliance on the full spectrum of critical care resources for treatment mean that the entire health care team is challenged. Researchers and clinicians have investigated the nature of the pathological process and explored treatment options with the goal of improving outcomes. Through this application of research to practice, we know that some previous strategies have been ineffective, and innovations in mechanical ventilation, sedation, nutrition, and pharmacological intervention remain important research initiatives. A developed clinical pathway is a multidisciplinary plan of best clinical practice for a specified group of patients that aids in the coordination and delivery of high-quality care. It is a documented sequence of clinical interventions that helps a patient move progressively through a clinical experience to a desired outcome. Although there is considerable heterogeneity among patients with ARDS, this suggested developed clinical pathway with alternatives was built on a large body of research and evidence-based medicine and nursing practice, and may help these patients improve outcomes and quality of life and reduce mortality.
Keywords: acute respiratory distress syndrome (ARDS), clinical pathway, clinical syndrome
Procedia PDF Downloads 534
26941 Determinants of Integrated Reporting in Nigeria
Authors: Uwalomwa Uwuigbe, Olubukola Ranti Uwuigbe, Jinadu Olugbenga, Otekunrin Adegbola
Abstract:
Corporate reporting has evolved over the years as a result of criticisms of its predecessors by shareholders, stakeholders and other relevant financial institutions. Integrated reporting has become a globalized corporate reporting style, with its adoption around the world occurring rapidly to bring about an improvement in the quality of corporate reporting. While some countries have swiftly embraced reporting in an integrated manner, others have not. In addition, ample research has been conducted on the benefits of adopting integrated reporting, but the same is not true in developing economies like Nigeria. Hence, this study examined the factors determining the adoption of integrated reporting in Nigeria. One hundred (100) copies of a questionnaire were administered to financial managers of 20 selected companies listed on the Nigerian Stock Exchange. The data obtained were analysed using the Spearman rank-order correlation via the Statistical Package for the Social Sciences. This study observed that there is a significant relationship between the social pressures of isomorphic change and integrated reporting adoption in Nigeria. The study recommends that an enforcement mechanism be put in place when considering the adoption of integrated reporting in Nigeria; enforcement mechanisms should take into consideration investor demand, the level of economic development, and the degree of corporate social responsibility.
Keywords: corporate social responsibility, isomorphic, integrated reporting, Nigeria, sustainability
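The Spearman rank-order correlation used in this analysis can be sketched in plain Python; the implementation below (with average ranks for ties) is an illustrative stand-in for the SPSS procedure, and the sample data are hypothetical, not the study's questionnaire scores:

```python
def ranks(values):
    # Assign 1-based ranks, averaging over ties.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    # Pearson correlation computed on the ranks of x and y.
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A perfectly monotone relationship gives a coefficient of +1 (or -1 when decreasing), regardless of the raw scale of the responses.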
Procedia PDF Downloads 390
26940 Computer Assisted Strategies Help to Pharmacist
Authors: Komal Fizza
Abstract:
All around the world, professionals in every field take great support from their computers. Computer-assisted strategies not only increase the efficiency of professionals but, in the case of healthcare, also help in life-saving interventions. This research is aimed at two things: first, to find out whether computer-assisted strategies are useful for pharmacists or not, and secondly, to what extent they help a pharmacist make quality interventions. Shifa International Hospital is a 500-bed hospital running an Antimicrobial Stewardship program. During stewardship rounds, pharmacists observed that many incorrect antibiotic doses were being ordered, and at times these were overlooked even by other pharmacists. So, with the help of the MIS team, patients were categorized into adult and paediatric groups depending on their age. The minimum and maximum dose of every antibiotic in the pharmacy that could be dispensed to a patient was defined. These limits were linked to the order-entry window, so whenever a pharmacist typed an order with a dose below or above the therapeutic limit, an alert was shown. Whenever this message popped up, it was recorded at the back end along with the antibiotic name, pharmacist ID, date, and time. From 14 January 2015 to 14 March 2015, the software stopped different users 350 times. Of these, 300 were major errors which, had they reached the patient, could have caused serious harm, while 50 were due to typing errors and minor deviations. The pilot study showed that computer-assisted strategies can be of great help to pharmacists and can improve the efficacy and quality of interventions.
Keywords: antibiotics, computer assisted strategies, pharmacist, stewardship
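The dose-limit alert described above amounts to a lookup against per-antibiotic, per-group therapeutic ranges. A minimal sketch follows; the antibiotic names and limit values are hypothetical placeholders, since the hospital's actual dose table is not published:

```python
# Hypothetical therapeutic limits (mg): (antibiotic, patient group) -> (min, max).
DOSE_LIMITS = {
    ("ceftriaxone", "adult"): (250, 4000),
    ("ceftriaxone", "paediatric"): (50, 2000),
}

def check_dose(antibiotic, group, dose_mg):
    """Return None if the dose is within limits, else an alert message."""
    limits = DOSE_LIMITS.get((antibiotic, group))
    if limits is None:
        return f"no limits on file for {antibiotic} ({group})"
    low, high = limits
    if dose_mg < low:
        return f"{antibiotic}: {dose_mg} mg is below the minimum of {low} mg"
    if dose_mg > high:
        return f"{antibiotic}: {dose_mg} mg is above the maximum of {high} mg"
    return None
```

In the deployed system, a non-None result would pop up the alert and be logged with the antibiotic name, pharmacist ID, date, and time.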
Procedia PDF Downloads 491
26939 Cassava Plant Architecture: Insights from Genome-Wide Association Studies
Authors: Abiodun Olayinka, Daniel Dzidzienyo, Pangirayi Tongoona, Samuel Offei, Edwige Gaby Nkouaya Mbanjo, Chiedozie Egesi, Ismail Yusuf Rabbi
Abstract:
Cassava (Manihot esculenta Crantz) is a major source of starch for various industrial applications. However, traditional cultivation and harvesting methods for cassava are labour-intensive and inefficient, limiting the supply of fresh cassava roots for industrial starch production. To achieve improved productivity and quality of fresh cassava roots through mechanized cultivation, cassava cultivars with compact plant architecture and moderate plant height are needed. Plant architecture-related traits, such as plant height, harvest index, stem diameter, branching angle, and lodging tolerance, are critical for crop productivity and suitability for mechanized cultivation. However, the genetics of cassava plant architecture remain poorly understood. This study aimed to identify the genetic bases of the relationships between plant architecture traits and productivity-related traits, particularly starch content. A panel of 453 clones developed at the International Institute of Tropical Agriculture, Nigeria, was genotyped and phenotyped for 18 plant architecture and productivity-related traits at four locations in Nigeria. A genome-wide association study (GWAS) was conducted using these phenotypic data and 61,238 high-quality Diversity Arrays Technology sequencing (DArTseq) derived Single Nucleotide Polymorphism (SNP) markers evenly distributed across the cassava genome. Five significant associations between ten SNPs and three plant architecture component traits were identified through GWAS, including five SNPs on chromosomes 6 and 16 significantly associated with shoot weight, harvest index, and total yield. We also discovered an important candidate gene co-located with peak SNPs linked to these traits in M. esculenta.
A review of the cassava reference genome v7.1 revealed that the SNP on chromosome 6 is in proximity to Manes.06G101600.1, a gene that regulates endodermal differentiation and root development in plants. The findings of this study provide insights into the genetic basis of plant architecture and yield in cassava. Cassava breeders could leverage this knowledge to optimize plant architecture and yield through marker-assisted selection and targeted manipulation of the candidate gene.
Keywords: Manihot esculenta Crantz, plant architecture, DArTseq, SNP markers, genome-wide association study
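The core of a single-marker GWAS test can be sketched as an additive-model regression of phenotype on SNP genotype dosage. Real pipelines add covariates, kinship correction and multiple-testing control, so the function below (with made-up data) is only an illustration of the per-SNP step:

```python
def snp_association(genotypes, phenotypes):
    """Regress phenotype on genotype dosage (0/1/2 copies of the minor allele).

    Returns (beta, r2): the additive effect estimate and variance explained.
    """
    n = len(genotypes)
    mg = sum(genotypes) / n
    mp = sum(phenotypes) / n
    sxx = sum((g - mg) ** 2 for g in genotypes)          # genotype variance * n
    syy = sum((p - mp) ** 2 for p in phenotypes)         # phenotype variance * n
    sxy = sum((g - mg) * (p - mp)
              for g, p in zip(genotypes, phenotypes))    # covariance * n
    beta = sxy / sxx
    r2 = (sxy * sxy) / (sxx * syy)
    return beta, r2
```

In a full scan this test is repeated for each of the 61,238 markers, and SNPs whose association statistic clears a genome-wide threshold are reported.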
Procedia PDF Downloads 70
26938 An Efficient Data Mining Technique for Online Stores
Authors: Mohammed Al-Shalabi, Alaa Obeidat
Abstract:
In any food store, some items expire or are destroyed because demand for them is infrequent, so a system is needed that can help the decision maker make an offer on such items to improve demand, by bundling them with some frequent item and decreasing the price to avoid losses. The system generates hundreds or thousands of patterns (offers) for each low-demand item, then uses association rules (support, confidence) to find the interesting patterns (the best offer to achieve the lowest losses). In this paper, we propose a data mining method for determining the best offer by merging data mining techniques with an e-commerce strategy. The task is to build a model to predict the best offer. The goal is to maximize the profits of a store and avoid the loss of products. The idea in this paper is the use of association rules in marketing in combination with e-commerce.
Keywords: data mining, association rules, confidence, online stores
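The support and confidence measures used to rank candidate offers can be computed directly from transaction data; the transactions and item names below are hypothetical:

```python
def support(transactions, itemset):
    # Fraction of transactions containing every item of the itemset.
    itemset = set(itemset)
    return sum(1 for t in transactions if itemset <= set(t)) / len(transactions)

def confidence(transactions, antecedent, consequent):
    # Estimated P(consequent | antecedent) from the transaction data.
    return (support(transactions, set(antecedent) | set(consequent))
            / support(transactions, antecedent))

def best_offer(transactions, slow_item, frequent_items):
    # Pair the low-demand item with the frequent item that most often
    # co-occurs with it, i.e. the rule frequent -> slow with max confidence.
    return max(frequent_items,
               key=lambda f: confidence(transactions, [f], [slow_item]))
```

Here a slow-moving item ("jam") would be bundled with whichever frequent item ("bread" or "milk") gives the highest confidence for the rule frequent → slow.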
Procedia PDF Downloads 410
26937 Elemental Graph Data Model: A Semantic and Topological Representation of Building Elements
Authors: Yasmeen A. S. Essawy, Khaled Nassar
Abstract:
With the rapid increase of complexity in the building industry, professionals in the A/E/C industry have been forced to adopt Building Information Modeling (BIM) in order to enhance communication between the different project stakeholders throughout the project life cycle and create a semantic, object-oriented building model that can support geometric-topological analysis of building elements during design and construction. This paper presents a model that extracts topological relationships and geometrical properties of building elements from an existing fully designed BIM, and maps this information into a directed acyclic Elemental Graph Data Model (EGDM). The model incorporates BIM-based search algorithms for automatic deduction of geometrical data and topological relationships for each building element type. Using graph search algorithms, such as Depth First Search (DFS) and topological sorting, all possible construction sequences can be generated and compared against production and construction rules to generate an optimized construction sequence and its associated schedule. The model is implemented on a C# platform.
Keywords: building information modeling (BIM), elemental graph data model (EGDM), geometric and topological data models, graph theory
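Generating a construction sequence from the directed acyclic EGDM is a topological-sort problem. The authors implemented the model in C#; the sketch below uses Python and Kahn's algorithm, with hypothetical element names and precedence rules:

```python
from collections import deque

def construction_sequence(elements, depends_on):
    # depends_on: element -> set of elements that must be built first.
    indeg = {e: len(depends_on.get(e, ())) for e in elements}
    succ = {e: [] for e in elements}
    for e, preds in depends_on.items():
        for p in preds:
            succ[p].append(e)
    queue = deque(e for e in elements if indeg[e] == 0)
    order = []
    while queue:
        e = queue.popleft()
        order.append(e)
        for s in succ[e]:
            indeg[s] -= 1
            if indeg[s] == 0:
                queue.append(s)
    if len(order) != len(elements):
        raise ValueError("precedence graph contains a cycle; EGDM must be acyclic")
    return order
```

Each valid topological order is one candidate construction sequence; comparing the candidates against production and construction rules then selects the optimized one.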
Procedia PDF Downloads 382
26936 The Coexistence of Creativity and Information in Convergence Journalism: Pakistan's Evolving Media Landscape
Authors: Misha Mirza
Abstract:
In recent years, the definition of journalism in Pakistan has changed, and so has the mindset of people and their approach towards a news story. For the audience, news has become more interesting than a drama or a film. This research thus provides an insight into Pakistan’s evolving media landscape. It not only brings forth the outcomes of cross-platform cooperation between print and broadcast journalism but also gives an insight into the interactive data visualization techniques being used. Storytelling in Pakistani journalism has evolved from depicting merely the truth to tweaking, fabricating and producing docu-dramas. The research aims to look into how news is translated into a visual. Pakistan possesses a diverse cultural heritage, and by engaging audiences through media, this history translates into the storytelling platforms of today. The paper explains how journalists are thriving in a converging media environment and provides an analysis of the narratives in television talk shows today. ‘Jack of all, master of none’ is being challenged by journalists today: one has to be a quality information gatherer and an effective storyteller at the same time. Are journalists really looking more into what sells rather than what matters? Express Tribune is a very popular news platform among the youth; not only is its newspaper more attractive than its competitors’, but its style of narrative and interactive web stories also leads to well-rounded news. Interviews are used as the basic methodology to gain an insight into how data visualization is accomplished. The quest to find the difference between the visualization of information and the visualization of knowledge led the author to delve into the work of David McCandless in his book ‘Knowledge is Beautiful’. Journalism in Pakistan has evolved from information to a combination of knowledge, infotainment and comedy. What is criticized the most by society most often becomes the breaking news.
Circulation in today’s world is carried out in cultural and social networks. In recent times, we have come across many examples in which people have gained overnight popularity by releasing songs with substandard lyrics or senseless videos, perhaps because creativity has taken over information. This paper thus discusses the various platforms of convergence journalism from Pakistan’s perspective. The study concludes by showing how Pakistani pop-culture truck art coexists with all the platforms of convergent journalism. The changing media landscape thus challenges the basic rules of journalism. The slapstick humor and ‘jhatka’ in Pakistani talk shows have evolved from Pakistani truck art poetry. Mobile journalism has taken over all the other mediums of journalism; however, Pakistani culture coexists with the converging landscape.
Keywords: convergence journalism in Pakistan, data visualization, interactive narrative in Pakistani news, mobile journalism, Pakistan's truck art culture
Procedia PDF Downloads 284
26935 Assessment of Post-surgical Donor-Site Morbidity in Vastus lateralis Free Flap for Head and Neck Reconstructive Surgery: An Observational Study
Authors: Ishith Seth, Lyndel Hewitt, Takako Yabe, James Wykes, Jonathan Clark, Bruce Ashford
Abstract:
Background: The vastus lateralis (VL) can be used to reconstruct defects of the head and neck. Whilst the advantages are documented, donor-site morbidity is not well described. This study aimed to assess donor-site morbidity after VL flap harvest. The results will determine future directions for preventative and post-operative care to improve patient health outcomes. Methods: Ten participants (mean age 55 years) were assessed for the presence of donor-site morbidity after VL harvest. Musculoskeletal outcomes (pain, muscle strength, muscle length, tactile sensation), quality of life (SF-12), and lower limb function (lower extremity function, gait function and speed, and sit-to-stand) were assessed using validated and standardized procedures. Outcomes were compared to age-matched healthy reference values or the non-operative side. Analyses were conducted using descriptive statistics and non-parametric tests. Results: There was no difference in muscle strength (knee extension), muscle length, ability to sit to stand, or gait function (all P > 0.05). Knee flexor muscle strength was significantly less on the operated leg compared to the non-operated leg (P = 0.02), and walking speed was slower than age-matched healthy values (P < 0.001). Thigh tactile sensation was impaired in 89% of participants. Quality of life was significantly lower for the physical health component of the SF-12 (P < 0.001). The mental health component of the SF-12 was similar to healthy controls (P = 0.26). Conclusion: There was no effect on donor-site morbidity with regard to knee extensor strength, pain, walking function, ability to sit to stand, and muscle length. VL harvest affected donor-site knee flexion strength, walking speed, tactile sensation, and the physical health-related quality of life.
Keywords: vastus lateralis, morbidity, head and neck, surgery, donor-site morbidity
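The abstract does not name the specific non-parametric tests used. As an illustration of the kind of rank-based comparison involved, a Mann-Whitney U statistic (comparing, say, operated-side values against reference values, with hypothetical data) can be computed as:

```python
def mann_whitney_u(a, b):
    # U statistic for sample a versus sample b: count of pairs (x, y) with
    # x > y, counting ties as one half. No p-value is derived here; a real
    # analysis would compare U against its exact or normal-approximation
    # distribution.
    u = 0.0
    for x in a:
        for y in b:
            if x > y:
                u += 1
            elif x == y:
                u += 0.5
    return u
```

For paired operated versus non-operated comparisons, a Wilcoxon signed-rank test would be the usual alternative; the study itself only reports "non-parametric tests".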
Procedia PDF Downloads 242
26934 Evaluation of the Incorporation of Modified Starch in Puff Pastry Dough by Mixolab Rheological Analysis
Authors: Alejandra Castillo-Arias, Carlos A. Fuenmayor, Carlos M. Zuluaga-Domínguez
Abstract:
The connection between health and nutrition has driven the food industry to explore healthier and more sustainable alternatives. Key strategies to enhance nutritional quality and extend shelf life include reducing saturated fats and incorporating natural ingredients. One area of focus is the use of modified starch in baked goods, which has attracted significant interest in food science and industry due to its functional benefits. Modified starches are commonly used for their gelling, thickening, and water-retention properties. Derived from sources like waxy corn, potatoes, tapioca, or rice, these polysaccharides improve the thermal stability and resistance of dough. The use of modified starch enhances the texture and structure of baked goods, which is crucial for consumer acceptance. In this study, the effects of modified starch inclusion on dough used for puff pastry elaboration were evaluated with Mixolab analysis. This technique assesses flour quality by examining its behavior under varying conditions, providing a comprehensive profile of its baking properties. The analysis included measurements of water absorption capacity, dough development time, dough stability, softening, final consistency, and starch gelatinization. Each of these parameters offers insights into how the flour will perform during baking and into the quality of the final product. The performance of wheat flour with varying levels of modified starch inclusion (10%, 20%, 30%, and 40%) was evaluated through Mixolab analysis, with a control sample consisting of 100% wheat flour. Water absorption, gluten content, and retrogradation indices were analyzed to understand how modified starch affects dough properties. The results showed that the inclusion of modified starch increased the absorption index, especially at levels above 30%, indicating a dough with better handling qualities and potentially improved texture in the final baked product.
However, the reduction in wheat flour resulted in a lower kneading index, affecting dough strength. Conversely, incorporating more than 20% modified starch reduced the retrogradation index, indicating improved stability and resistance to crystallization after cooling. Additionally, the modified starch improved the gluten index, contributing to better dough elasticity and stability and providing good structural support and resistance to deformation during mixing and baking. As expected, the control sample exhibited a higher amylase index, due to the presence of enzymes in wheat flour. However, this is of little concern in puff pastry dough, as amylase activity is more relevant in fermented doughs, which is not the case here. Overall, the use of modified starch in puff pastry enhanced product quality by improving texture, structure, and shelf life, particularly when used at levels between 30% and 40%. This research underscores the potential of modified starches to address health concerns associated with traditional starches and to contribute to the development of higher-quality, consumer-friendly baked products. Furthermore, the findings suggest that modified starches could play a pivotal role in future innovations within the baking industry, particularly in products aiming to balance healthfulness with sensory appeal. By incorporating modified starch into their formulations, bakeries can meet the growing demand for healthier, more sustainable products while maintaining the indulgent qualities that consumers expect from baked goods.
Keywords: baking quality, dough properties, modified starch, puff pastry
Procedia PDF Downloads 24
26933 Wireless Sensor Network for Forest Fire Detection and Localization
Authors: Tarek Dandashi
Abstract:
WSNs may provide a fast and reliable solution for the early detection of environmental events such as forest fires. This is crucial for alerting and calling for fire brigade intervention. Sensor nodes communicate sensor data to a host station, which enables a global analysis and the generation of a reliable decision on a potential fire and its location. A WSN with TinyOS and nesC for capturing and transmitting a variety of sensor information, with controlled source, data rates and duration, and for recording and displaying activity traces, is presented. We propose a similarity distance (SD) between the distribution of currently sensed data and that of a reference. At any given time, a fire causes diverging opinions in the reported data, which alters the usual data distribution. Basically, SD consists of a metric on the cumulative distribution function (CDF). SD is designed to be invariant to day-to-day changes in temperature, changes due to the surrounding environment, and normal changes in weather, which preserve data locality. Evaluation shows that SD sensitivity is quadratic in the increase of sensor node temperature for groups of sensors of different sizes and neighborhoods. Simulation of fire spreading, with ignition placed at random locations under some wind speed, shows that SD takes a few minutes to reliably detect fires and locate them. We also discuss the cases of false negatives and false positives and their impact on decision reliability.
Keywords: forest fire, WSN, wireless sensor network, algorithm
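The abstract does not give the closed form of SD. One simple stand-in that is "a metric on the CDF" is the sup-norm distance between the empirical CDFs of the current readings and the reference (Kolmogorov-Smirnov style); the actual SD in the paper may differ:

```python
import bisect

def empirical_cdf(sample):
    # Returns a function x -> fraction of observations <= x.
    xs = sorted(sample)
    n = len(xs)
    return lambda x: bisect.bisect_right(xs, x) / n

def similarity_distance(current, reference):
    # Sup-norm distance between the two empirical CDFs: 0 for identical
    # distributions, approaching 1 when the sensed data diverge strongly
    # from the reference (as during a fire).
    f, g = empirical_cdf(current), empirical_cdf(reference)
    grid = sorted(set(current) | set(reference))
    return max(abs(f(x) - g(x)) for x in grid)
```

Because the comparison is between whole distributions rather than absolute readings, a uniform day-to-day temperature shift applied to both windows leaves the metric's behaviour qualitatively unchanged, which is the invariance property the paper aims for.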
Procedia PDF Downloads 262
26932 Mixtures of Length-Biased Weibull Distributions for Loss Severity Modelling
Authors: Taehan Bae
Abstract:
In this paper, a class of length-biased Weibull mixtures is presented to model loss severity data. The proposed model generalizes the Erlang mixture with a common scale parameter, and it shares many important modelling features with the Erlang mixtures, such as the flexibility to fit various data distribution shapes and weak denseness in the class of positive continuous distributions. We show that the asymptotic tail estimate of the length-biased Weibull mixture is of Weibull type, which makes the model effective for fitting loss severity data with heavy-tailed observations. A method of statistical estimation is discussed, with applications to real catastrophic loss data sets.
Keywords: Erlang mixture, length-biased distribution, transformed gamma distribution, asymptotic tail estimate, expectation-maximization (EM) algorithm
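The mixture density itself can be written down directly: the length-biased (size-biased) version of a density f is x·f(x)/E[X], and for a Weibull with shape k and scale λ the mean is λΓ(1 + 1/k). A sketch with a common scale across components follows; the weights and shapes used are illustrative, not fitted values from the paper:

```python
import math

def weibull_pdf(x, shape, scale):
    z = x / scale
    return (shape / scale) * z ** (shape - 1) * math.exp(-z ** shape)

def length_biased_weibull_pdf(x, shape, scale):
    # Size-biased density x * f(x) / E[X], with E[X] = scale * Gamma(1 + 1/shape).
    mean = scale * math.gamma(1 + 1 / shape)
    return x * weibull_pdf(x, shape, scale) / mean

def mixture_pdf(x, weights, shapes, scale):
    # Common scale parameter across components, mirroring the Erlang-mixture
    # structure that the model generalizes.
    return sum(w * length_biased_weibull_pdf(x, k, scale)
               for w, k in zip(weights, shapes))
```

Each length-biased component integrates to one by construction, so any weight vector summing to one yields a proper mixture density; the EM algorithm would then estimate the weights, shapes and common scale from loss data.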
Procedia PDF Downloads 224
26931 Robust Data Image Watermarking for Data Security
Authors: Harsh Vikram Singh, Ankur Rai, Anand Mohan
Abstract:
In this paper, we propose a secure and robust data hiding algorithm based on the DCT, the Arnold transform, and a chaotic sequence. The watermark image is scrambled by the Arnold cat map to increase its security, and a chaotic map is then used to spread the watermark signal in the middle band of the DCT coefficients of the cover image. The chaotic map can serve as a pseudo-random generator for digital data hiding, increasing security and robustness. Performance evaluation of the robustness and imperceptibility of the proposed algorithm has been made using the bit error rate (BER), normalized correlation (NC), and peak signal-to-noise ratio (PSNR) for different watermark and cover images, such as Lena, Girl, and Tank, and for different gain factors. We use a binary logo image and a text image as watermarks. The experimental results demonstrate that the proposed algorithm achieves higher security and robustness against JPEG compression, as well as other attacks such as noise addition, low-pass filtering and cropping, compared to other existing algorithms using DCT coefficients. Moreover, to recover the watermarks, the proposed algorithm does not need the original cover image.
Keywords: data hiding, watermarking, DCT, chaotic sequence, Arnold transform
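The Arnold cat map scrambling step can be sketched for an N×N image: the map matrix [[1, 1], [1, 2]] has determinant 1, so it permutes pixels mod N and is exactly invertible, which is what allows the watermark to be recovered. This is a generic illustration of the transform, not the paper's full embedding pipeline:

```python
def arnold_scramble(img):
    # One iteration of the cat map: pixel (r, c) moves to ((r+c) mod N, (r+2c) mod N).
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            out[(r + c) % n][(r + 2 * c) % n] = img[r][c]
    return out

def arnold_unscramble(img):
    # Inverse map, using the inverse matrix [[2, -1], [-1, 1]] mod N.
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            out[(2 * r - c) % n][(c - r) % n] = img[r][c]
    return out
```

In practice the scramble is applied several times (the iteration count acts as a key), and the same number of inverse iterations restores the watermark at extraction.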
Procedia PDF Downloads 515
26930 Simulation: A Tool for Stabilization of Welding Processes in Lean Production Concepts
Authors: Ola Jon Mork, Lars Andre Giske, Emil Bjørlykhaug
Abstract:
Stabilization of critical processes, in order to achieve the right product quality, more efficient production and smoother flow, is a key issue in lean production. This paper presents how simulation of key welding processes can stabilize complicated welding processes in small-scale production, and how simulation can impact the entire production concept seen from the perspective of lean production. First, a field study was made to learn the production processes in the factory, and subsequently the field study was transformed into a value stream map to gain insight into each operation, the quality issues, operation times, lead times and flow of materials. Valuable practical knowledge was collected on how the welding operations were performed by operators, on appropriate tools and jigs, and on the types of robots that could be used. All available information was then implemented into a simulation environment for further elaboration and development. Three researchers, the management of the company and skilled operators on the shop floor worked on the project over a period of eight months, and a detailed description of the process was made by the researchers. The simulation showed that a number of technical challenges could be solved: the robot program can be tuned in offline mode, and the design and testing of the robot cell can be done in the simulator. Furthermore, the design of the product could be optimized for robot welding, and the jigs could be designed and tested in the simulation environment. This means that a key issue of lean production can be solved: the welding operation will work with almost 100% performance when it is put into real production. Stabilizing one key process is critical to gaining control of the entire value chain; then a takt time can be established and the focus can be directed towards the next process in production to be stabilized.
Results show that industrial parameters like welding time, welding cost and welding quality can be defined at the simulation stage. Furthermore, this gives valuable information for calculating the factory's business performance, like manufacturing volume and manufacturing efficiency. The industrial impact of simulation is a more efficient implementation of lean manufacturing, since the welding process can be stabilized. More research should be done to gain more knowledge about simulation as a tool for the implementation of lean, especially where there are complex processes.
Keywords: simulation, lean, stabilization, welding process
Procedia PDF Downloads 321
26929 An Empirical Investigation of Big Data Analytics: The Financial Performance of Users versus Vendors
Authors: Evisa Mitrou, Nicholas Tsitsianis, Supriya Shinde
Abstract:
In the age of digitisation and globalisation, businesses have shifted online and are investing in big data analytics (BDA) to respond to changing market conditions and sustain their performance. Our study shifts the focus from the adoption of BDA to the impact of BDA on financial performance. We explore the financial performance of both BDA-vendors (business-to-business) and BDA-clients (business-to-customer). We distinguish between the five BDA-technologies (big-data-as-a-service (BDaaS), descriptive, diagnostic, predictive, and prescriptive analytics) and discuss them individually. Further, we use four perspectives (internal business process, learning and growth, customer, and finance) and discuss the significance of how each of the five BDA-technologies affects the performance measures of these four perspectives. We also present the analysis of employee engagement, average turnover, average net income, and average net assets for BDA-clients and BDA-vendors. Our study also explores the effect of the COVID-19 pandemic on business continuity for both BDA-vendors and BDA-clients.
Keywords: BDA-clients, BDA-vendors, big data analytics, financial performance
Procedia PDF Downloads 124
26928 Understanding Water Governance in the Central Rift Valley of Ethiopia: Zooming into Transparency, Accountability, and Participation
Authors: Endalew Jibat, Feyera Senbeta, Tesfaye Zeleke, Fitsum Hagos
Abstract:
Water governance considers multi-sector participation beyond the state, and for the sustainable use of water resources, appropriate laws, policies, regulations, and institutions need to be developed and put in place. Water policy, a critical and integral instrument of water governance, guides water use schemes and ensures equitable water distribution among users. The Ethiopian Central Rift Valley (CRV) is rich in water resources, but these resources are currently under severe strain owing to an imbalance in human-water interactions. The main aim of the study was to examine the state of water resources governance in the CRV of Ethiopia, and the impact of the Ethiopian Water Resources Management Policy on water governance. Key informant interviews (KII), focus group discussions, and document reviews were used to gather data for the study. The NVivo 11 program was used to organize, code, and analyze the data. The results revealed that water resources governance practices, such as water allocation and apportionment, comprehensive and integrated water management plans, and water resources protection and conservation activities, were rarely implemented. Water resources management policy mechanisms were not fully put in place. Lack of coherence in water policy implementation, absence of clear roles and responsibilities of stakeholders, absence of transparency and accountability in irrigation water service delivery, and lack of meaningful participation of key actors in water governance decision-making were the primary shortcomings observed. Factors such as over-abstraction, deterioration of the buffer zone, and chemical erosion from surrounding farming have contributed to the reduction in water volume and quality in the CRV. These challenges have affected aquatic ecosystem services and threaten the livelihoods of the surrounding communities.
Hence, reforms relating to policy coherence and enforcement, stakeholder involvement, water distribution strategies, and the application of water governance principles must be given more emphasis.
Keywords: water resources, irrigation, governance, water allocation, governance principles, stakeholder engagement, Central Rift Valley
Procedia PDF Downloads 92
26927 Educating Children Who Are Deaf and Hearing Impaired in Southern Africa: Challenges and Triumphs
Authors: Emma Louise McKinney
Abstract:
There is a global move to integrate children who are Deaf and Hearing Impaired into regular classrooms with their hearing peers within an inclusive education framework. This paper examines the current education situation for children who are Deaf and Hearing Impaired in South Africa, Madagascar, Malawi, Zimbabwe, and Namibia. Qualitative data for this paper were obtained from the author’s experiences working as the Southern African Education Advisor for an international organization funding disability projects. The paper examines some of the challenges facing these children and their teachers relating to education. Challenges include cultural stigma relating to disability and deafness, a lack of hearing screening and early identification of deafness, schools in rural areas, special schools, specialist teacher training, equipment, understanding of how to implement policy, support, appropriate teaching methodologies, and sign language training and proficiency. On the other hand, in spite of the challenges, some teachers are able to provide quality education to children who are Deaf and Hearing Impaired. This paper examines both the challenges and what teachers are doing to overcome them.
Keywords: education of children who are deaf and hearing impaired, Southern African experiences, challenges, triumphs
Procedia PDF Downloads 240
26926 Rapid Monitoring of Earthquake Damages Using Optical and SAR Data
Authors: Saeid Gharechelou, Ryutaro Tateishi
Abstract:
An earthquake is an inevitable, catastrophic natural disaster. Damage to buildings and man-made structures, where most human activities occur, is the major cause of earthquake casualties. A comparison of optical and SAR data is presented for the case of the Kathmandu valley, which was hit hard by the 2015 Nepal earthquake. Many existing studies have conducted optical-data-based estimation or suggested the combined use of optical and SAR data for improved accuracy; however, cloud-free optical images are not assured when urgently needed. Therefore, this research focuses on developing a SAR-based technique with the target of rapid and accurate geospatial reporting. Considering the limited time available in a post-disaster situation, the technique offers quick computation based exclusively on two pairs of pre-seismic and co-seismic single look complex (SLC) images. Pre-seismic, co-seismic, and post-seismic InSAR coherence was used to detect change in the damaged area. In addition, ground truth data collected in the field were applied to the optical data through random forest classification for detection of the damaged area; the same ground truth data were used to assess the accuracy of this supervised classification approach. A higher accuracy was obtained from the optical data than from the optical-SAR integration. Since cloud-free optical images are not assured when urgently needed after an earthquake event, further research on improving SAR-based damage detection is suggested. It is expected that quick reporting of the post-disaster damage situation, quantified by rapid earthquake assessment, will assist in channelling rescue and emergency operations and in informing the public about the scale of damage.
Keywords: Sentinel-1A data, Landsat-8, earthquake damage, InSAR, rapid damage monitoring, 2015 Nepal earthquake
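Coherence-based change detection of the kind described can be sketched as thresholding the drop between pre-seismic and co-seismic coherence on a per-pixel basis. The thresholds below are illustrative placeholders, not the study's calibrated values:

```python
def damage_mask(pre_coherence, co_coherence, drop_threshold=0.3, min_pre=0.5):
    # Flag pixels that were coherent before the event (pre >= min_pre) but
    # lost coherence across it (pre - co >= drop_threshold). Inputs are 2-D
    # grids of coherence values in [0, 1]; output is a boolean grid.
    mask = []
    for pre_row, co_row in zip(pre_coherence, co_coherence):
        mask.append([pre >= min_pre and (pre - co) >= drop_threshold
                     for pre, co in zip(pre_row, co_row)])
    return mask
```

Requiring high pre-event coherence filters out areas (vegetation, water) that are always decorrelated, so that the co-event coherence loss can be attributed to structural change rather than natural decorrelation.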
Procedia PDF Downloads 172
26925 Scheduling Nodes Activity and Data Communication for Target Tracking in Wireless Sensor Networks
Authors: AmirHossein Mohajerzadeh, Mohammad Alishahi, Saeed Aslishahi, Mohsen Zabihi
Abstract:
In this paper, we consider sensor nodes capable of measuring bearings (the relative angle to the target). We use geometric methods to select a set of observer nodes responsible for collecting data from the target. Given the characteristics of target tracking applications, significant numbers of sensor nodes are usually inactive. Therefore, in order to minimize total network energy consumption, a set of sensor nodes, called sentinels, is periodically selected for monitoring, controlling the environment and transmitting data through the network, while the other nodes remain inactive. Furthermore, the proposed algorithm provides joint scheduling and routing to transmit data between network nodes and the fusion center (FC), which not only provides an efficient way to estimate the target position but also enables efficient target tracking. Performance evaluation confirms the superiority of the proposed algorithm.
Keywords: coverage, routing, scheduling, target tracking, wireless sensor networks
Procedia PDF Downloads 378
26924 Process Mining as an Ecosystem Platform to Mitigate a Deficiency of Processes Modelling
Authors: Yusra Abdulsalam Alqamati, Ahmed Alkilany
Abstract:
The teaching staff is a distinct group whose impact on the educational process plays an important role in enhancing the quality of academic education. To improve management effectiveness in the academy, the Teaching Staff Management System (TSMS) proposes that all teacher processes be digitized. Although the BPMN approach can accurately describe processes, it lacks a clear picture of the process flow map, something the process mining approach provides by extracting information from event logs for discovery, monitoring, and model enhancement. Therefore, these two methodologies were combined to create the most accurate representation of system operations: data records are extracted and mined, recreated in the form of a Petri net, and then generated as a BPMN model for a more in-depth view of the process flow. Additionally, the TSMS processes will be orchestrated to handle all requests within a guaranteed short time thanks to the integration of the Google Cloud Platform (GCP) and the BPM engine, allowing business owners to take part throughout the entire TSMS project development lifecycle.
Keywords: process mining, BPM, business process model and notation, Petri net, teaching staff, Google Cloud Platform
Procedia PDF Downloads 142
26923 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process based on big data and machine-learning technology that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Based on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for deriving machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
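The "weighted values" idea in the abstract can be illustrated outside RapidMiner with a plain least-squares fit: the fitted coefficients rank how strongly each building condition drives the price. The feature names and numbers below are hypothetical, not the study's actual Nonhyeon-dong data:

```python
import numpy as np

# hypothetical building-condition features: [floor area m^2, age yr, floor level]
X = np.array([[84.0, 10, 5],
              [59.0, 25, 2],
              [114.0, 3, 15],
              [84.0, 18, 8]])
y = np.array([9.2, 6.1, 14.0, 8.3])        # sale price (hypothetical units)

Xb = np.hstack([X, np.ones((len(X), 1))])  # append a bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None) # least-squares weight per feature

# the fitted weights play the role of the "weighted values" that indicate
# which building conditions most influence price; prediction is a dot product
pred = Xb @ w
```

A real pipeline would of course use many more samples, held-out validation, and likely a nonlinear model, as the GUI tool automates.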
Procedia PDF Downloads 374
26922 Blockchain Technology Security Evaluation: Voting System Based on Blockchain
Authors: Omid Amini
Abstract:
Nowadays, technology plays the most important role in human life because people use it to share data and communicate with each other, but the challenge is the security of this data. For instance, as more people turn to technology, more data is generated and more hackers try to steal or infiltrate it. In addition, data under the control of a central authority raises the risk of information being lost or altered, which can create widespread anxiety across different communities. In this paper, we investigate blockchain technology, which can guarantee information security and eliminate the challenge of central-authority access to information. People currently suffer under the existing voting system: the lack of transparency in voting is a serious problem for society and government in most countries, but blockchain technology can be the best alternative to previous voting methods because it removes the most important challenge for voting. According to the results, this research can be a good starting point for getting acquainted with this new technology, especially its security aspects, and with how a blockchain-based voting system can be used. This research concludes that the use of blockchain technology can solve the major security problem and lead to a secure and transparent election.
Keywords: blockchain, technology, security, information, voting system, transparency
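The tamper-evidence property the abstract relies on can be shown in miniature: each block stores the hash of its predecessor, so altering any recorded vote invalidates every later link. This is a hypothetical toy structure, not a production voting system (a real one would also need signatures, consensus, and voter anonymity):

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's votes and previous-hash link."""
    payload = {"votes": block["votes"], "prev": block["prev"]}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain, votes):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"votes": votes, "prev": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def chain_valid(chain):
    """A chain is valid only if every hash matches and every link holds."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
append_block(ledger, {"voter001": "candidate A"})
append_block(ledger, {"voter002": "candidate B"})
print(chain_valid(ledger))                        # True
ledger[0]["votes"]["voter001"] = "candidate B"    # tampering attempt
print(chain_valid(ledger))                        # False: hash mismatch
```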
Procedia PDF Downloads 132
26921 The Role of Zakat on Sustainable Economic Development by Rumah Zakat
Authors: Selamat Muliadi
Abstract:
This study aimed to conceptually explain the role of zakat in sustainable economic development through Rumah Zakat. Rumah Zakat is a philanthropic institution that manages zakat and other social funds through community empowerment programs, including economic empowerment and socio-health services designed for recipients. Rumah Zakat's work connects with the Sustainable Development Goals (SDGs) by helping impoverished recipients economically and socially, an important agenda that governments incorporate into national and regional development. The primary goal of zakat in sustainable economic development is not limited to economic variables but, based on Islamic principles, has comprehensive characteristics, including moral, material, spiritual, and social aspects. In other words, sustainable economic development is closely related to improving people's living standards (mustahiq). The findings provide empirical evidence regarding the positive contribution and effectiveness of zakat targeting in reducing poverty and improving welfare through the management of zakat. The purpose of this study was to identify the role of zakat in sustainable economic development as applied by Rumah Zakat. This study used a descriptive method and qualitative analysis. The data source was secondary data collected from documents and texts related to the research topic, whether books, articles, newspapers, journals, or others. The results showed that the role of zakat in sustainable economic development at Rumah Zakat has been quite good and in accordance with the principles of Islamic economics. Rumah Zakat programs are adapted to support the intended development.
The implementation of the productive program has been aligned with four Sustainable Development Goals, i.e., Senyum Juara (Quality Education), Senyum Lestari (Clean Water and Sanitation), Senyum Mandiri (Entrepreneur Program) and Senyum Sehat (Free Maternity Clinic). The performance of zakat in sustainable economic community empowerment at Rumah Zakat takes into account dimensions such as input, process, output, and outcome.
Keywords: zakat, social welfare, sustainable economic development, charity
Procedia PDF Downloads 136
26920 Study of Natural Patterns on Digital Image Correlation Using Simulation Method
Authors: Gang Li, Ghulam Mubashar Hassan, Arcady Dyskin, Cara MacNish
Abstract:
Digital image correlation (DIC) is a contactless full-field displacement and strain reconstruction technique commonly used in experimental mechanics. Compared with physical measuring devices such as strain gauges, which provide very restricted coverage and are expensive to deploy widely, the DIC technique provides full-field coverage and relatively high accuracy using an inexpensive and simple experimental setup. Studying the effect of natural patterns on the DIC technique is important because preparing artificial patterns is a time-consuming and hectic process. The objective of this research is to study the effect of using images with natural patterns on the performance of DIC. A systematic simulation method is used to build the simulated deformed images used in DIC. The subset size, a parameter used in DIC, can affect the processing and accuracy of DIC and can even cause DIC to fail. Regarding the picture parameters (the correlation coefficient), higher similarity between two subsets can cause the DIC process to fail and make the result less accurate. Pictures of good and bad quality for DIC methods are presented and, more importantly, a systematic way is provided to evaluate the quality of pictures with natural patterns before the measurement devices are installed.
Keywords: Digital Image Correlation (DIC), deformation simulation, natural pattern, subset size
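The correlation coefficient the abstract refers to is commonly the zero-normalized cross-correlation (ZNCC) between a reference subset and a candidate subset in the deformed image; DIC succeeds when exactly one candidate scores near 1. A sketch with synthetic arrays (not the study's actual simulator):

```python
import numpy as np

def zncc(ref, cand):
    """Zero-normalized cross-correlation of two equally sized subsets.
    1.0 = perfect match; values near 0 = unrelated patterns."""
    f = ref - ref.mean()
    g = cand - cand.mean()
    return float(np.sum(f * g) / np.sqrt(np.sum(f ** 2) * np.sum(g ** 2)))

rng = np.random.default_rng(1)
image = rng.random((64, 64))       # stand-in for a natural speckle pattern
subset = image[10:31, 10:31]       # 21x21 reference subset

# a subset matched against itself correlates perfectly; against a distant
# subset of the same pattern, similarity drops. If natural patterns are too
# self-similar, several candidates score high and the match becomes ambiguous.
print(zncc(subset, subset))                  # 1.0
print(zncc(subset, image[40:61, 40:61]))     # near 0
```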
Procedia PDF Downloads 420
26919 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Mpho Mokoatle, Darlington Mapiye, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with analysing whole genome sequence data to produce interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources and the explainability of classification results. The analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
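The k-mer representation step described above can be sketched as follows: each sequence is reduced to a normalized frequency vector over all 4^k possible k-mers, which can then feed any standard classifier. This is a toy illustration of the representation, not the study's actual pipeline (note the vector length, 4^k, is why large k strains computing resources):

```python
from collections import Counter
from itertools import product

def kmer_profile(seq, k):
    """Normalized k-mer frequency vector over the fixed ACGT alphabet order."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values()) or 1          # guard against empty input
    vocab = ["".join(p) for p in product("ACGT", repeat=k)]  # 4^k entries
    return [counts[km] / total for km in vocab]

# two sequences with different composition yield different profiles, which
# is what lets a downstream model discriminate phenotypes
p1 = kmer_profile("ACGTACGTACGT", 3)
p2 = kmer_profile("GGGGCCCCGGGG", 3)
print(len(p1), round(sum(p1), 6))   # 64 entries summing to 1.0
```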
Procedia PDF Downloads 167
26918 Phenotype Prediction of DNA Sequence Data: A Machine and Statistical Learning Approach
Authors: Darlington Mapiye, Mpho Mokoatle, James Mashiyane, Stephanie Muller, Gciniwe Dlamini
Abstract:
Great advances in high-throughput sequencing technologies have resulted in the availability of huge amounts of sequencing data in public and private repositories, enabling a holistic understanding of complex biological phenomena. Sequence data are used for a wide range of applications such as gene annotation, expression studies, personalized treatment and precision medicine. However, this rapid growth in sequence data poses a great challenge, calling for novel data processing and analytic methods as well as huge computing resources. In this work, a machine and statistical learning approach for DNA sequence classification based on a k-mer representation of sequence data is proposed. The approach is tested using whole genome sequences of Mycobacterium tuberculosis (MTB) isolates to (i) reduce the size of genomic sequence data, (ii) identify an optimum size of k-mers and utilize it to build classification models, (iii) predict the phenotype from the whole genome sequence of a given bacterial isolate, and (iv) demonstrate the computing challenges associated with analysing whole genome sequence data to produce interpretable and explainable insights. The classification models were trained on 104 whole genome sequences of MTB isolates. Cluster analysis showed that k-mers may be used to discriminate phenotypes, and the discrimination becomes more concise as the size of the k-mers increases. The best performing classification model had a k-mer size of 10 (the longest k-mer) and an accuracy, recall, precision, specificity, and Matthews correlation coefficient of 72.0%, 80.5%, 80.5%, 63.6%, and 0.4, respectively. This study provides a comprehensive approach for resampling whole genome sequencing data, objectively selecting a k-mer size, and performing classification for phenotype prediction.
The analysis also highlights the importance of increasing the k-mer size to produce more biologically explainable results, which brings to the fore the interplay among accuracy, computing resources and the explainability of classification results. The analysis provides a new way to elucidate genetic information from genomic data and to identify phenotype relationships, which is important especially in explaining complex biological mechanisms.
Keywords: AWD-LSTM, bootstrapping, k-mers, next generation sequencing
Procedia PDF Downloads 159
26917 Study on the Contributions and Social Validity of an Online Autism Training for School Staff
Authors: Myriam Rousseau, Suzie McKinnon, Mathieu Mireault, Anaïs V. Berthiaume, Marie-Hélène Poulin, Jacinthe Bourassa, Louis-Simon Maltais
Abstract:
The increasing presence of young people with autism is forcing schools to adapt to this new situation and to offer services that meet the needs of this clientele. However, school staff often feel unqualified to support these students, lacking the preparation, skills and training to meet their needs. Continuing education for these staff is therefore essential to ensure that they can meet the needs of these students. As a result, the Government of Quebec has developed a bilingual (French and English) online training on autism specific to the needs of school staff; adequate training for all school staff is likely to provide quality learning opportunities for these students. The research project focuses on the participants' appreciation of the training, its contributions, and its social validity. More specifically, it aims to: 1) evaluate the knowledge and self-efficacy of the participants, 2) evaluate the social validity and 3) document the evaluation of the ergonomics of the platform hosting the training. The evaluation carried out as part of this descriptive study uses a quantitative method, with data collected through questionnaires completed online. The analysis of preliminary data reveals that participants' knowledge of autism and their sense of self-efficacy increased significantly. They value the training positively, consider it acceptable, appropriate and suitable, and find it important for school staff to take this training. Almost all the items measuring the ergonomics of the platform have averages above 4.57/5. In general, the study shows that the training allows participating school staff to improve their knowledge of autism and their sense of self-efficacy with young people with autism. In addition, participants recognize that the training has good social validity and appreciate the online modality.
However, these results should be interpreted with caution given the limited number of participants who completed the research project. It is therefore important to continue the research with a larger number of participants to allow adequate and general representativeness of the social validity, the feeling of competence and the appreciation of the platform.
Keywords: autism, online training, school staff, social validity
Procedia PDF Downloads 37
26916 Socio-Cultural and Religious Contributions to Gender Wage Gap: A Meta-Analysis
Authors: R. Alothaim, T. Mishra
Abstract:
Researchers have reviewed the gender wage gap between women and men since its earliest days, pointing out the differences in order to help bring about equality. Over the years, many fingers have been pointed at culture and religion as major factors contributing to the gender wage gap, and recent research has sought ways to close the gap between men and women. The gender wage gap has raised serious concerns among nations and societies. Additionally, estimates of the gender wage gap are affected by the data, methodology and time periods used, requiring careful decisions in the meta-study to provide a quantitative review. Quality indicators have played a crucial role by ensuring sufficient consideration in the research towards a solution of equality and worth. The studies reviewed give enough evidence to point out that a major cause of the gender wage gap is culture; religion may also play a role, but culture plays the bigger part. Furthermore, the social status of individuals has contributed to the wage gap between men and women. The labor market has played a vital role in empowering women, leading to a lower raw wage difference in recent years.
Keywords: culture, gender wage gap, social, religion
Procedia PDF Downloads 120
26915 Synchronous Versus Asynchronous Telecollaboration in Intercultural Communication
Authors: Vita Kalnberzina, Lauren Miller Anderson
Abstract:
The aim of the paper is to report the results of the telecollaboration project carried out between students of the University of Latvia, National Louis University in the US, and Austral University in Chile during an Intercultural Communication course. The objectives of the study are 1) to compare different forms of student telecollaboration and virtual exchange, 2) to collect and analyse student feedback on the telecollaboration project, and 3) to evaluate the products (films) produced during the project. The methods of research used are as follows: a survey of student feedback after the project, video text analysis of the films produced by the students, and interviews with the students participating in the project. We compare the results of a three-year collaboration project in which we tried out both synchronous and asynchronous collaboration. The variables observed were the impact of different time zones, the different language proficiency levels of students, and the different curricula developed for collaboration. The main findings suggest that the effort students spend organizing meetings across time zones and getting to know each other diminishes the quality of the product developed and thus reduces the students' feeling of accomplishment. Therefore, we propose that asynchronous collaboration, where the national teams work on a film project developed specifically by the students of one university for the students of another, ends with a better quality film, which in turn appeals more to the students of the other university and creates a deeper intercultural bond between the collaborating students.
Keywords: telecollaboration, intercultural communication, synchronous collaboration, asynchronous collaboration
Procedia PDF Downloads 101
26914 Characterization of Forest Fire Fuel in Shivalik Himalayas Using Hyperspectral Remote Sensing
Authors: Neha Devi, P. K. Joshi
Abstract:
A fire fuel map is one of the most critical factors for planning and managing fire hazard and risk. Wildfire is one of the most significant forms of global disturbance, impacting community dynamics, biogeochemical cycles and local and regional climate across a wide range of ecosystems, from boreal forests to tropical rainforest. Assessment of fire danger is a function of forest type, fuelwood stock volume, moisture content, degree of senescence and the fire management strategy adopted on the ground. Remote sensing has the potential to reduce the uncertainty in mapping fuels, and hyperspectral remote sensing is emerging as a very promising technology for wildfire fuel characterization. Fine spectral information also facilitates mapping of biophysical and chemical information directly related to the quality of forest fire fuels, including above-ground live biomass, canopy moisture, etc. We used Hyperion imagery acquired in February 2016 and analysed four fuel characteristics using Hyperion sensor data on board the EO-1 satellite, acquired over the Shiwalik Himalayas covering the area of Champawat, Uttarakhand state. The main objective of this study was to present an overview of methodologies for mapping fuel properties using hyperspectral remote sensing data. The fuel characteristics analysed include fuel biomass, fuel moisture, fuel condition and fuel type. Fuel moisture and fuel biomass were assessed through the expression of the liquid water bands. Fuel condition and type were assessed using green vegetation, non-photosynthetic vegetation and soil as endmembers for spectral mixture analysis. Linear spectral unmixing, a partial spectral unmixing algorithm, was used to identify the spectral abundance of green vegetation, non-photosynthetic vegetation and soil.
Keywords: forest fire fuel, Hyperion, hyperspectral, linear spectral unmixing, spectral mixture analysis
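Linear spectral unmixing as used above models each pixel spectrum as a mixture of endmember spectra, with the recovered fractions interpreted as abundances. A minimal sketch with made-up reflectance values for the three endmembers named in the abstract (green vegetation, non-photosynthetic vegetation, soil; not Hyperion's actual band values):

```python
import numpy as np

# hypothetical endmember spectra: one column each for GV, NPV and soil,
# rows are spectral bands (reflectance values made up for illustration)
E = np.array([[0.05, 0.30, 0.25],
              [0.45, 0.35, 0.30],
              [0.50, 0.40, 0.35],
              [0.30, 0.45, 0.40]])

def unmix(pixel, endmembers):
    """Least-squares abundance estimate, clipped and renormalized so the
    fractions are nonnegative and sum to one."""
    a, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    a = np.clip(a, 0.0, None)
    return a / a.sum()

true_abund = np.array([0.6, 0.3, 0.1])   # 60% GV, 30% NPV, 10% soil
pixel = E @ true_abund                   # a perfectly linearly mixed pixel
print(unmix(pixel, E))                   # recovers ~[0.6, 0.3, 0.1]
```

Real Hyperion data has ~200 bands and noisy, partially nonlinear mixing, so constrained or partial-unmixing variants are used in practice.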
Procedia PDF Downloads 165
26913 Integration of Acoustic Solutions for Classrooms
Authors: Eyibo Ebengeobong Eddie, Halil Zafer Alibaba
Abstract:
The neglect of classroom acoustics is dominant in most educational facilities, even though hearing and listening are central to the learning process in such facilities. A classroom should therefore be an environment that encourages listening, without obstacles to understanding what is being taught. Although different studies have shown that teachers complain of noise as an everyday factor causing stress in the classroom, the capacity of individuals to understand speech is further affected by echoes, reverberation, and room modes. It is therefore necessary for classrooms to have ideal acoustics to aid the intelligibility of students in the learning process. The influence of these acoustical parameters on learning and teaching in schools needs to be researched further to enhance the teaching and learning capacity of both teacher and student. For this reason, there is a strong need to provide and collect data to analyse and define the quality of classrooms suitable for a learning environment. Research has shown that acoustical problems are still experienced in both newer and older schools. Recently, however, the principles of acoustics have been analysed, and room acoustics can now be measured with various technologies and sound systems to improve and solve the problem of acoustics in classrooms. These acoustic solutions, materials, construction methods and integration processes are discussed in this paper.
Keywords: classroom, acoustics, materials, integration, speech intelligibility
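One quantity such acoustic solutions target is reverberation time, commonly estimated with Sabine's formula RT60 = 0.161 V / A, where V is room volume and A the total absorption (surface area times absorption coefficient, summed over surfaces). The room dimensions and coefficients below are illustrative assumptions, not measured values:

```python
# Sabine reverberation time: RT60 = 0.161 * V / A (V in m^3, A in m^2 sabins)
def rt60_sabine(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    absorption = sum(area * coeff for area, coeff in surfaces)
    return 0.161 * volume_m3 / absorption

# hypothetical 7 m x 9 m x 3 m classroom: floor, ceiling, four walls
classroom = [
    (63.0, 0.05),   # hard floor
    (63.0, 0.60),   # absorptive acoustic ceiling tiles
    (96.0, 0.10),   # painted walls
]
print(round(rt60_sabine(7 * 9 * 3, classroom), 2))  # ~0.6 s
```

Swapping materials in the list (e.g. a more absorptive wall treatment) immediately shows how an integration choice changes the predicted RT60, which is the kind of trade-off the paper's discussion of materials addresses.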
Procedia PDF Downloads 417