Search results for: well data integration
25389 An Analysis of Humanitarian Data Management of Polish Non-Governmental Organizations in Ukraine Since February 2022 and Its Relevance for Ukrainian Humanitarian Data Ecosystem
Authors: Renata Kurpiewska-Korbut
Abstract:
On the assumption that the use and sharing of data generated in humanitarian action constitute a core function of humanitarian organizations, this paper analyzes the position of the largest Polish humanitarian non-governmental organizations in the humanitarian data ecosystem in Ukraine and their approach to non-personal and personal data management since February 2022. Expert interviews and document analysis of non-profit organizations providing a direct response in the Ukrainian crisis context, i.e., the Polish Humanitarian Action, Caritas, the Polish Medical Mission, the Polish Red Cross, and the Polish Center for International Aid, combined with the theoretical perspective of contingency theory, whose central point is that the context or a specific set of conditions determines the mode of behavior and the choice of methods of action, help to examine the significance of data complexity and of an adaptive approach to data management by relief organizations in the humanitarian supply chain network. The purpose of this study is to determine how well-established and accurate internal procedures and good practices of using and sharing data (including safeguards for sensitive data) by the surveyed organizations, which have comparable human and technological capabilities, are implemented and adjusted to Ukrainian humanitarian settings and data infrastructure. The study also poses the fundamental question of whether this crisis experience will have a determining effect on their future performance. The findings indicate that Polish humanitarian organizations in Ukraine, which have their own unique codes of conduct and effective managerial data practices determined by contingencies, have limited influence on improving the situational awareness of other assistance providers in the data ecosystem despite their attempts to undertake interagency work in the area of data sharing.
Keywords: humanitarian data ecosystem, humanitarian data management, Polish NGOs, Ukraine
Procedia PDF Downloads 92
25388 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases
Authors: Daniel C. Bonzo
Abstract:
Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another estimand. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is increasingly accepted by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equal weights and weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation results showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
Keywords: clustered data, estimand, extrapolation, mixed model
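To make the two weighting schemes concrete, here is a minimal simulation sketch (not from the paper; the data-generating values are assumptions) contrasting the equally weighted and size-weighted cluster estimators:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate hierarchical clustered data: subjects (clusters) with varying
# numbers of within-subject observations, e.g., bleeding episodes.
n_subjects = 30
cluster_sizes = rng.integers(1, 15, size=n_subjects)
true_mean, between_sd, within_sd = 5.0, 1.0, 2.0

clusters = [true_mean + rng.normal(0, between_sd) + rng.normal(0, within_sd, size=m)
            for m in cluster_sizes]

cluster_means = np.array([c.mean() for c in clusters])
sizes = np.asarray(cluster_sizes, dtype=float)

# Equally weighted estimator: every cluster contributes the same weight.
est_equal = cluster_means.mean()

# Size-weighted estimator: weights proportional to cluster size.
est_size = np.average(cluster_means, weights=sizes)

print(f"equal weights: {est_equal:.3f}")
print(f"size weights:  {est_size:.3f}")
```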
Procedia PDF Downloads 136
25387 Conceptual Modeling of the Relationship between Project Management Practices and Knowledge Absorptive Capacity Using Interpretive Structural Modeling Method
Authors: Seyed Abdolreza Mosavi, Alireza Babakhan, Elham Sadat Hoseinifard
Abstract:
Knowledge-based firms need to design mechanisms for the continuous absorption and creation of knowledge in order to ensure their survival in the competitive arena and to follow the path of development. Considering the project-oriented nature of product development activities in knowledge-based firms on the one hand, and the importance of analyzing the factors affecting knowledge absorptive capacity in these firms on the other, the purpose of this study is to identify and classify the factors through which project management practices affect knowledge absorptive capacity. For this purpose, we studied and reviewed the theoretical literature in the fields of project management and knowledge absorptive capacity so as to clarify their dimensions and indexes. Then, using the ISM method, the relationships between them were studied. To collect data, 21 questionnaires were distributed in project-oriented knowledge-based companies. The results of the ISM analysis provide a model of the relationship between project management activities and knowledge absorptive capacity, which includes knowledge acquisition capacity, scope management, time management, cost management, quality management, human resource management, communications management, procurement management, risk management, stakeholder management, and integration management. Having conducted the MICMAC analysis, we divided the variables into three groups of independent, linkage, and dependent variables, with no variables falling into the group of autonomous variables.
Keywords: knowledge absorptive capacity, project management practices, knowledge-based firms, interpretive structural modeling
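To make the ISM/MICMAC mechanics concrete, here is a minimal sketch (not from the paper; the toy 4-variable matrix is an assumption standing in for the paper's 11 factors): the final reachability matrix is obtained by Boolean transitive closure, and variables are classified by driving and dependence power:

```python
import numpy as np

# Toy structural self-interaction matrix for 4 variables (1 = "influences").
A = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
], dtype=bool)

# Final reachability matrix: Boolean transitive closure (Warshall's algorithm).
R = A.copy()
n = len(R)
for k in range(n):
    for i in range(n):
        if R[i, k]:
            R[i] |= R[k]

driving = R.sum(axis=1)     # MICMAC driving power (row sums)
dependence = R.sum(axis=0)  # MICMAC dependence power (column sums)
mid = n / 2

for v, (drv, dep) in enumerate(zip(driving, dependence)):
    if drv <= mid and dep <= mid:
        group = "autonomous"
    elif drv <= mid:
        group = "dependent"
    elif dep <= mid:
        group = "independent (driver)"
    else:
        group = "linkage"
    print(f"variable {v}: driving={drv}, dependence={dep} -> {group}")
```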
Procedia PDF Downloads 197
25386 Indigenous Knowledge Management: Towards Identification of Challenges and Opportunities in Developing Countries
Authors: Desmond Chinedu Oparaku, Emmanuel Uwazie Anyanwu, Oyemike Victor Benson, Ogbonna Isaac-Nnadimele
Abstract:
The purpose of this paper is to provide a theoretical discourse that highlights the challenges associated with the management of indigenous knowledge, with reference to developing countries. Literature review and brainstorming were used to collect relevant data and draw inferences. The findings indicate the following challenges to indigenous knowledge management: the non-existence of an indigenous knowledge management policy (IKMP), a low level of partnership drive among library and information services providers, non-uniformity of the format and content of indigenous knowledge, inadequate funding, lack of access to ICTs, a lack of indigenous people with indigenous expertise, and the hoarding of knowledge. The study is based on a literature review and information gathered through brainstorming with professional colleagues, with developing countries as the geographic scope. The study has several implications based on these findings. Professionally, it highlights the need for formulating a viable indigenous knowledge management policy (IKMP), creating collaborative networks through partnership, and integrating ICTs into indigenous knowledge management practices by libraries in developing countries. The originality of this paper lies in its capacity to serve as an eye-opener to librarians on the need for preserving and managing indigenous knowledge in developing countries. It further unlocks the possibility of empirically based research to substantiate the theoretical issues raised in this paper. The findings may be used by library managers to improve indigenous knowledge management (IKM).
Keywords: developing countries, ICTs, indigenous knowledge, knowledge management
Procedia PDF Downloads 341
25385 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System
Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu
Abstract:
Uninterrupted and continuous satellite communication throughout the whole orbit is becoming more indispensable every day. Data relay systems are developed and built for various high/low data rate information exchanges, such as the TDRSS of the USA and the EDRSS of Europe. In these missions, a couple of task-dedicated communication satellites exist. In this regard, a data relay system is proposed for Turkey for exchanging low data rate information (i.e., TTC) with Earth-observing LEO satellites, employing commercial GEO communication satellites all over the world. First, the justification for this approach is given, demonstrating duration enhancements in the link. The preference for RF communication over laser communication is also discussed. Then, the preferred communication GEOs, including TURKSAT4A, which already belongs to Turkey, are given, together with the coverage enhancements obtained through STK simulations and the corresponding link budget. A block diagram of the communication system on the LEO satellite is also given.
Keywords: communication, GEO satellite, data relay system, coverage
Procedia PDF Downloads 442
25384 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product
Authors: Tanawat Hongthai, Dusit Thanapatay
Abstract:
This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag, to assess the feasibility of implementing genuine-product authentication. We organize the research on encryption and genuine-product checking into four major categories: concept, infrastructure, development, and applications. The results show that a passive NFC-Forum Type 2 tag can be configured to be compatible with the NFC data exchange format (NDEF), whose data can be automatically and partially updated when an NFC field is present.
Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC
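The abstract does not specify the cipher, so as an illustration only, here is a sketch of encrypting a product identifier with AES-GCM (via the Python cryptography package) before it is written into an NDEF record payload; the key handling, payload format, and NDEF wrapping are assumptions:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# A product identifier that would be stored on the passive NFC-Forum Type 2 tag.
product_id = b"SERIAL-0001|BATCH-42"

key = AESGCM.generate_key(bit_length=128)  # provisioned to verifier devices
nonce = os.urandom(12)                     # 96-bit nonce, unique per write

# Encrypt-and-authenticate the payload before writing it into an NDEF record.
ciphertext = AESGCM(key).encrypt(nonce, product_id, None)
tag_payload = nonce + ciphertext           # stored as the NDEF record payload

# Verifier side: decrypt; any tampering with the tag raises InvalidTag.
recovered = AESGCM(key).decrypt(tag_payload[:12], tag_payload[12:], None)
assert recovered == product_id
```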
Procedia PDF Downloads 280
25383 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows
Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham
Abstract:
In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk of and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10%, and with sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of the disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis
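As an illustration of the kind of indicator-based detection described above (not the reviewed studies' actual pipelines; the data, labels, and parameter values below are synthetic placeholders):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-milking records using the indicator set named in the abstract.
rng = np.random.default_rng(1)
n = 1000
X = np.c_[rng.normal(30, 5, n),      # milk yield (kg/day)
          rng.normal(5.5, 0.4, n),   # electrical conductivity (mS/cm)
          rng.normal(4.0, 0.5, n),   # fat (%)
          rng.normal(3.3, 0.3, n),   # protein (%)
          rng.normal(4.8, 0.2, n)]   # lactose (%)
# Placeholder labels: mastitis loosely associated with high conductivity.
y = (X[:, 1] + rng.normal(0, 0.3, n) > 5.9).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(f"cross-validated accuracy: {scores.mean():.3f}")
```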
Procedia PDF Downloads 65
25382 Data Hiding by Vector Quantization in Color Image
Authors: Yung Gi Wu
Abstract:
With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data, though embedding the watermark inevitably influences quality. In this paper, vector quantization (VQ) is used to embed the watermark into the image to fulfill the goal of data hiding. This kind of watermarking is invisible, which means that users will not be conscious of the existence of the embedded watermark even though the embedded image differs slightly from the original image. Meanwhile, VQ imposes a heavy computational burden, so we adopt a fast VQ encoding scheme based on partial distortion searching (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks hidden in the image can be gray-scale, bi-level, or color images; text can also be embedded as a watermark. In order to test the robustness of the system, we use Photoshop to apply sharpening, cropping, and alteration to check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
Keywords: data hiding, vector quantization, watermark, color image
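A minimal sketch of the partial distortion search idea (the mean approximation step and the watermark-bit-to-codeword mapping are omitted; the codebook size and block shape are assumptions):

```python
import numpy as np

def pds_nearest_codeword(block, codebook):
    """Partial distortion search: abandon a codeword as soon as its
    accumulated squared error exceeds the best distortion found so far."""
    best_idx, best_dist = 0, np.inf
    for idx, cw in enumerate(codebook):
        dist = 0.0
        for b, c in zip(block, cw):
            dist += (b - c) ** 2
            if dist >= best_dist:      # early termination
                break
        else:
            best_idx, best_dist = idx, dist
    return best_idx

rng = np.random.default_rng(0)
codebook = rng.random((256, 16))       # 256 codewords for 4x4 blocks
block = rng.random(16)
print("nearest codeword:", pds_nearest_codeword(block, codebook))
```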
Procedia PDF Downloads 364
25381 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtime and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center’s history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
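A minimal single-sensor sketch of the described pipeline, assuming Keras and scikit-learn; the window length, layer sizes, residual features, and placeholder arrays are assumptions, not the paper's configuration:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers
from sklearn.ensemble import RandomForestClassifier

WINDOW, N_FEATURES = 60, 1  # one autoencoder per sensor, as in the paper

def build_sensor_autoencoder():
    """LSTM autoencoder that reconstructs a univariate sensor window."""
    return keras.Sequential([
        keras.Input(shape=(WINDOW, N_FEATURES)),
        layers.LSTM(32),                         # encoder
        layers.RepeatVector(WINDOW),
        layers.LSTM(32, return_sequences=True),  # decoder
        layers.TimeDistributed(layers.Dense(N_FEATURES)),
    ])

# Train on normal samples only (hypothetical arrays).
x_normal = np.random.rand(1000, WINDOW, N_FEATURES).astype("float32")
ae = build_sensor_autoencoder()
ae.compile(optimizer="adam", loss="mse")
ae.fit(x_normal, x_normal, epochs=5, batch_size=64, verbose=0)

# Residual features feed a random forest, as described in the abstract.
def residual_features(model, x):
    err = np.abs(x - model.predict(x, verbose=0))
    return np.c_[err.mean(axis=(1, 2)), err.max(axis=(1, 2)), err.std(axis=(1, 2))]

x_eval = np.random.rand(200, WINDOW, N_FEATURES).astype("float32")
labels = np.random.randint(0, 2, 200)          # placeholder anomaly labels
clf = RandomForestClassifier().fit(residual_features(ae, x_eval), labels)
```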
Procedia PDF Downloads 194
25380 The Dimensions of Culture in the Productive Internationalization Process: An Overview about Brazilian Companies in Bolivia
Authors: Renato Dias Baptista
Abstract:
The purpose of this paper is to analyze the elements of the cultural dimension in the internationalization process of Brazilian companies in Bolivia. It is based on research on two major Brazilian transnational companies that have plants in Bolivia. To achieve this objective, the interconnective characteristics of culture in the process of productive internationalization were analyzed, aiming to highlight culture as a guiding element against the premises of Brazilian leadership in the integration and development of the continent. The analysis seeks to give relevance to the culture of a country and its relations with internationalization.
Keywords: culture, transnational, internationalization, Bolivia, Brazil
Procedia PDF Downloads 421
25379 Organizational Culture of a Public and a Private Hospital in Brazil
Authors: Fernanda Ludmilla Rossi Rocha, Thamiris Cavazzani Vegro, Silvia Helena Henriques Camelo, Carmen Silvia Gabriel, Andrea Bernardes
Abstract:
Introduction: Organizations are cultural, symbolic, and imaginary systems composed of values and norms. These values and norms represent the organizational culture, which determines the behavior of workers, guides work practices, and impacts the quality of care and the safety culture of health services worldwide. Objective: To analyze the organizational culture of a public and a private hospital in Brazil. Method: Descriptive study with a quantitative approach developed in a public and a private hospital in Brazil. The sample comprised 281 nursing workers: 73 nurses and 208 nursing auxiliaries and technicians. The data collection instrument was the Brazilian Instrument for Assessing Organizational Culture. Data were collected from March to December 2013. Results: At the public hospital, the results showed an average score of 2.85 for values concerning cooperative professionalism (CP); 3.02 for values related to hierarchical rigidity and the centralization of power (HR); 2.23 for individualistic professionalism and competition at work (IP); 2.22 for values related to the satisfaction, well-being, and motivation of workers (SW); 3.47 for external integration (EI); 2.03 for rewarding and training practices (RT); and 2.75 for practices related to the promotion of interpersonal relationships (IR). At the private hospital, the results showed an average score of 3.24 for CP; 2.83 for HR; 2.69 for IP; 2.71 for SW; 3.73 for EI; 2.56 for RT; and 2.83 for IR. Discussion: The analysis of the organizational values of the studied hospitals shows that workers perceive hierarchical rigidity and the centralization of power in the institutions; they believed there was cooperation in the workplace, though they also perceived individualism and competition; they believed that values associated with workers’ well-being, satisfaction, and motivation were seldom acknowledged by the hospitals; and they believed in the adoption of strategic planning actions within the institutions but considered the promotion of interpersonal relationships, continuing education, and the rewarding of workers to be little valued. Conclusion: This work context can lead to professional dissatisfaction, compromising the quality of care and contributing to the occurrence of occupational diseases.
Keywords: nursing management, organizational culture, quality of care, interpersonal relationships
Procedia PDF Downloads 440
25378 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.
Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making
Procedia PDF Downloads 75
25377 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule
Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
The instance selection (IS) technique is used to reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets within the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
Keywords: instance selection, data reduction, MapReduce, kNN
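A sketch of the sequential FCNN core on which the method builds (the MapReduce partitioning and aggregation are omitted; this simplified variant adds one representative per Voronoi cell per pass, under assumed toy data):

```python
import numpy as np
from scipy.spatial.distance import cdist

def fcnn_reduce(X, y):
    """Simplified FCNN condensing: grow a prototype set until every
    training point is correctly classified by its nearest prototype."""
    # Seed with the point closest to each class centroid.
    S = []
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        centroid = X[idx].mean(axis=0)
        S.append(int(idx[np.argmin(np.linalg.norm(X[idx] - centroid, axis=1))]))
    while True:
        d = cdist(X, X[S])
        nearest = np.argmin(d, axis=1)           # index into S
        misclassified = y != y[S][nearest]
        if not misclassified.any():
            return np.array(S)
        added = set()
        # One representative per Voronoi cell: the misclassified point
        # closest to that cell's prototype.
        for j in range(len(S)):
            cell = np.where(misclassified & (nearest == j))[0]
            if cell.size:
                added.add(int(cell[np.argmin(d[cell, j])]))
        S.extend(added)

X = np.random.rand(500, 4)
y = (X[:, 0] > 0.5).astype(int)
prototypes = fcnn_reduce(X, y)
print(f"kept {len(prototypes)} of {len(X)} instances")
```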
Procedia PDF Downloads 253
25376 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. Thus, in these cases, a more advanced representation of these data structures is essential. In this paper, we present the design of a compact version of the ternary tree data structure and demonstrate the results of an experimental evaluation using the static dictionary problem. We compare these results with the results for binary and regular ternary trees. The evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms using both 32- and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
Procedia PDF Downloads 160
25375 Bio-Hub Ecosystems: Profitability through Circularity for Sustainable Forestry, Energy, Agriculture and Aquaculture
Authors: Kimberly Samaha
Abstract:
The Bio-Hub Ecosystem model was developed to address a critical area of concern within the global energy market regarding biomass as a feedstock for power plants, where the lack of an economically viable business model for bioenergy facilities has resulted in the continuation of idled and decommissioned plants. This study analyzed data and submittals to the Born Global Maine Innovation Challenge, a global challenge to identify process innovations that could support a ‘whole-tree’ approach of maximizing the products, byproducts, energy value, and process slip-streams into a circular zero-waste design. Participating companies were at various stages of developing bioproducts, including biofuels, lignin-based products, carbon capture platforms, and biochar used both as a filtration medium and as a soil amendment product. This case study presents the QCA (Qualitative Comparative Analysis) methodology of the prequalification process and the resulting techno-economic model that was developed for maximizing the profitability of the Bio-Hub Ecosystem through the continuous expansion of system waste streams into valuable process inputs for co-hosts. A full site plan was developed for the integration of co-hosts (a biorefinery, land-based shrimp and salmon aquaculture farms, a tomato greenhouse, and a hops farm) at an operating forestry-based biomass-to-energy plant in West Enfield, Maine, USA. This model and process for evaluating profitability not only propose models for the integration of forestry, aquaculture, and agriculture in cradle-to-cradle linkages of what have typically been linear systems, but also allow for the early measurement of circularity, of the impact of resource use, and of investment risk mitigation for these systems. In this particular study, profitability is assessed at two levels: CAPEX (capital expenditures) and OPEX (operating expenditures). Given that these projects start with repurposing facilities where industrial-level infrastructure is already built, permitted, and interconnected to the grid, the addition of co-hosts first realizes a dramatic reduction in permitting, development times, and costs. In addition, using the biomass energy plant’s waste streams, such as heat, hot water, CO₂, and fly ash, as valuable inputs to their operations yields a significant decrease in OPEX costs, increasing overall profitability for each of the co-hosts’ bottom lines. This case study utilizes a proprietary techno-economic model to demonstrate how utilizing the waste streams of a biomass energy plant and/or biorefinery results in a significant reduction in OPEX for both the biomass plant and the agriculture and aquaculture co-hosts. Economically viable Bio-Hubs with favorable environmental and community impacts may prove critical in garnering local and federal government support for pilot programs and more wide-scale adoption, especially for those living in severely economically depressed rural areas where aging industrial sites have been shuttered and local economies devastated.
Keywords: bio-economy, biomass energy, financing, zero-waste
Procedia PDF Downloads 134
25374 Integrating Service Learning into a Business Analytics Course: A Comparative Investigation
Authors: Gokhan Egilmez, Erika Hatfield, Julie Turner
Abstract:
In this study, we investigated the impacts of service-learning integration on an undergraduate-level business analytics course from multiple perspectives, including academic proficiency, community awareness, engagement, social responsibility, and reflection. We assessed the impact of the service-learning experience by using a survey developed primarily from the literature review and secondarily from input by an ad hoc group of researchers. We then implemented the survey in two sections, one of which served as a control group, and compared the results of the empirical survey visually and statistically.
Keywords: business analytics, service learning, experiential education, statistical analysis, survey research
Procedia PDF Downloads 111
25373 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning. Data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories. We then establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
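One way to realize SLA-aware prediction of the kind described (a sketch, not the paper's models) is quantile regression: predicting an upper quantile of demand rather than the mean, so that provisioning at the prediction bounds the violation rate. The features and data here are hypothetical:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical history: hour of day and lagged utilization predicting demand.
rng = np.random.default_rng(7)
hours = rng.integers(0, 24, 2000)
lagged = rng.random(2000)
X = np.c_[hours, lagged]
y = 0.4 * lagged + 0.3 * np.sin(hours / 24 * 2 * np.pi) + rng.normal(0, 0.05, 2000)

# Predict the 95th percentile of demand rather than the mean, so that
# provisioning at the prediction keeps SLA violations near 5%.
q95 = GradientBoostingRegressor(loss="quantile", alpha=0.95).fit(X, y)
q50 = GradientBoostingRegressor(loss="quantile", alpha=0.50).fit(X, y)

x_now = np.array([[14, 0.6]])
print(f"median demand: {q50.predict(x_now)[0]:.3f}")
print(f"95th-percentile demand (provision to this): {q95.predict(x_now)[0]:.3f}")
```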
Procedia PDF Downloads 108
25372 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are so widely distributed that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text that is impressed onto paper, providing evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing a watermark in images. The main consideration for any watermarking scheme is its robustness to various attacks.
Keywords: watermarking, digital, DCT-DWT, security
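A minimal sketch of one member of the hybrid DCT-DWT family the title refers to (the paper's exact sub-band, coefficient positions, and embedding strength are not specified; the choices below are assumptions):

```python
import numpy as np
import pywt
from scipy.fft import dctn, idctn

def embed(cover, watermark_bits, alpha=10.0):
    """Embed bits in the DCT of the DWT LL sub-band."""
    LL, (LH, HL, HH) = pywt.dwt2(cover.astype(float), "haar")
    C = dctn(LL, norm="ortho")
    # Write each bit into one mid-frequency coefficient's sign.
    for k, bit in enumerate(watermark_bits):
        C[3 + k // 8, 3 + k % 8] = alpha if bit else -alpha
    LL_marked = idctn(C, norm="ortho")
    return pywt.idwt2((LL_marked, (LH, HL, HH)), "haar")

def extract(marked, n_bits):
    LL, _ = pywt.dwt2(marked.astype(float), "haar")
    C = dctn(LL, norm="ortho")
    return [int(C[3 + k // 8, 3 + k % 8] > 0) for k in range(n_bits)]

cover = np.random.randint(0, 256, (64, 64))   # placeholder cover image
bits = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed(cover, bits)
assert extract(marked, len(bits)) == bits
```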
Procedia PDF Downloads 422
25371 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations at reducing the cost for the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
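A sketch of the kind of comparison described, assuming a SAD cost and OpenCV color-space conversions; the file names, window size, and pixel coordinates are hypothetical:

```python
import numpy as np
import cv2

def sad_cost(left, right, x, y, d, win=4):
    """Sum-of-absolute-differences matching cost for disparity d at (x, y)."""
    pl = left[y - win:y + win + 1, x - win:x + win + 1]
    pr = right[y - win:y + win + 1, x - d - win:x - d + win + 1]
    return float(np.abs(pl.astype(np.int32) - pr.astype(np.int32)).sum())

left_bgr = cv2.imread("left.png")     # hypothetical rectified stereo pair
right_bgr = cv2.imread("right.png")

# Compare the same cost under different image data representations.
representations = {
    "grayscale": cv2.COLOR_BGR2GRAY,
    "lab": cv2.COLOR_BGR2LAB,
    "hsv": cv2.COLOR_BGR2HSV,
}
x, y, max_disp = 120, 80, 32
for name, code in representations.items():
    l, r = cv2.cvtColor(left_bgr, code), cv2.cvtColor(right_bgr, code)
    costs = [sad_cost(l, r, x, y, d) for d in range(max_disp)]
    print(f"{name}: best disparity = {int(np.argmin(costs))}")
```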
Procedia PDF Downloads 370
25370 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. The recently decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of such techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 432
25369 Acoustic Analysis of Psycho-Communication Disorders within Moroccan Students
Authors: Brahim Sabir
Abstract:
Psycho-communication disorders negatively affect the academic curriculum for students in higher education. Thus, understanding these disorders, their causes, and their effects will give education specialists a decision-making tool that will help resolve problems related to the integration of students with psycho-communication disorders. It is in this context that a statistical study was conducted targeting the population under study, namely Moroccan students. Pathological voice samples were recorded and analyzed acoustically with the PRAAT software in order to build a model that will serve as the basis for objective diagnosis.
Keywords: psycho-communication disorders, acoustic analysis, PRAAT
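As an illustration of the acoustic analysis step (a sketch using parselmouth, a Python interface to the Praat engine, rather than the Praat GUI; the file name is hypothetical, and the study's actual feature set is not specified here):

```python
import parselmouth  # Python interface to the Praat acoustic analysis engine

# Load a recorded voice sample (hypothetical file name) and extract a
# basic acoustic feature, fundamental frequency, of the kind used to
# characterize pathological voices.
snd = parselmouth.Sound("voice_sample.wav")
pitch = snd.to_pitch()
f0 = pitch.selected_array["frequency"]
f0 = f0[f0 > 0]  # drop unvoiced frames, reported as 0 Hz

print(f"mean F0: {f0.mean():.1f} Hz, F0 std: {f0.std():.1f} Hz")
```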
Procedia PDF Downloads 389
25368 Educational System in Developing Countries and E-learning Evaluation in the Face of COVID Pandemic
Authors: Timothy Wale Olaosebikan
Abstract:
The adverse effects of the COVID-19 outbreak and lockdowns on the world economy have caused a major disruption in almost all sectors, and the educational sector is among the most affected in the world. Similarly, most developing countries are still struggling to adopt or adapt to the 21st-century advancement of technology, which includes e-learning and e-education. Furthermore, one is left to wonder whether these countries can survive this disruption to their educational systems, which may no longer be business as usual after the COVID pandemic era. This study evaluates the e-learning processes of educational systems, especially in developing countries. Data for the study were collected through questionnaires, with the sample drawn by stratified random sampling, and were analyzed using descriptive and inferential statistics. The findings show that about 30% of developing countries have fully adopted the e-learning system, about 45% are still struggling to upgrade, and about 25% are yet to adopt it. The study concludes that the sudden closure of educational institutions around the world during the COVID pandemic period should facilitate a teaching pedagogy of e-learning and the virtual delivery of courses and programmes in these developing countries. If this approach can be fully adopted, schools may have to grapple with initial teething problems, given the sudden transition, in order to preserve the welfare of students. While progress is being made in the transition, lectures and seminars can be delivered through the web-conferencing site Zoom; interestingly, this can be done on a mobile phone. The demands of this approach would equally require lecturers to make major changes to their work habits, upload their teaching materials online, and get to grips with what online lecturing entails. Consequently, the study recommends that leaders of developing countries, regulatory authorities, and heads of educational institutions adopt e-learning into their educational systems. Also, e-learning should be adopted into the educational curriculum of students, from elementary school up to the tertiary level. Total compliance with the e-learning system must be ensured on the part of institutions, stakeholders, lecturers, tutors, and students. Finally, collaboration with developed countries and effective funding for e-learning integration must form the heart of their cardinal mission.
Keywords: COVID pandemic, developing countries, educational system, e-learning
Procedia PDF Downloads 102
25367 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as 'fitness for use'; it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, on the other hand, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. First, a research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability. The intervention was to develop and deploy an electronic emergency department information system (eEDIS). Post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements noticed in the accuracy and timeliness dimensions.
Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 274
25366 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing event data on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which provides microscopic vehicle trajectories on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data. We analyze the involvement and relationships of the different variables of each parameter of the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and according to the maneuver it undertook (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
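A sketch of detecting lane-change events from HighD track files (the paper used R; this pandas equivalent assumes the public HighD column names such as laneId, thw, dhw, and ttc; adjust if your export differs):

```python
import pandas as pd

# Hypothetical path to one HighD recording's track file.
tracks = pd.read_csv("01_tracks.csv")

# A lane change is a frame where a vehicle's laneId differs from its
# value in the previous frame.
tracks = tracks.sort_values(["id", "frame"])
tracks["laneChange"] = tracks.groupby("id")["laneId"].diff().fillna(0) != 0

events = tracks[tracks["laneChange"]]
context = events[["id", "frame", "thw", "dhw", "ttc", "xVelocity"]]
print(context.describe())  # distributions of gaps and speeds at the change
```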
Procedia PDF Downloads 261
25365 Determinants of the Users Intention of Social-Local-Mobile Applications
Authors: Chia-Chen Chen, Mu-Yen Chen
Abstract:
In recent years, with the vigorous growth of the hardware and software technologies of smart mobile devices, coupled with the rapidly increasing influence of social networks, mobile commerce has come to represent the mainstream commercial operation mode of the future. SoLoMo has become one of the most popular commercial models; the name refers to users obtaining three key service types through smart mobile devices (Mobile) and omnipresent network services, linking to social web-site platforms (Social) for information exchange, and employing position and situational awareness technology to obtain services suitable for the location (Local). Through anytime, anywhere use of different personal mobile devices, it provides a seamlessly integrated service concept that opens up numerous future opportunities. This study explores users' intention to use SoLoMo mobile applications, proposing a research model that integrates TAM, ISSM, IDT, and network externality, and using questionnaires to collect data and analyze the results to verify the hypotheses. The results show that perceived ease-of-use (PEOU), perceived usefulness (PU), and network externality have a significant impact on the intention to use SoLoMo mobile applications, and that information quality, relative advantage, and observability have impacts on perceived usefulness, further affecting the intention to use.
Keywords: SoLoMo (social, local, and mobile), technology acceptance model, innovation diffusion theory, network externality
Procedia PDF Downloads 528
25364 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and in-plane and cross-plane profiles of Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and the dose distribution. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Materials: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semi-flex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was within 2%. In PDDs, the deviation increases at deeper depths relative to shallower depths. Similarly, the profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between the measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
Keywords: percent depth dose, flatness, symmetry, golden beam data
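A sketch of the PDD comparison step, assuming both curves are resampled onto a common depth grid before computing the local percent deviation; the curves below are placeholders, not beam data:

```python
import numpy as np

# Hypothetical depth grids (mm) and PDD values (%) for one field size.
depth_measured = np.arange(0, 300, 5)
pdd_measured = 100 * np.exp(-0.004 * depth_measured)     # placeholder curve
depth_golden = np.arange(0, 300, 2)
pdd_golden = 100 * np.exp(-0.00405 * depth_golden)       # placeholder curve

# Resample the golden data onto the measured depth grid, then compute
# the local percent deviation at each depth.
golden_on_measured = np.interp(depth_measured, depth_golden, pdd_golden)
deviation = 100 * (pdd_measured - golden_on_measured) / golden_on_measured

print(f"max |deviation|: {np.max(np.abs(deviation)):.2f}%")  # tolerance: 2%
```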
Procedia PDF Downloads 489
25363 Variable-Fidelity Surrogate Modelling with Kriging
Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans
Abstract:
Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, which are enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients
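A gradient-free sketch of the recursive coupling idea (a Kennedy-O'Hagan style two-level CoKriging) under assumed test functions; the paper's gradient enhancement and kernel choices are omitted:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Hypothetical cheap (low-fidelity) and expensive (high-fidelity) functions.
f_lo = lambda x: np.sin(8 * x)
f_hi = lambda x: np.sin(8 * x) + 0.3 * x

x_lo = np.linspace(0, 1, 25)[:, None]   # many cheap samples
x_hi = np.linspace(0, 1, 6)[:, None]    # few expensive samples

gp_lo = GaussianProcessRegressor(RBF(0.1)).fit(x_lo, f_lo(x_lo).ravel())

# Recursive step: model the discrepancy between fidelities,
# y_hi(x) = rho * gp_lo(x) + delta(x).
mu_lo = gp_lo.predict(x_hi)
y_hi = f_hi(x_hi).ravel()
rho = float(np.polyfit(mu_lo, y_hi, 1)[0])   # scale-factor estimate
gp_delta = GaussianProcessRegressor(RBF(0.1)).fit(x_hi, y_hi - rho * mu_lo)

x_test = np.linspace(0, 1, 200)[:, None]
pred = rho * gp_lo.predict(x_test) + gp_delta.predict(x_test)
rmse = np.sqrt(np.mean((pred - f_hi(x_test).ravel()) ** 2))
print(f"RMSE vs true high fidelity: {rmse:.4f}")
```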
Procedia PDF Downloads 558
25362 Robust Barcode Detection with Synthetic-to-Real Data Augmentation
Authors: Xiaoyan Dai, Hsieh Yisan
Abstract:
Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection method using synthetic-to-real data augmentation. We first augment the barcodes themselves; we then augment the images containing the barcodes to generate a large variety of data close to actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance on various real images. In addition, the system uses hybrid resolution for the barcode “scan” and is applicable to real-time applications.
Keywords: barcode detection, data augmentation, deep learning, image-based processing
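A sketch of the second augmentation stage described above, synthetic-to-real image-level transforms approximating shooting conditions, assuming OpenCV; the parameters and file names are hypothetical:

```python
import numpy as np
import cv2

def augment(img, rng):
    """Augmentations approximating shooting conditions: blur,
    illumination change, and perspective distortion."""
    # Defocus blur with a random odd kernel size.
    k = int(rng.integers(1, 4)) * 2 + 1
    img = cv2.GaussianBlur(img, (k, k), 0)
    # Illumination: random gain and bias.
    img = cv2.convertScaleAbs(img, alpha=rng.uniform(0.6, 1.4),
                              beta=rng.uniform(-30, 30))
    # Perspective warp: jitter the four corners.
    h, w = img.shape[:2]
    src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
    dst = (src + rng.uniform(-0.05, 0.05, (4, 2)) * [w, h]).astype(np.float32)
    M = cv2.getPerspectiveTransform(src, dst)
    return cv2.warpPerspective(img, M, (w, h), borderValue=(255, 255, 255))

rng = np.random.default_rng(3)
barcode = cv2.imread("synthetic_barcode.png")     # hypothetical rendered barcode
samples = [augment(barcode, rng) for _ in range(16)]  # training variety
```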
Procedia PDF Downloads 168
25361 Global Migration and Endangered Majorities in Europe
Authors: Liav Orgad
Abstract:
This article challenges one of the most fundamental propositions in democratic theory: that the majority culture is protected merely by the forces of democracy and thus needs no special legal protection. By describing changes in the patterns of migration to Europe, in the face of European society, and in the world as a whole, the article demonstrates that the majority culture is no longer automatically protected by the forces of democracy. It claims that this changing reality is not adequately addressed by political theory and human rights law, and it advances a new concept: 'cultural majority rights'.
Keywords: European migration, European demography, democratic theory, majority rights, integration
Procedia PDF Downloads 400
25360 Analysis of Delivery of Quad Play Services
Authors: Rahul Malhotra, Anurag Sharma
Abstract:
Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper demonstrates the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to the increase in bit error rate.
Keywords: FTTH, quad play, triple play service, access networks, data rate
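The reported trade-off follows from the standard Q-factor relation: as the data rate rises, the received Q-factor falls and the bit error rate grows past the acceptance threshold. A minimal sketch (the Q values are illustrative, not the paper's measurements):

```python
import numpy as np
from scipy.special import erfc

def ber_from_q(q):
    """Bit error rate from Q-factor: BER = 0.5 * erfc(Q / sqrt(2))."""
    return 0.5 * erfc(q / np.sqrt(2))

# As the per-user data rate increases, the received Q-factor degrades,
# pushing BER past a typical 1e-9 threshold and reducing how many users
# the PON can accommodate.
for q in [7.0, 6.0, 5.0, 4.0]:
    print(f"Q = {q}: BER = {ber_from_q(q):.2e}")
```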
Procedia PDF Downloads 414