Search results for: data dissemination
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24455


24125 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests

Authors: Julius Onyancha, Valentina Plekhanova

Abstract:

One of the significant issues facing web users is the amount of noise in web data, which hinders them from finding useful information related to their dynamic interests. Current research treats noise as any data that does not form part of the main web page and proposes noise reduction tools that focus mainly on eliminating noise related to the content and layout of web data. This paper argues that not all data forming part of the main web page is of interest to a user, and that not all noise data is actually noise to a given user. Therefore, learning which of the web data allocated to user requests is noise not only reduces the noise level in a web user profile but also decreases the loss of useful information, and hence improves the quality of the profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to eliminating noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
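The abstract does not give the NWDL algorithm itself; the following is a minimal, hypothetical Python sketch of the core idea, scoring each item of web log data against a continuously updated user-interest profile before deciding whether to discard it as noise. All names (UserInterestProfile, filter_noise, the decay and threshold values) are invented for illustration.

```python
from collections import Counter

class UserInterestProfile:
    """Hypothetical dynamic user-interest profile built from recent requests."""
    def __init__(self, decay=0.9):
        self.weights = Counter()
        self.decay = decay  # older interests fade as the user's interests drift

    def update(self, terms):
        # decay previous interests, then reinforce terms from the new request
        for t in list(self.weights):
            self.weights[t] *= self.decay
        self.weights.update(terms)

    def relevance(self, terms):
        # simple overlap score between a data item and the current profile
        total = sum(self.weights.values()) or 1.0
        return sum(self.weights.get(t, 0.0) for t in terms) / total

def filter_noise(items, profile, threshold=0.05):
    """Keep items whose relevance to the *current* interests exceeds a threshold,
    instead of treating every non-main-content block as noise."""
    kept, noise = [], []
    for terms in items:
        (kept if profile.relevance(terms) >= threshold else noise).append(terms)
    return kept, noise

# toy usage: the profile drifts from 'python' towards 'travel'
profile = UserInterestProfile()
profile.update(["python", "tutorial"])
profile.update(["travel", "flights"])
kept, noise = filter_noise([["travel", "deals"], ["casino", "ads"]], profile)
print(kept, noise)
```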

Keywords: web log data, web user profile, user interest, noise web data learning, machine learning

Procedia PDF Downloads 240
24124 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study

Authors: Zeba Mahmood

Abstract:

Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in information technology systems have changed the way business is conducted today. Business operations now rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvement create additional value for their customers and enhance their operational capabilities. Marketers and customer relationship departments of firms use data mining techniques to make relevant decisions. This paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues in executing these techniques are also discussed and critically analyzed.

Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining

Procedia PDF Downloads 509
24123 Artistic Themes in War Related Comics Contributing to the Portrayal of Sociopolitical Accounts

Authors: Rachel-Kate Bowdler

Abstract:

Wartime efforts, news, and heroic stories are important to the public for understanding the political climate, yet hard to digest. Graphic novels, however, are able to portray intense sociopolitical themes and reinvent such accounts for the public. Modern comics centered around war introduce the historical context to new audiences, thus keeping history relevant and remembered; this is a popular trend in graphic novels for expressing wartime and political stories. Graphic novels make historical accounts and stories easier to understand and more enjoyable to read through creative expression and stylistic choices such as color, design, and personified depictions of characters. This creates a need to analyze intense wartime themes in terms of the artistic style and elements contributing to the portrayal of the story. Whether directly or indirectly, comics became an outlet for discussing and portraying wars, especially following World War II. It may also be relevant that comics are influential in shaping attitudes towards war efforts. By conducting an analysis of comic books relating to wartime stories together with a literature review, this paper seeks to analyze the role that comics play in the dissemination of information and feelings surrounding war efforts and attitudes.

Keywords: artistic style, comics, historical, war, art and culture, journalism and media

Procedia PDF Downloads 80
24122 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data

Authors: Adarsh Shroff

Abstract:

Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large data sets. As the underlying data evolve over time, previously computed mining results become stale and obsolete. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce I/O overhead for accessing preserved fine-grain computation states. To optimize the mining results, i2MapReduce is evaluated using a one-step algorithm and three iterative algorithms with diverse computation characteristics.
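i2MapReduce itself is an extension of Hadoop MapReduce; the snippet below is only a toy, hedged Python illustration of the key idea of key-value pair level incremental processing: re-running the reduce step only for keys whose inputs changed and reusing preserved per-key states for the rest. All function and variable names are illustrative, not part of i2MapReduce.

```python
def word_count_reduce(values):
    return sum(values)

def incremental_update(preserved_states, delta_map_output, reduce_fn):
    """preserved_states: {key: previous reduce result}
       delta_map_output: {key: list of new or changed values} (the 'delta' input)
       Only keys appearing in the delta are recomputed; all other states are reused."""
    updated = dict(preserved_states)
    for key, new_values in delta_map_output.items():
        prior = preserved_states.get(key, 0)
        # combine the preserved fine-grain state with the new contribution
        updated[key] = reduce_fn([prior] + new_values)
    return updated

# first run over the full data produced these per-key states
states = {"big": 3, "data": 5, "mining": 2}
# later, only two keys receive new values, so only they are recomputed
delta = {"data": [1, 1], "streams": [4]}
print(incremental_update(states, delta, word_count_reduce))
# {'big': 3, 'data': 7, 'mining': 2, 'streams': 4}
```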

Keywords: big data, map reduce, incremental processing, iterative computation

Procedia PDF Downloads 322
24121 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach

Authors: Jerry Q. Cheng

Abstract:

Analyzing large-scale recurrent event data currently poses many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method is proposed using parametric frailty models. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual subset. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method, and the approach is applied to a large real dataset of repeated heart failure hospitalizations.
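The abstract does not spell out the weighting scheme; the sketch below illustrates one common choice, inverse-variance weighting of the per-subset maximum likelihood estimates, purely as an assumed example of how the individual estimators might be combined.

```python
import numpy as np

def combine_subset_estimates(estimates, variances):
    """Combine per-subset MLEs of a scalar parameter using inverse-variance
    weights (an assumed scheme, not necessarily the paper's)."""
    estimates = np.asarray(estimates, dtype=float)
    variances = np.asarray(variances, dtype=float)
    weights = (1.0 / variances) / np.sum(1.0 / variances)
    combined = float(np.sum(weights * estimates))
    combined_var = float(1.0 / np.sum(1.0 / variances))
    return combined, combined_var

# toy example: a frailty-model parameter estimated on four random subsets
est = [0.82, 0.79, 0.85, 0.80]
var = [0.010, 0.012, 0.009, 0.011]
theta_hat, theta_var = combine_subset_estimates(est, var)
print(f"combined estimate {theta_hat:.3f}, variance {theta_var:.4f}")
```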

Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing

Procedia PDF Downloads 138
24120 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Despite the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science capabilities that influence BD adoption in chemical industries and that can help the industry move towards intelligent manufacturing quickly and reliably. The article is based on a literature review and identifies potential applications that allow the chemical industry to move from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 58
24119 Secure Multiparty Computations for Privacy Preserving Classifiers

Authors: M. Sumana, K. S. Hareesha

Abstract:

Secure computations are essential when performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data with a third party, as doing so would violate laws protecting the individual. Hence, in order to model the private data without compromising privacy or losing information, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
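The abstract does not name an implementation; as a hedged sketch, the additive homomorphic property can be demonstrated with the python-paillier (phe) package for a two-party secure dot product over vertically partitioned vectors, one of the primitives listed above.

```python
# Two-party secure dot product using additive homomorphic (Paillier) encryption.
# Assumes the python-paillier package: pip install phe
from phe import paillier

# Site A holds vector x (and the key pair); Site B holds vector y and the class value.
x = [2, 0, 5, 1]
y = [1, 3, 2, 4]

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Site A encrypts its vector element-wise and sends the ciphertexts to Site B.
enc_x = [public_key.encrypt(v) for v in x]

# Site B multiplies each ciphertext by its own plaintext component and adds them up;
# by additive homomorphism this yields E(sum_i x_i * y_i) without B ever seeing x.
enc_dot = enc_x[0] * y[0]
for cx, yv in zip(enc_x[1:], y[1:]):
    enc_dot = enc_dot + cx * yv

# Site A decrypts the single returned ciphertext to obtain the dot product.
print(private_key.decrypt(enc_dot))  # 16
```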

Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data

Procedia PDF Downloads 393
24118 Cross Project Software Fault Prediction at Design Phase

Authors: Pradeep Singh, Shrish Verma

Abstract:

Software fault prediction models are created by using the source code, metrics computed from the same or a previous version of the code, and the related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project prediction are comparable to within-company learning.
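As a hedged illustration (file names and metric columns below are placeholders, not the NASA MDP schema), cross-project prediction with Naïve Bayes simply fits the classifier on the design metrics of a project with fault history and tests it on another project.

```python
# Cross-project fault prediction sketch with Gaussian Naive Bayes (scikit-learn).
import pandas as pd
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import classification_report

design_metrics = ["node_count", "edge_count", "branch_count"]  # assumed design-level metrics

source = pd.read_csv("project_with_faults.csv")      # project that has historical fault data
target = pd.read_csv("project_without_faults.csv")   # new project with no fault history

model = GaussianNB()
model.fit(source[design_metrics], source["defective"])

predictions = model.predict(target[design_metrics])
print(classification_report(target["defective"], predictions))
```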

Keywords: software metrics, fault prediction, cross project, within project

Procedia PDF Downloads 315
24117 Comparing Emotion Recognition from Voice and Facial Data Using Time Invariant Features

Authors: Vesna Kirandziska, Nevena Ackovska, Ana Madevska Bogdanova

Abstract:

Emotion recognition is a challenging problem that remains open from the perspective of both intelligent systems and psychology. In this paper, both voice features and facial features are used to build an emotion recognition system. Support Vector Machine classifiers are built using raw data from video recordings. The results obtained for emotion recognition are reported, and a discussion about the validity and the expressiveness of different emotions is presented. A comparison is made between classifiers built from facial data only, from voice data only, and from the combination of both. The need for a better combination of the information from facial expressions and voice data is argued.
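A minimal sketch of the feature-level fusion compared above, assuming pre-extracted voice and facial feature matrices (all arrays below are synthetic stand-ins, not the study's recordings):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 200
voice_features = rng.normal(size=(n, 20))    # stand-in for acoustic features
facial_features = rng.normal(size=(n, 30))   # stand-in for facial features
labels = rng.integers(0, 4, size=n)          # e.g., four emotion classes

# feature-level fusion: concatenate the two modalities before classification
fused = np.hstack([voice_features, facial_features])

for name, X in [("voice only", voice_features),
                ("face only", facial_features),
                ("fused", fused)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
    clf = SVC(kernel="rbf").fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, clf.predict(X_te)))
```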

Keywords: emotion recognition, facial recognition, signal processing, machine learning

Procedia PDF Downloads 293
24116 Cryptosystems in Asymmetric Cryptography for Securing Data on Cloud at Various Critical Levels

Authors: Sartaj Singh, Amar Singh, Ashok Sharma, Sandeep Kaur

Abstract:

With the emerging threats of the digital world, we need to work continuously on security in all its aspects, from hardware to software as well as data modelling. The rise in social media activity and the hunger for data shown by various entities lead to cybercrime and more attacks on the privacy and security of persons. Cryptography has always been employed to prevent access to important data through a variety of processes. Symmetric-key and asymmetric-key cryptography have been used to keep data secret both at rest and in transmission. Various cryptosystems have evolved over time to make data more secure. In this research article, we study various cryptosystems in asymmetric cryptography and their applications and usefulness, with particular emphasis on elliptic curve cryptography, which involves algebraic mathematics.
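As one concrete elliptic-curve example (chosen here for illustration and not taken from the article), the snippet below uses the Python cryptography package to perform an ECDH key agreement on the NIST P-256 curve and derive a symmetric key that could protect data stored in the cloud.

```python
# Elliptic-curve Diffie-Hellman key agreement sketch (pip install cryptography).
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an EC key pair on the P-256 curve.
alice_private = ec.generate_private_key(ec.SECP256R1())
bob_private = ec.generate_private_key(ec.SECP256R1())

# Each party combines its private key with the peer's public key.
alice_shared = alice_private.exchange(ec.ECDH(), bob_private.public_key())
bob_shared = bob_private.exchange(ec.ECDH(), alice_private.public_key())
assert alice_shared == bob_shared  # both sides derive the same shared secret

# Derive a symmetric key from the shared secret, e.g., for encrypting cloud data.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
           info=b"cloud-data-key").derive(alice_shared)
print(key.hex())
```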

Keywords: cryptography, symmetric key cryptography, asymmetric key cryptography

Procedia PDF Downloads 94
24115 Data Recording for Remote Monitoring of Autonomous Vehicles

Authors: Rong-Terng Juang

Abstract:

Autonomous vehicles offer the possibility of significant benefits to social welfare. However, fully automated cars are unlikely to appear in the near future. To speed up the adoption of self-driving technologies, many governments worldwide are passing laws requiring data recorders for the testing of autonomous vehicles. Currently, a self-driving vehicle (e.g., a shuttle bus) has to be monitored from a remote control center. When an autonomous vehicle encounters an unexpected driving environment, such as road construction or an obstruction, it should request assistance from a remote operator. Nevertheless, large amounts of data, including images, radar and lidar data, etc., have to be transmitted from the vehicle to the remote center. Therefore, this paper proposes a data compression method for in-vehicle networks to support remote monitoring of autonomous vehicles. Firstly, the time-series data are rearranged into a multi-dimensional signal space. Upon arrival, data from controller area networks (CAN) are mapped onto a time-data two-dimensional space associated with the specific CAN identity. Secondly, the data are sampled based on differential sampling. Finally, the whole set of data is encoded using existing algorithms such as Huffman, arithmetic, and codebook encoding methods. To evaluate system performance, the proposed method was deployed on an in-house built autonomous vehicle. The testing results show that the amount of data can be reduced to as little as 1/7 of the raw data.
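The paper's exact pipeline is not reproduced here; the sketch below only illustrates the general pattern of delta (differential) encoding a signal from one CAN identity and then entropy-coding the residuals, using zlib as a stand-in for the Huffman/arithmetic/codebook coders mentioned in the abstract.

```python
import zlib
import numpy as np

def compress_can_signal(samples):
    """Delta-encode a time series from one CAN identity, then entropy-code it."""
    samples = np.asarray(samples, dtype=np.int16)
    deltas = np.diff(samples, prepend=0).astype(np.int16)  # differential sampling
    return zlib.compress(deltas.tobytes(), level=9)

def decompress_can_signal(blob):
    deltas = np.frombuffer(zlib.decompress(blob), dtype=np.int16)
    return np.cumsum(deltas).astype(np.int16)

# toy example: a slowly varying wheel-speed-like signal compresses well after deltas
rng = np.random.default_rng(1)
signal = (100 + np.cumsum(rng.integers(-1, 2, 5000))).astype(np.int16)
blob = compress_can_signal(signal)
assert np.array_equal(decompress_can_signal(blob), signal)
print(f"raw {signal.nbytes} bytes -> compressed {len(blob)} bytes")
```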

Keywords: autonomous vehicle, data compression, remote monitoring, controller area networks (CAN), Lidar

Procedia PDF Downloads 137
24114 Different Tools and Complex Approach for Improving Phytoremediation Technology

Authors: T. Varazi, M. Pruidze, M. Kurashvili, N. Gagelidze, M. Sutton

Abstract:

The complex phytoremediation approach presented in this work implies the joint application of natural sorbents, microorganisms, natural biosurfactants, and plants. The approach is based on using natural mineral composites, microorganism strains with high detoxification abilities, plants acting as phytoremediators, and natural biosurfactants that enhance the uptake of pollutant intermediates by plant roots. In this complex phytoremediation strategy, the sorbent serves to take up and trap the pollutants and thus restrain their emission into the environment. The role of the microorganisms is to accomplish the first-stage biodegradation of organic contaminants. This is followed by the application of phytoremediation through the purposeful planting of selected plants. Thus, using these different tools provides restoration of the polluted environment and prevents the dissemination of toxic compounds from hotbeds of pollution for a considerable length of time. The main idea and novelty of the work carried out is the development of a new approach to ecological safety. A wide spectrum of contaminants has been used in this work: the organochlorine pesticide DDT, the heavy metal Cu, an oil hydrocarbon (hexadecane), and wax. The presented complex biotechnology is important from the viewpoint of prevention, providing total rehabilitation of the soil. It is specific to chemical pollutants, ecologically friendly, and provides control of soil erosion.

Keywords: bioremediation, phytoremediation, pollutants, soil contamination

Procedia PDF Downloads 276
24113 Impact of Cytokines Alone and Primed with Macrophages on Balamuthia mandrillaris Interactions with Human Brain Microvascular Endothelial Cells in vitro

Authors: Abdul Matin, Salik Nawaz, Suk-Yul Jung

Abstract:

Balamuthia mandrillaris is well known to cause fatal Balamuthia amoebic encephalitis (BAE). For amoebic transmission into the central nervous system (CNS), haematogenous spread is thought to be the prime step, followed by dissemination across the blood-brain barrier (BBB). Macrophages are considered the foremost line of defense and are present in excessive numbers during amoebic infections. The aim of the present investigation was to evaluate the effects of macrophages, alone or primed with cytokines, on the biological characteristics of Balamuthia in vitro. Using human brain microvascular endothelial cells (HBMEC), which constitute the BBB, we have shown that Balamuthia demonstrated > 90% binding and > 70% cytotoxicity to host cells. However, macrophages further increased amoebic binding and Balamuthia-mediated cell cytotoxicity. Furthermore, macrophages exhibited no amoebicidal effect against Balamuthia. A zymography assay demonstrated that macrophages had no inhibitory effect on the proteolytic activity of Balamuthia. Overall, to the best of our knowledge, we have shown for the first time that macrophages have no inhibitory effect on the biological properties of Balamuthia in vitro. This also strengthens the understanding of how and why Balamuthia can cause infections in both immuno-competent and immuno-compromised individuals.

Keywords: Balamuthia mandrillaris, macrophages, cytokines, human brain microvascular endothelial cells, Balamuthia amoebic encephalitis

Procedia PDF Downloads 132
24112 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory

Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan

Abstract:

Data fusion technology can be an effective way to extract useful information from multiple sources of data and has been widely applied in various applications. This paper presents a data fusion approach for multimedia data in Twitter event detection using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. Two types of data are fused. The first is features extracted from text using the bag-of-words method, calculated with term frequency-inverse document frequency (TF-IDF). The second is visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments indicate that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy of event detection. The experimental results show that the proposed method achieved a high accuracy of 0.97, compared with 0.93 using texts only and 0.86 using images only.
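As a hedged illustration of the fusion step (the mass values below are invented and are not the paper's), Dempster's rule of combination for evidence from the text-based and image-based classifiers over the frame {event, no_event} can be written as follows.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses to mass)
    with Dempster's rule; mass assigned to empty intersections is normalized away."""
    combined, conflict = {}, 0.0
    for (a, w1), (b, w2) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

EVENT, NO_EVENT = frozenset({"event"}), frozenset({"no_event"})
THETA = EVENT | NO_EVENT  # ignorance: either hypothesis may hold

# hypothetical evidence from the text (TF-IDF) and image (SIFT) classifiers
m_text = {EVENT: 0.7, NO_EVENT: 0.2, THETA: 0.1}
m_image = {EVENT: 0.6, NO_EVENT: 0.1, THETA: 0.3}
print(dempster_combine(m_text, m_image))
```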

Keywords: data fusion, Dempster-Shafer theory, data mining, event detection

Procedia PDF Downloads 380
24111 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, sometimes producing ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is paid to the scope of ‘consent’ as a legal basis for the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 107
24110 Adaptive Data Approximations Codec (ADAC) for AI/ML-based Cyber-Physical Systems

Authors: Yong-Kyu Jung

Abstract:

The fast growth of information technology has led to growing demands to access and process data. Cyber-physical systems (CPSs) depend heavily on the timing of hardware/software operations and communication over the network, i.e., on real-time and parallel operations (e.g., autonomous vehicles). Data processing is therefore an important means of overcoming the issues confronting data management by reducing the gap between technological growth on one side and data complexity and channel bandwidth on the other. An adaptive perpetual data approximation method is introduced to manage the actual entropy of the digital spectrum. The ADAC, implemented as an accelerator and/or as applications for servers and smart connected devices, adaptively rescales digital content (by 62.8% on average) and reduces data processing/access time, energy, and encryption/decryption overheads in AI/ML applications such as facial ID/recognition.

Keywords: adaptive codec, AI, ML, HPC, cyber-physical, cybersecurity

Procedia PDF Downloads 58
24109 Establishing a Communication Framework in Response to the COVID-19 Pandemic in a Tertiary Government Hospital in the Philippines

Authors: Nicole Marella G. Tan, Al Joseph R. Molina, Raisa Celine R. Rosete, Soraya Elisse E. Escandor, Blythe N. Ke, Veronica Marie E. Ramos, Apolinario Ericson B. Berberabe, Jose Jonas D. del Rosario, Regina Pascua-Berba, Eileen Liesl A. Cubillan, Winlove P. Mojica

Abstract:

Emergency risk and health communications play a vital role in any pandemic response. However, the Philippine General Hospital (PGH) lacked a system of information delivery that could effectively fulfill the hospital’s communication needs as a COVID-19 referral hospital. This study aimed to describe the establishment of a communication framework for information dissemination within a tertiary government hospital during the COVID-19 pandemic and evaluated the perceived usefulness of its outputs. This is a mixed quantitative-qualitative study with two phases. Phase 1 documented the formation and responsibilities of the Information Education Communication (IEC) Committee. Phase 2 evaluated its outputs and outcomes through a hospital-wide survey of 528 healthcare workers (HCWs) using a pre-tested questionnaire. In-depth explanations were obtained from five focus group discussions (FGDs) amongst various HCW subgroups. Descriptive analysis was done using STATA 16, while qualitative data were synthesized thematically. Communication practices in PGH were loosely structured at the beginning of the pandemic until the establishment of the IEC Committee. The IEC Committee was well represented by the concerned stakeholders. Nine types of infographics tackled different aspects of the hospital’s health operations after thorough input from the concerned offices. Internal and external feedback mechanisms ensured accurate infographics. The majority of the survey respondents (98.67%) perceived these as useful in their work or daily lives. FGD participants cited the relevance of the infographics to their occupations, suggested improvements, and hoped that these efforts would be continued in the future. Sustainability and comprehensive reach were the main concerns in this undertaking. The PGH COVID-19 IEC framework was developed through trial and testing, as there were no existing formal structures for communicating health risks and properly directing HCWs in the chaotic time of a pandemic. It is a continuously evolving framework that HCWs perceive as useful and that is hoped to be sustained in the future.

Keywords: COVID-19, pandemic, health communication, infographics, social media

Procedia PDF Downloads 102
24108 Real-Time Visualization Using GPU-Accelerated Filtering of LiDAR Data

Authors: Sašo Pečnik, Borut Žalik

Abstract:

This paper presents a technique for real-time visualization and filtering of classified LiDAR point clouds. The visualization is capable of displaying filtered information organized in layers by the classification attribute saved within LiDAR data sets. We explain the data structure and data management used, which enable real-time presentation of layered LiDAR data. Real-time visualization is achieved with level-of-detail (LOD) optimization based on the distance from the observer, without loss of quality. The filtering process is done in two steps, is executed entirely on the GPU, and is implemented using programmable shaders.
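The actual implementation runs on the GPU in programmable shaders; the NumPy sketch below only mirrors the two ideas on the CPU side, keeping the classification layers the user has switched on and thinning distant points for level-of-detail, with all arrays generated synthetically.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
points = rng.uniform(0, 1000, size=(n, 3))      # synthetic x, y, z coordinates
classification = rng.integers(0, 7, size=n)     # ASPRS-like classification codes
observer = np.array([500.0, 500.0, 0.0])

def filter_and_lod(points, classification, visible_classes, observer, lod_step=200.0):
    """Keep only points in the enabled classification layers, then subsample
    progressively more aggressively the farther a point lies from the observer."""
    layer_mask = np.isin(classification, list(visible_classes))
    pts = points[layer_mask]
    dist = np.linalg.norm(pts - observer, axis=1)
    # every lod_step of distance doubles the decimation factor (1, 2, 4, ...)
    keep_every = 2 ** np.floor(dist / lod_step).astype(int)
    lod_mask = (np.arange(len(pts)) % keep_every) == 0
    return pts[lod_mask]

visible = {2, 5, 6}  # e.g., ground, high vegetation, buildings
rendered = filter_and_lod(points, classification, visible, observer)
print(f"{len(rendered)} of {n} points sent to the renderer")
```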

Keywords: filtering, graphics, level-of-details, LiDAR, real-time visualization

Procedia PDF Downloads 278
24107 The Shadow of Terrorism in the World Tourism Industry: Impacts, Prevention and Recovery Strategies

Authors: Maria Brás

Abstract:

The main purpose of this presentation is to identify the impacts of terrorism on tourism and the appropriate measures to prevent potential attacks, or to minimize the risk of an attack at a tourist destination. Terrorism has been growing in the shadow of unpredictability; however, it is possible to minimize the danger of a terrorist attack through (1) recognition, (2) evaluation, (3) avoidance, and (4) threat reduction. The vulnerability of the tourism industry to terrorism is an undeniable fact, and terrorists know it. They use this advantage, attacking tourists for very specific reasons, such as: (1) international coverage by the media (“if it bleeds, it leads”); (2) the chance of reaching different nationalities at the same place and time; (3) the possibility of destroying the economy of a destination, or destinations (the “terrorism contamination effect”), through the reduction of tourist demand; and (4) psychological and social disruption based on fear of negative consequences. Preventing security incidents such as terrorism involves different measures that can be conducted in partnership with the tourism industry (hotels, airports, tourist attractions, among others), central government, the public and/or private sector, the local community, and the media. Recovery strategies must be based on disseminating positive information to the media, creating new marketing strategies that emphasize the social and cultural values of the destination, encouraging domestic tourism, and obtaining government or state financial support.

Keywords: terrorism, tourism, safety, security, impacts, prevention, recovery

Procedia PDF Downloads 320
24106 Estimating Destinations of Bus Passengers Using Smart Card Data

Authors: Hasik Lee, Seung-Young Kho

Abstract:

Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, smart card data from many cities do not contain alighting information, which is necessary to build origin-destination (OD) matrices. Therefore, in order to utilize smart card data, the destinations of passengers have to be estimated. In this paper, kernel density estimation was used to forecast the probabilities of the alighting stations of bus passengers and was applied to smart card data from Seoul, Korea, which contains both boarding and alighting information. The method was also validated against the actual data. In some cases, the stochastic method was more accurate than the deterministic method. It is therefore sufficiently accurate to be used to build OD matrices.
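The sketch below, with made-up stop coordinates, shows the general shape of such an estimator: fit a kernel density over the locations where a card holder has historically alighted and score the candidate stops on the current route.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# hypothetical historical alighting locations (x, y in km) for one card holder
history = np.array([[2.1, 5.0], [2.3, 5.2], [2.0, 4.9], [7.8, 1.1], [2.2, 5.1]])

kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(history)

# candidate alighting stops downstream of the boarding stop on the current trip
candidate_stops = np.array([[1.9, 5.0], [4.5, 3.0], [7.9, 1.0]])
log_density = kde.score_samples(candidate_stops)
probs = np.exp(log_density) / np.exp(log_density).sum()

for stop, p in zip(candidate_stops, probs):
    print(stop, round(float(p), 3))
print("estimated destination:", candidate_stops[np.argmax(probs)])
```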

Keywords: destination estimation, Kernel density estimation, smart card data, validation

Procedia PDF Downloads 327
24105 Politics in Academia: How the Diffusion of Innovation Relates to Professional Capital

Authors: Autumn Rooms Cypres, Barbara Driver

Abstract:

The purpose of this study is to extend discussions about innovations and career politics. The research questions that grounded this effort were: How does an academic learn the unspoken rules of the academy? What happens politically to an academic’s career when their research speaks against the grain of society? Do professors perceive signals that it is time to move on to another institution or even to another career? Epistemology and Methods: This qualitative investigation focused on examining the perceptions of academics, so an open-ended field study based on Grounded Theory was used. This naturalistic paradigm (Lincoln & Guba, 1985) was selected because it tends to understand information in terms of wholes and patterns and in relation to the context of the environment. The technique for gathering data was semi-structured, in-depth interviewing. Twenty-five academics across the United States were interviewed about their career trajectories and the politics and opportunities they have encountered in relation to their research efforts. Findings: The analysis of the interviews revealed four themes. Academics are beholden to two specific networks of power that influence their sense of job security: the local network based on their employing university and the national network of scholars who share the same field of research. The fights over what counts as research can and do drift from the intellectual to the political and the personal. Academics were able to identify specific instances of shunning or punishment from their colleagues related directly to the dissemination of research that spoke against the grain of the local or national networks. Academics also identified specific signals from both of these networks indicating that their career was flourishing or withering. Implications: This research examined insights from those who persevered when the fights over what and who counts drifted from the intellectual to the political and the personal. Considerations of why such drifts happen were offered in the form of a socio-political construct called Fit, which included thoughts on hegemony, discourse, and identity. This effort reveals the importance of understanding what professional capital is relative to job security. It also reveals that fear is an enmeshed and often unspoken part of the culture of academia. Further research to triangulate these findings within international contexts would be helpful.

Keywords: politics, academia, job security, context

Procedia PDF Downloads 301
24104 Acute Asthma in Emergency Department, Prevalence of Respiratory and Non-Respiratory Symptoms

Authors: Sherif Refaat, Hassan Aref

Abstract:

Background: Although asthma is a well-identified presentation to the emergency department, little is known about the frequency and percentage of respiratory and non-respiratory symptoms in patients with acute asthma in the emergency department (ED). Objective: The aim of this study is to identify the relationship between acute asthma exacerbation and the different respiratory and non-respiratory symptoms, including chest pain, encountered by patients visiting the emergency department. Subjects and methods: This prospective study included 169 asthmatic patients (97 females and 72 males) who were admitted to the emergency departments of two tertiary care hospitals for asthma exacerbation from September 2010 to August 2013; an anonymous questionnaire was used to collect and analyze the symptoms. Results: Females accounted for 97 (57%) of the patients, and the mean age was 35.6 years. Dyspnea on exertion was the commonest symptom, reported by 161 (95.2%) of the patients, followed by dyspnea at rest in 155 (91.7%) and wheezing in 152 (89.9%). Chest pain was present in 82 patients (48.5%), and the pain was burning in character in 36 (43.9%) of the patients with chest pain. Non-respiratory symptoms were seen frequently in acute asthma in the ED. Conclusions: Dyspnea was the commonest chest symptom encountered in patients with acute asthma, followed by wheezing. Chest pain in acute asthma is a common symptom and should be studied fully to exclude misdiagnosis as being of cardiac origin; there is a need for better dissemination of knowledge about the association of this disease with chest pain. It was also noted that other non-respiratory symptoms are frequently encountered with acute asthma in the emergency department.

Keywords: asthma, emergency department, respiratory symptoms, non-respiratory symptoms

Procedia PDF Downloads 400
24103 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4

Authors: Jae Won Shin

Abstract:

We develop an evaluated-nuclear-data-based photonuclear reaction model for GEANT4 to enable a more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input. Incident photon energies up to 140 MeV, the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are found to be in good agreement with the experimental data for (γ,xn) reactions.

Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction

Procedia PDF Downloads 252
24102 Case-Based Reasoning for Modelling Random Variables in the Reliability Assessment of Existing Structures

Authors: Francesca Marsili

Abstract:

The reliability assessment of existing structures with probabilistic methods is becoming an increasingly important and frequent engineering task. However, probabilistic reliability methods are based on exhaustive knowledge of the stochastic modeling of the variables involved in the assessment; at the moment, standards for the modeling of variables are absent, which represents an obstacle to the dissemination of probabilistic methods. The framework according to which probability distribution functions (PDFs) are established is Bayesian statistics, which uses Bayes' theorem: a prior PDF for the considered parameter is established based on information derived from the design stage and on qualitative judgments from the engineer's past experience; the prior model is then updated with the results of investigations carried out on the considered structure, such as material testing and the determination of action and structural properties. The application of Bayesian statistics raises two kinds of problems: 1. the results of the updating depend on the engineer's previous experience; 2. the updating of the prior PDF can be performed only if the structure has been tested and quantitative data that can be statistically manipulated have been collected; performing tests is always an expensive and time-consuming operation, and if the considered structure is an ancient building, destructive tests could compromise its cultural value and should therefore be avoided. In order to solve these problems, an interesting research path is to investigate Artificial Intelligence (AI) techniques that can be useful for automating the modeling of variables and for updating material parameters without performing destructive tests. Among these, one that attracts particular attention in relation to the object of this study is Case-Based Reasoning (CBR). In this application, cases are represented by existing buildings where material tests have already been carried out and updated PDFs for the material mechanical parameters have been computed through a Bayesian analysis. Each case is thus composed of a qualitative description of the material under assessment and the posterior PDFs that describe its material properties. The problem to be solved is the definition of PDFs for the material parameters involved in the reliability assessment of the considered structure. A CBR system represents a good candidate for automating the modelling of variables because: 1. engineers already estimate material properties based on experience collected during the assessment of similar structures, or on similar cases collected in the literature or in databases; 2. material tests carried out on structures can easily be collected from laboratory databases or from the literature; 3. the system will provide the user with a reliable probabilistic description of the variables involved in the assessment, which will also serve as a tool in support of the engineer's qualitative judgments. Automated modeling of variables can help spread probabilistic reliability assessment of existing buildings in common engineering practice and help target the best intervention and further tests on the structure; CBR represents a technique which may help to achieve this.
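As a worked example of the Bayesian updating step that produces the posterior PDFs stored with each case (the prior values and test results below are invented), a normal prior for a material strength with known measurement uncertainty can be updated in closed form.

```python
import numpy as np

def update_normal_prior(mu0, sigma0, tests, sigma_meas):
    """Conjugate update of a Normal(mu0, sigma0^2) prior for a material parameter
    given test results with known measurement std sigma_meas; returns the posterior."""
    n = len(tests)
    prec_prior = 1.0 / sigma0**2
    prec_data = n / sigma_meas**2
    mu_post = (prec_prior * mu0 + prec_data * np.mean(tests)) / (prec_prior + prec_data)
    sigma_post = np.sqrt(1.0 / (prec_prior + prec_data))
    return mu_post, sigma_post

# invented example: concrete compressive strength [MPa]
prior_mu, prior_sigma = 30.0, 5.0   # prior from design documents and similar cases
core_tests = [27.5, 29.0, 26.8]     # in-situ core test results
meas_sigma = 3.0                    # assumed testing uncertainty

mu_post, sigma_post = update_normal_prior(prior_mu, prior_sigma, core_tests, meas_sigma)
print(f"posterior strength: mean {mu_post:.1f} MPa, std {sigma_post:.2f} MPa")
```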

Keywords: reliability assessment of existing buildings, Bayesian analysis, case-based reasoning, historical structures

Procedia PDF Downloads 317
24101 Optimizing Communications Overhead in Heterogeneous Distributed Data Streams

Authors: Rashi Bhalla, Russel Pears, M. Asif Naeem

Abstract:

In this 'Information Explosion Era', analyzing data, a critical commodity, and mining knowledge from vertically distributed data streams incur a huge communication cost. However, efforts to decrease communication in a distributed environment can adversely affect classification accuracy; therefore, a research challenge lies in maintaining a balance between transmission cost and accuracy. This paper proposes a method based on Bayesian inference to reduce the communication volume in a heterogeneous distributed environment while retaining prediction accuracy. Our experimental evaluation reveals that a significant reduction in communication can be achieved across a diverse range of dataset types.
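The abstract does not give the protocol itself; as a loosely related sketch under stated assumptions, a coordinator can suppress transmissions from a remote site whenever a Bayesian classifier over the locally available attributes is already confident, requesting the remote attributes only for ambiguous instances. Everything below (models, threshold, synthetic data) is illustrative.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))
y = (X[:, :3].sum(axis=1) > 0).astype(int)

local_view, _remote_view = X[:, :3], X[:, 3:]   # vertically partitioned attributes
local_model = GaussianNB().fit(local_view[:500], y[:500])
full_model = GaussianNB().fit(X[:500], y[:500])

transmitted, correct = 0, 0
for i in range(500, 1000):
    posterior = local_model.predict_proba(local_view[i:i + 1])[0]
    if posterior.max() >= 0.9:      # confident: skip the remote transmission
        pred = posterior.argmax()
    else:                           # ambiguous: request the remote attributes
        transmitted += 1
        pred = full_model.predict(X[i:i + 1])[0]
    correct += int(pred == y[i])

print(f"accuracy {correct / 500:.3f}, remote transmissions {transmitted}/500")
```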

Keywords: big data, bayesian inference, distributed data stream mining, heterogeneous-distributed data

Procedia PDF Downloads 138
24100 Data Privacy: Stakeholders’ Conflicts in Medical Internet of Things

Authors: Benny Sand, Yotam Lurie, Shlomo Mark

Abstract:

Medical Internet of Things (MIoT), AI, and data privacy are linked forever in a Gordian knot. This paper explores the conflicts of interest between the stakeholders regarding data privacy in the MIoT arena. While patients are at home during healthcare hospitalization, MIoT can play a significant role in improving the health of large parts of the population by providing medical teams with tools for collecting data, monitoring patients’ health parameters, and even enabling remote treatment. While the amount of data handled by MIoT devices grows exponentially, different stakeholders have conflicting understandings of and concerns about this data. The findings of the research indicate that medical teams are not concerned by the violation of the data privacy rights of patients in in-home healthcare, while patients are more troubled and, in many cases, are unaware that their data is being used without their consent. MIoT technology is in its early phases, and hence a mixed qualitative and quantitative research approach is used, including case studies and questionnaires, in order to explore this issue and provide alternative solutions.

Keywords: MIoT, data privacy, stakeholders, home healthcare, information privacy, AI

Procedia PDF Downloads 77
24099 The Economic Impact of Mediation: An Analysis in Time of Crisis

Authors: C. M. Cebola, V. H. Ferreira

Abstract:

In the past decade, mediation has been legally implemented in European legal systems, especially after the publication by the European Union of Directive 2008/52/EC on certain aspects of mediation in civil and commercial matters. Developments in international trade and globalization in this new century have led to an increase in the number of litigations, often cross-border, and the courts have failed to respond adequately. We do not advocate that mediation should be promoted as the solution to all justice problems, but rather as a means with its own specificities that the parties may choose as the best way to resolve their disputes. Thus, the implementation of mediation should be based on the advantages of its application. From an economic point of view, competitive negotiation can generate negative external effects in social terms. A solution reached in a court of law is not always the most efficient one when all elements of society are considered (economic social benefit). On the other hand, the administration of justice adds, in economic terms, transaction costs that can be mitigated by the application of other forms of conflict resolution, such as mediation. In this paper, the economic benefits of mediation are analysed in the light of various studies on the functioning of justice. Several theoretical arguments are confronted with empirical studies to demonstrate that mediation has significant positive economic effects. The objective is to contribute to the dissemination of mediation among companies and citizens, but also to demonstrate the cost to governments and states of the still limited use of mediation, particularly in the current economic crisis, and to propose actions to develop the application of mediation.

Keywords: economic impact, litigation costs, mediation, solutions

Procedia PDF Downloads 263
24098 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations

Authors: Deepak Singh, Rail Kuliev

Abstract:

This paper highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, reskilling and upskilling employees, and establishing robust data management training programs play an essential and integral part in this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), artificial intelligence (AI), and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. By embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.

Keywords: master data management, IoT, AI&ML, cloud Computing, data optimization

Procedia PDF Downloads 43
24097 Influence of Parameters of Modeling and Data Distribution for Optimal Condition on Locally Weighted Projection Regression Method

Authors: Farhad Asadi, Mohammad Javad Mollakazemi, Aref Ghafouri

Abstract:

Recent research in neural network science and neuroscience on modeling complex time series data and statistical learning has focused mostly on learning from high-dimensional input spaces and signals. Local linear models are a strong choice for modeling local nonlinearity in data series. Locally weighted projection regression (LWPR) is a flexible and powerful algorithm for nonlinear approximation in high-dimensional signal spaces. In this paper, different learning scenarios for one- and two-dimensional data series with different distributions are investigated in simulation; noise is then added to the data distributions to create differently disordered time series and to evaluate the algorithm's ability to predict nonlinearity locally. The performance of the algorithm is then simulated, and its sensitivity to the data distribution, in particular when the spread of the data is large or when the number of data points is small, together with the influence of the important local-validity parameter under different data distributions, is explained.
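Full LWPR maintains incrementally trained local projection directions; the much smaller sketch below only illustrates the underlying locally weighted (kernel-weighted) linear fit and the role of the locality (bandwidth) parameter whose sensitivity the paper studies. All values are synthetic.

```python
import numpy as np

def locally_weighted_predict(x_query, X, y, bandwidth=0.2):
    """Predict y at x_query with a linear model whose training points are weighted
    by a Gaussian kernel; `bandwidth` plays the role of the local-validity parameter."""
    X = np.asarray(X, dtype=float).reshape(-1, 1)
    Xb = np.hstack([np.ones_like(X), X])                 # bias term + input
    w = np.exp(-0.5 * ((X[:, 0] - x_query) / bandwidth) ** 2)
    W = np.diag(w)
    beta = np.linalg.solve(Xb.T @ W @ Xb, Xb.T @ W @ y)  # weighted least squares
    return beta[0] + beta[1] * x_query

# one-dimensional nonlinear data with additive noise, as in the simulated scenarios
rng = np.random.default_rng(3)
X = np.linspace(0, 2 * np.pi, 200)
y = np.sin(X) + rng.normal(scale=0.1, size=X.shape)

for bw in (0.05, 0.2, 1.0):   # narrow, moderate, and wide locality
    pred = locally_weighted_predict(np.pi / 2, X, y, bandwidth=bw)
    print(f"bandwidth {bw:.2f}: prediction at pi/2 = {pred:.3f} (true value 1.0)")
```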

Keywords: local nonlinear estimation, LWPR algorithm, online training method, locally weighted projection regression method

Procedia PDF Downloads 471
24096 Big Data Strategy for Telco: Network Transformation

Authors: F. Amin, S. Feizi

Abstract:

Big data has the potential to improve the quality of services, enable the infrastructure that businesses depend on to adapt continually and efficiently, improve the performance of employees, help organizations better understand customers, and reduce liability risks. The analytics and marketing models of fixed and mobile operators are falling short in combating churn and declining revenue per user. Big data presents a new way to reverse this trend and improve profitability. The benefits of big data and next-generation networks, however, extend well beyond improved customer relationship management. Next-generation networks are in a prime position to monetize rich supplies of customer information while being mindful of legal and privacy issues. As data assets are transformed into new revenue streams, they will become integral to high performance.

Keywords: big data, next generation networks, network transformation, strategy

Procedia PDF Downloads 336