Search results for: global analytics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5318

5228 Scaling Siamese Neural Network for Cross-Domain Few Shot Learning in Medical Imaging

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Cross-domain learning in the medical field is a research challenge because many conditions, as in oncology imaging, are captured with different imaging modalities. Moreover, in most medical learning applications, the training sample size is relatively small. Although few-shot learning (FSL) with a Siamese neural network can be trained on a small sample with remarkable accuracy, FSL is not effective across multiple domains because its convolution weights are tuned to task-specific applications. In this paper, we address this problem by enabling FSL to shift across domains: we design a two-layer FSL network in which the first layer learns from each domain individually and produces a shared feature map with extra modulation, and the second layer uses this map to recognize important targets from mixed domains. Our initial experiments on mixed medical datasets such as Medical-MNIST show promising results. We aim to continue this research and perform full-scale analytics to test our cross-domain FSL approach.
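
The abstract describes the architecture only in prose; the sketch below is one plausible reading of it, assuming PyTorch, 28×28 grayscale inputs, a FiLM-style channel-wise scale as the "extra modulation", and a distance-based comparison head. All of these choices are assumptions for illustration, not the authors' published design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedEncoder(nn.Module):
    """Backbone shared across domains; produces a feature map."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
        )

    def forward(self, x):
        return self.conv(x)                      # (B, 64, 4, 4)

class CrossDomainSiamese(nn.Module):
    """Siamese pair with a per-domain channel scaling (the assumed
    'extra modulation') applied to the shared feature map."""
    def __init__(self, num_domains=2, embed_dim=128):
        super().__init__()
        self.encoder = SharedEncoder()
        self.modulation = nn.Embedding(num_domains, 64)  # one scale vector per domain
        self.head = nn.Linear(64 * 4 * 4, embed_dim)

    def embed(self, x, domain_id):
        fmap = self.encoder(x)
        scale = self.modulation(domain_id).unsqueeze(-1).unsqueeze(-1)
        return self.head((fmap * scale).flatten(1))

    def forward(self, x1, d1, x2, d2):
        z1, z2 = self.embed(x1, d1), self.embed(x2, d2)
        # small distance => the pair is judged to show the same target/class
        return F.pairwise_distance(z1, z2)

# usage: two batches of 28x28 grayscale images from two different domains
model = CrossDomainSiamese()
x1, x2 = torch.randn(8, 1, 28, 28), torch.randn(8, 1, 28, 28)
d1, d2 = torch.zeros(8, dtype=torch.long), torch.ones(8, dtype=torch.long)
dist = model(x1, d1, x2, d2)                     # shape (8,)
```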

Keywords: Siamese neural network, few-shot learning, meta-learning, metric-based learning, thick data transformation and analytics

Procedia PDF Downloads 10
5227 Location-Domination on Join of Two Graphs and Their Complements

Authors: Analen Malnegro, Gina Malacas

Abstract:

Dominating sets and related topics have been studied extensively in the past few decades. A dominating set of a graph G is a subset D of V such that every vertex not in D is adjacent to at least one member of D. The domination number γ(G) is the number of vertices in a smallest dominating set for G. Some problems involving detection devices can be modeled with graphs. Finding the minimum number of devices needed according to the type of devices and the necessity of locating the object gives rise to locating-dominating sets. A subset S of vertices of a graph G is called a locating-dominating set, LD-set for short, if it is a dominating set and if every vertex v not in S is uniquely determined by the set of neighbors of v belonging to S. The location-domination number λ(G) is the minimum cardinality of an LD-set for G. The complement of a graph G is a graph Ḡ on the same vertex set such that two distinct vertices of Ḡ are adjacent if and only if they are not adjacent in G. An LD-set of a graph G is global if it is an LD-set of both G and its complement Ḡ. The global location-domination number λg(G) is defined as the minimum cardinality of a global LD-set of G. In this paper, global LD-sets on the join of two graphs are characterized. Global location-domination numbers of these graphs are also determined.
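
For reference, the definitions used in the abstract can be written compactly in standard notation, where N(v) denotes the open neighborhood of v and Ḡ the complement of G:

```latex
\[
\begin{aligned}
&D \subseteq V(G) \text{ is dominating} &&\iff \forall v \in V(G)\setminus D:\ N(v)\cap D \neq \emptyset,\\
&S \subseteq V(G) \text{ is an LD-set} &&\iff S \text{ is dominating and } \forall u \neq v \in V(G)\setminus S:\ N(u)\cap S \neq N(v)\cap S,\\
&\lambda(G) = \min\{|S| : S \text{ is an LD-set of } G\},\qquad
\lambda_g(G) = \min\{|S| : S \text{ is an LD-set of both } G \text{ and } \overline{G}\}.
\end{aligned}
\]
```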

Keywords: dominating set, global locating-dominating set, global location-domination number, locating-dominating set, location-domination number

Procedia PDF Downloads 156
5226 The Impact of Global Financial Crises and Corporate Financial Crisis (Bankruptcy Risk) on Corporate Tax Evasion: Evidence from Emerging Markets

Authors: Seyed Sajjad Habibi

Abstract:

The aim of this study is to investigate the impact of global financial crises and corporate financial crisis on tax evasion by companies listed on the Tehran Stock Exchange. For this purpose, panel data from the global financial crisis period (2007 to 2012) and from years without a financial crisis (2004, 2005, 2006, 2013, 2014, and 2015) were analyzed using multivariate linear regression. The results indicate a significant relationship between corporate financial crisis (bankruptcy risk) and tax evasion in the global financial crisis period. The results also show a significant relationship between corporate bankruptcy risk and tax evasion in the period with no global financial crisis. The relationship between bankruptcy risk and tax evasion differed significantly between the crisis and non-crisis periods, with tax evasion increasing in the financial crisis period.
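
The abstract names multivariate linear regression on panel data but gives no specification; below is a minimal sketch of one way such a test could be set up, assuming pandas/statsmodels and hypothetical variable names (tax_evasion, bankruptcy_risk, crisis). None of the names or numbers come from the paper.

```python
import pandas as pd
import statsmodels.formula.api as smf

# hypothetical firm-year panel: a tax-evasion proxy, a bankruptcy-risk score,
# and a dummy marking the 2007-2012 crisis years
df = pd.DataFrame({
    "tax_evasion":     [0.12, 0.18, 0.09, 0.22, 0.15, 0.25, 0.11, 0.20],
    "bankruptcy_risk": [1.4, 2.1, 1.1, 2.6, 1.8, 2.9, 1.3, 2.4],
    "crisis":          [0, 1, 0, 1, 0, 1, 0, 1],
})

# the interaction term tests whether the risk-evasion relationship differs
# between crisis and non-crisis years
model = smf.ols("tax_evasion ~ bankruptcy_risk * crisis", data=df).fit()
print(model.summary())
```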

Keywords: global financial crisis, corporate financial crisis, bankruptcy risk, tax evasion risk, emerging markets

Procedia PDF Downloads 251
5225 Semantic Based Analysis in Complaint Management System with Analytics

Authors: Francis Alterado, Jennifer Enriquez

Abstract:

Semantic Based Analysis in Complaint Management System with Analytics is an enhanced tool that lets clients submit complaints and gives Palawan Polytechnic College a mechanism to gather, process, and monitor the status of those complaints. The study includes a mobile application that serves as a remote channel of communication between students and the school management on the issues encountered by students and the resolution of every complaint received. Text mining and clustering algorithms were utilized to process the complaints. Every module of the system was tested and, based on the results, was 100% free from error before integration. System testing was also done by checking the expected functionality of the system, which was 100% functional. The system was further tested by 10 students forwarding complaints to 10 departments. Based on the results, the students were able to submit complaints, the system processed them accordingly by identifying the department each complaint was intended for, and the concerned department was able to give the student feedback on the complaint received. With this, the system earned a rating of 4.7, which corresponds to Excellent.
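
The abstract mentions text mining and clustering for routing complaints but not the specific algorithms; a minimal, hypothetical sketch of one common pipeline (TF-IDF features plus k-means with scikit-learn) is shown below. The complaint texts and department mapping are invented for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# toy complaint texts; in the study these would arrive via the mobile app
complaints = [
    "wifi in the library is not working",
    "enrollment portal rejects my payment",
    "projector in room 204 is broken",
    "cashier overcharged my tuition fee",
]

# vectorize and cluster; each cluster would then be mapped to a department
tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(complaints)
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

for text, label in zip(complaints, km.labels_):
    print(label, text)   # label -> e.g. {0: "IT services", 1: "Finance office"}
```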

Keywords: technology adoption, emerging technology, issues, challenges, algorithm, text mining, mobile technology

Procedia PDF Downloads 172
5224 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies, and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from both an empirical and a theoretical perspective. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow structurally important nodes to be identified on a global, national, and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others while serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
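
The abstract reports power-law-distributed sectoral eigenvector centralities computed from the World Input-Output Database; the sketch below shows only the basic computation on a tiny invented network with networkx, treating the network as undirected for simplicity (an assumption, not the paper's setup).

```python
import networkx as nx

# toy input-output network: nodes are country-sector pairs, edge weights are
# hypothetical intermediate-input flows between them
flows = [
    ("CHN_manufacturing", "USA_manufacturing", 120.0),
    ("CHN_manufacturing", "DEU_automotive",     80.0),
    ("USA_services",      "CHN_manufacturing",  60.0),
    ("DEU_automotive",    "USA_services",       40.0),
]

G = nx.Graph()
G.add_weighted_edges_from(flows)

# eigenvector centrality highlights sectors acting as hubs of the production
# network; the paper studies the distribution of these scores across sectors
centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node:22s} {score:.3f}")
```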

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 241
5223 A Survey of Key Challenges of Adopting Agile in Global Software Development: A Case Study with Malaysia Perspective

Authors: Amna Batool

Abstract:

Agile methodology is currently the most popular technique in software development projects. Agile methods in software development bring an optimistic impact on software performance, quality, and customer satisfaction. Some organizations and small-to-medium enterprises are adopting agile in their local software development projects as well as in distributed software development projects. Adopting agile methods in local software development projects is valuable; however, agile in global software development needs attention. There are key challenges in agile global software development that need to be resolved to improve global software development cycles. The proposed systematic literature review investigates all key challenges of agile in global software development. Moreover, a quantitative methodology (an actual survey) is used to present a real case scenario of these particular key challenges as faced by one software house, BestWeb Malaysia. The outcomes of the systematic literature review and the results of the quantitative methodology are compared with each other to evaluate whether the key challenges pointed out in the systematic review still exist. The proposed research and its exploratory results can assist small and medium enterprises in avoiding these challenges by adopting best practices in their global software development projects. It is also a useful, consolidated source of information for novice researchers.

Keywords: agile software development, ASD challenges, agile global software development, challenges in agile global software development

Procedia PDF Downloads 130
5222 Detecting Elderly Abuse in US Nursing Homes Using Machine Learning and Text Analytics

Authors: Minh Huynh, Aaron Heuser, Luke Patterson, Chris Zhang, Mason Miller, Daniel Wang, Sandeep Shetty, Mike Trinh, Abigail Miller, Adaeze Enekwechi, Tenille Daniels, Lu Huynh

Abstract:

Machine learning and text analytics have been used to analyze child abuse, cyberbullying, domestic abuse and domestic violence, and hate speech. However, to the authors' knowledge, no research to date has used these methods to study elder abuse in nursing homes or skilled nursing facilities from field inspection reports. We used machine learning and text analytics methods to analyze 356,000 inspection reports extracted from CMS Form-2567 field inspections of US nursing homes and skilled nursing facilities between 2016 and 2021. Our algorithm detected occurrences of the various types of abuse, including physical abuse, psychological abuse, verbal abuse, sexual abuse, and passive and active neglect. For example, to detect physical abuse, our algorithms search for combinations of phrases and words suggesting willful infliction of damage (hitting, pinching or burning, tethering, tying) or consciously ignoring an emergency. To detect occurrences of elder neglect, our algorithm looks for combinations of phrases and words suggesting both passive neglect (neglecting vital needs, allowing malnutrition and dehydration, allowing decubiti, deprivation of information, limitation of freedom, negligence toward safety precautions) and active neglect (intimidation and name-calling, tying the victim up to prevent falls without consent, consciously ignoring an emergency, not calling a physician in spite of indication, stopping important treatments, failure to provide essential care, deprivation of nourishment, leaving a person alone for an inappropriate amount of time, excessive demands in a situation of care). We further compare the prevalence of abuse before and after Covid-19-related restrictions on nursing home visits. We also identified the facilities with the highest number of abuse cases and no abuse-free facilities within a 25-mile radius as the most likely candidates for additional inspections. We also built an interactive display to visualize the locations of these facilities.
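
The abstract describes searching inspection text for combinations of phrases and words per abuse category; the sketch below is a generic keyword-matching illustration with invented patterns, not the authors' actual lexicon or algorithm.

```python
import re

# illustrative (not the authors') keyword patterns per abuse category
PATTERNS = {
    "physical_abuse": r"\b(hit|hitting|pinch\w*|burn\w*|tether\w*|tying|tied)\b",
    "passive_neglect": r"\b(malnutrition|dehydrat\w*|decubit\w*|unattended)\b",
    "active_neglect": r"\b(intimidat\w*|name-calling|ignored (an )?emergency)\b",
}

def flag_report(text):
    """Return the abuse categories whose patterns appear in an inspection report."""
    text = text.lower()
    return [cat for cat, pat in PATTERNS.items() if re.search(pat, text)]

report = ("Resident was found dehydrated and staff admitted to tying the "
          "resident to the bed rail to prevent falls.")
print(flag_report(report))   # ['physical_abuse', 'passive_neglect']
```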

Keywords: machine learning, text analytics, elder abuse, elder neglect, nursing home abuse

Procedia PDF Downloads 118
5221 Evaluating the Total Costs of a Ransomware-Resilient Architecture for Healthcare Systems

Authors: Sreejith Gopinath, Aspen Olmsted

Abstract:

This paper is based on our previous work that proposed a risk-transference-based architecture for healthcare systems to store sensitive data outside the system boundary, rendering the system unattractive to would-be bad actors. This architecture also allows a compromised system to be abandoned and a new system instance spun up in place to ensure business continuity without paying a ransom or engaging with a bad actor. This paper delves into the details of various attacks we simulated against the prototype system. In the paper, we discuss at length the time and computational costs associated with storing and retrieving data in the prototype system, abandoning a compromised system, and setting up a new instance with existing data. Lastly, we simulate some analytical workloads over the data stored in our specialized data storage system and discuss the time and computational costs associated with running analytics over data in a specialized storage system outside the system boundary. In summary, this paper discusses the total costs of data storage, access, and analytics incurred with the proposed architecture.

Keywords: cybersecurity, healthcare, ransomware, resilience, risk transference

Procedia PDF Downloads 111
5220 Models to Estimate Monthly Mean Daily Global Solar Radiation on a Horizontal Surface in Alexandria

Authors: Ahmed R. Abdelaziz, Zaki M. I. Osha

Abstract:

Solar radiation data are of great significance for solar energy system design. This study aims at developing and calibrating new empirical models for estimating monthly mean daily global solar radiation on a horizontal surface in Alexandria, Egypt. Calculated values of day length (hours), sun height, day number, and declination angle are used for this purpose. A comparison between measured and calculated values of solar radiation is carried out. It is shown that all the proposed correlations are able to predict the global solar radiation in Alexandria with excellent accuracy.
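
The abstract does not give the functional form of the correlations; the sketch below only illustrates the general workflow — fitting regression coefficients that map calculated astronomical predictors to monthly mean daily radiation — using made-up monthly values and a simple linear form standing in for the authors' models.

```python
import numpy as np

# hypothetical monthly values (12 months) of two of the predictors named above
day_length = np.array([10.4, 11.1, 12.0, 13.0, 13.8, 14.2,
                       14.0, 13.3, 12.4, 11.4, 10.6, 10.2])   # hours
declination = np.array([-20.9, -13.0, -2.4, 9.4, 18.8, 23.1,
                        21.2, 13.5, 2.2, -9.6, -18.9, -23.0])  # degrees
measured_H = np.array([11.5, 14.8, 18.6, 22.4, 25.1, 26.9,
                       26.3, 24.6, 21.2, 16.9, 12.8, 10.9])    # MJ/m^2/day

# illustrative linear model: H = a + b*day_length + c*declination
X = np.column_stack([np.ones_like(day_length), day_length, declination])
coeffs, *_ = np.linalg.lstsq(X, measured_H, rcond=None)
predicted_H = X @ coeffs

rmse = np.sqrt(np.mean((predicted_H - measured_H) ** 2))
print("coefficients:", np.round(coeffs, 3), "RMSE:", round(float(rmse), 3))
```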

Keywords: solar energy, global solar radiation, model, regression coefficient

Procedia PDF Downloads 369
5219 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data

Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton

Abstract:

The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding, as Industry 4.0 is originally a German strategy, with strong policy instruments utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: there exists an unresolved gap between the data science experts and the manufacturing process experts in industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.

Keywords: analytics, digitization, industry 4.0, manufacturing

Procedia PDF Downloads 82
5218 Climate Change, Global Warming and Future of Our Planet

Authors: Indu Gupta

Abstract:

Climate change and global warming are the most pressing issues for "our common future". In this common global interest, countries organize governmental and non-governmental conferences. Human beings are destroying the planet's non-renewable resources and polluting its renewable resources in pursuit of economic growth. Air pollution is mainly responsible for global warming and climate change. Due to global warming, ice glaciers are shrinking and melting, forests are shrinking, deserts are expanding, and soil is eroding. The stratospheric ozone layer, which protects us from harmful ultraviolet radiation, is being depleted, and a hole has formed in it. Extremely high temperatures in summer, extremely low temperatures and smog in winter, and floods in the rainy season are all indications of climate change. The level of carbon dioxide and other heat-trapping gases in the atmosphere is increasing rapidly. Nations are worried about environmental degradation.

Keywords: environmental degradation, global warming, soil erosion, ultraviolet radiation

Procedia PDF Downloads 339
5217 The Nexus between Climate Change and Criminality: The Nigerian Experience

Authors: Dagaci Aliyu Manbe, Anthony Abah Ebonyi

Abstract:

The increase in global temperatures is worsened by frequent natural events and human activities. Climate change has taken a prominent space in the global discourse on crime and criminality. Compared to when the subject centred on the depletion of the ozone layer and global warming, today the narrative revolves around the implications of changes in weather and climatic conditions in relation to violent crimes or conflict that traverse vast social, economic, and political spaces in different countries. Global warming and climate change refer to an increase in average global temperatures in the Earth's near-surface air and oceans, which occurs due to human activities such as deforestation, the burning of fossil fuels, and gas flaring. The trend is projected to continue if unchecked. This paper seeks to explore the nexus between climate change and criminality in Nigeria. It further examines the main ecological changes that shape the conflict dynamics and climate-driven security threats to peaceful co-existence in Nigeria. It concludes with some recommendations on the way forward.

Keywords: conflict, climate change, criminality, global warming, peace

Procedia PDF Downloads 141
5216 A Case Study: Teachers Education Program in a Global Context

Authors: In Hoi Lee, Seong Baeg Kim, Je Eung Jeon, Gwang Yong Choi, Joo Sub Lee, Ik Sang Kim

Abstract:

Recently, interest in globalization in the field of teacher education has increased. In the U.S., the government is trying to enhance the quality of education through a global approach to education. To do so, schools in the U.S. are recruiting teachers with global capability from countries like Korea, where competent teachers are being trained. Meanwhile, in the case of Korea, although excellent teachers are cultivated every year, a low birth rate makes it difficult for them to secure domestic teaching positions. To address the challenges the two countries are facing, the study first examines the demand for and necessity of globalization in the field of teacher education in Korea and the U.S. Second, we propose a new project, called the 'Global Teachers University (GTU)' program, to satisfy the demands of both countries. Finally, we provide its implications for building future educational cooperation for teacher training in a global context.

Keywords: educational cooperation, globalization, teachers education program, teacher training institutions

Procedia PDF Downloads 468
5215 Exploration of RFID in Healthcare: A Data Mining Approach

Authors: Shilpa Balan

Abstract:

Radio Frequency Identification, popularly known as RFID, is used to automatically identify and track tags attached to items. This study focuses on the application of RFID in healthcare. The adoption of RFID in healthcare is a crucial technology for patient safety and inventory management. Data from RFID tags are used to identify the locations of patients and inventory in real time. Medical errors are thought to be a prominent cause of loss of life and injury, and the major advantage of RFID application in the healthcare industry is the reduction of medical errors. The healthcare industry has generated huge amounts of data. By discovering patterns and trends within the data, big data analytics can help improve patient care and lower healthcare costs. The increasing number of research publications leading to innovations in RFID applications shows the importance of this technology. This study explores the current state of RFID research in healthcare using a text mining approach; no study has yet examined the current state of RFID research in healthcare using a data mining approach. Related articles on RFID were collected from healthcare journals and news articles, covering the years 2000 to 2015. Significant keywords on the topic of focus were identified and analyzed using open-source data analytics software such as RapidMiner. These analytical tools help extract pertinent information from massive volumes of data. The main benefits of adopting RFID technology in healthcare include tracking medicines and equipment, upholding patient safety, and improving security. The real-time tracking features of RFID allow for enhanced supply chain management. By using big data productively, healthcare organizations can gain significant benefits. Big data analytics in healthcare enables improved decisions by extracting insights from large volumes of data.
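
The abstract describes identifying and counting significant keywords across the collected articles (there with RapidMiner); a minimal keyword-frequency sketch in plain Python is shown below, with an invented toy corpus and keyword list standing in for the study's data.

```python
from collections import Counter
import re

# toy corpus standing in for the collected RFID-in-healthcare articles
articles = [
    "RFID tags improve inventory management and patient safety in hospitals.",
    "Real-time tracking of equipment with RFID reduces medical errors.",
    "RFID-based patient tracking supports supply chain management.",
]

KEYWORDS = ["patient safety", "inventory", "tracking", "medical errors",
            "supply chain"]

counts = Counter()
for text in articles:
    text = text.lower()
    for kw in KEYWORDS:
        counts[kw] += len(re.findall(re.escape(kw), text))

for kw, n in counts.most_common():
    print(f"{kw:15s} {n}")
```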

Keywords: RFID, data mining, data analysis, healthcare

Procedia PDF Downloads 197
5214 Data Mining in Healthcare for Predictive Analytics

Authors: Ruzanna Muradyan

Abstract:

Medical data mining is a crucial field in contemporary healthcare that offers cutting-edge tactics with enormous potential to transform patient care. This abstract examines how sophisticated data mining techniques could transform the healthcare industry, with a special focus on how they might improve patient outcomes. Healthcare data repositories have dynamically evolved, producing a rich tapestry of different, multi-dimensional information that includes genetic profiles, lifestyle markers, electronic health records, and more. By utilizing data mining techniques inside this vast library, a variety of prospects for precision medicine, predictive analytics, and insight production become visible. Predictive modeling for illness prediction, risk stratification, and therapy efficacy evaluations are important points of focus. Healthcare providers may use this abundance of data to tailor treatment plans, identify high-risk patient populations, and forecast disease trajectories by applying machine learning algorithms and predictive analytics. Better patient outcomes, more efficient use of resources, and early treatments are made possible by this proactive strategy. Furthermore, data mining techniques act as catalysts to reveal complex relationships between apparently unrelated data pieces, providing enhanced insights into the cause of disease, genetic susceptibilities, and environmental factors. Healthcare practitioners can get practical insights that guide disease prevention, customized patient counseling, and focused therapies by analyzing these associations. The abstract explores the problems and ethical issues that come with using data mining techniques in the healthcare industry. In order to properly use these approaches, it is essential to find a balance between data privacy, security issues, and the interpretability of complex models. Finally, this abstract demonstrates the revolutionary power of modern data mining methodologies in transforming the healthcare sector. Healthcare practitioners and researchers can uncover unique insights, enhance clinical decision-making, and ultimately elevate patient care to unprecedented levels of precision and efficacy by employing cutting-edge methodologies.

Keywords: data mining, healthcare, patient care, predictive analytics, precision medicine, electronic health records, machine learning, predictive modeling, disease prognosis, risk stratification, treatment efficacy, genetic profiles, precision health

Procedia PDF Downloads 31
5213 Competitive DNA Calibrators as Quality Reference Standards (QRS™) for Germline and Somatic Copy Number Variations/Variant Allelic Frequencies Analyses

Authors: Eirini Konstanta, Cedric Gouedard, Aggeliki Delimitsou, Stefania Patera, Samuel Murray

Abstract:

Introduction: Quality reference DNA standards (QRS) for molecular testing by next-generation sequencing (NGS) are essential for accurate quantitation of copy number variations (CNV) in germline analyses and of variant allelic frequencies (VAF) in somatic analyses. Objectives: Presently, several molecular analytics for oncology patients rely upon quantitative metrics. Test validation and standardisation are also reliant upon the availability of surrogate control materials that allow test LOD (limit of detection), sensitivity, and specificity to be understood. We have developed a dual calibration platform allowing QRS pairs to be included in analysed DNA samples, enabling accurate quantitation of CNV and VAF metrics within and between patient samples. Methods: QRS™ blocks up to 500 nt were designed for common NGS panel targets, incorporating ≥ 2 identification tags (IDTDNA.com). These were analysed upon spiking into gDNA, somatic DNA, and ctDNA using a proprietary CalSuite™ platform adaptable to common LIMS. Results: We demonstrate QRS™ calibration reproducibility spiked to 5–25% at ± 2.5% in gDNA and ctDNA. Furthermore, we demonstrate CNV and VAF within and between samples (gDNA and ctDNA) with the same reproducibility (± 2.5%) in clinical samples of lung cancer and HBOC (EGFR and BRCA1, respectively). CNV analysis was performed with similar accuracy using a single pair of QRS calibrators as when using multiple single-target sequencing controls. Conclusion: Dual paired QRS™ calibrators allow accurate and reproducible quantitative analyses of CNV, VAF, intrinsic sample allele measurement, and inter- and intra-sample measurement, not only simplifying NGS analytics but also allowing clinically relevant biomarker VAF to be monitored across patient ctDNA samples with improved accuracy.
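
As a generic illustration only (not the authors' proprietary CalSuite™ method), the sketch below shows how a VAF is computed from read counts and how a spiked-in calibrator of known fraction could be used for a simple linear correction; all names and numbers are hypothetical.

```python
def vaf(alt_reads, ref_reads):
    """Variant allelic frequency: fraction of reads carrying the variant."""
    return alt_reads / (alt_reads + ref_reads)

def calibrated_vaf(measured_vaf, calibrator_measured, calibrator_expected):
    """Rescale a measured VAF by the ratio observed for a spiked-in calibrator
    of known expected fraction (a simple, illustrative linear correction)."""
    return measured_vaf * (calibrator_expected / calibrator_measured)

# hypothetical numbers: calibrator spiked at 10% but measured at 8.5%
cal_measured = vaf(85, 915)          # 0.085
sample_measured = vaf(40, 960)       # 0.040
print(round(calibrated_vaf(sample_measured, cal_measured, 0.10), 4))  # ~0.047
```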

Keywords: calibrator, CNV, gene copy number, VAF

Procedia PDF Downloads 124
5212 Thick Data Analytics for Learning Cataract Severity: A Triplet Loss Siamese Neural Network Model

Authors: Jinan Fiaidhi, Sabah Mohammed

Abstract:

Diagnosing cataract severity is an important factor in deciding whether to undertake surgery. It is usually done by an ophthalmologist or by taking a variety of fundus photographs that must then be examined by the ophthalmologist. This paper investigates a Siamese neural network that can be trained with small anchor samples to score cataract severity. The model is based on a triplet loss function that uses the ophthalmologist's best experience in rating positive and negative anchors against a specific cataract scaling system. This approach, which incorporates the ophthalmologist's heuristics, is generally called the thick data approach, a kind of machine learning that learns from a few shots. Clinical Relevance: The lens of the eye is mostly made up of water and proteins. A cataract occurs when these proteins start to clump together and block light, causing impaired vision. This research aims at employing thick data machine learning techniques to rate the severity of cataracts using a Siamese neural network.
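
The abstract's central ingredient is the triplet loss; a minimal numeric sketch of it is given below, with toy embeddings standing in for the output of a (hypothetical) Siamese encoder.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """L = max(0, d(a, p) - d(a, n) + margin), with Euclidean distances.
    Pulls same-severity embeddings together and pushes different ones apart."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

# toy 4-d embeddings produced by a hypothetical Siamese encoder
anchor   = np.array([0.2, 0.8, 0.1, 0.4])      # a grade-2 cataract image
positive = np.array([0.25, 0.75, 0.15, 0.35])  # another grade-2 image
negative = np.array([0.9, 0.1, 0.8, 0.7])      # a clear-lens image
print(triplet_loss(anchor, positive, negative))  # 0.0: margin already satisfied
```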

Keywords: thick data analytics, siamese neural network, triplet-loss model, few shot learning

Procedia PDF Downloads 68
5211 Advancing in Cricket Analytics: Novel Approaches for Pitch and Ball Detection Employing OpenCV and YOLOV8

Authors: Pratham Madnur, Prathamkumar Shetty, Sneha Varur, Gouri Parashetti

Abstract:

In order to overcome conventional obstacles, this research paper investigates novel approaches to cricket pitch and ball detection that make use of cutting-edge technologies. The research integrates OpenCV for pitch inspection and modifies the YOLOv8 model for cricket ball detection in order to overcome the shortcomings of manual pitch assessment and traditional ball-detection techniques. To ensure flexibility in a range of pitch environments, the pitch detection method leverages OpenCV's color-space transformation, contour extraction, and precise color-range definition. Regarding ball detection, the YOLOv8 model emphasizes the preservation of small object details to improve accuracy and is specifically trained on the unique properties of cricket balls. The methods are made more reliable by careful preparation of the datasets, which include novel ball and pitch information. These cutting-edge methods not only improve cricket analytics but also set the stage for flexible methods in broader sports technology applications.
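
The abstract names OpenCV colour-space transformation, colour-range definition, and contour extraction for pitch inspection; the sketch below shows that generic pipeline with illustrative HSV bounds that would need tuning per venue. The bounds are not the paper's values, and the YOLOv8 ball-detection side is not shown.

```python
import cv2
import numpy as np

# load a frame; the HSV bounds below are illustrative, not the paper's values
frame = cv2.imread("frame.jpg")
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# rough colour range for a brownish pitch strip (tune per venue and lighting)
lower = np.array([10, 40, 80])
upper = np.array([30, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# keep the largest contour as the candidate pitch region
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    pitch = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(pitch)
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imwrite("pitch_detected.jpg", frame)
```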

Keywords: OpenCV, YOLOv8, cricket, custom dataset, computer vision, sports

Procedia PDF Downloads 37
5210 Analysis of Global Social Responsibilities of Social Studies Pre-Service Teachers Based on Several Variables

Authors: Zafer Cakmak, Birol Bulut, Cengiz Taskiran

Abstract:

Technological advances, a shrinking world, and an increasing world population increase our interdependence with individuals whom we may never meet face to face. It is impossible for modern individuals to escape global developments and their impact, and it is very unlikely that global societies will turn back from the path they are on. These effects of globalization in fact place a certain burden of responsibility on humankind. We accept these responsibilities because we desire a better future, a habitable world, and a more peaceful life. In the present study, the global responsibility levels of the participants were measured, and the significance of the reactions that individuals have to develop on global issues was reinterpreted in light of the existing literature. The study was conducted with the general survey model, one of the survey methodologies. General survey models are surveys conducted on an entire universe, or on a group, sample, or sampling drawn from it, to arrive at conclusions about a universe that includes a large number of elements. The study was conducted with data obtained from 350 pre-service teachers enrolled in the 2016 spring semester to determine the 'Global Social Responsibility' levels of social studies pre-service teachers based on several variables. The collected data were analyzed using SPSS 21.0 software; t-tests and ANOVA were utilized in the data analysis.
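
The analysis used t-tests and ANOVA in SPSS 21.0; the sketch below reproduces the same style of tests in Python with scipy on simulated, hypothetical score data (the grouping variables and values are invented).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# hypothetical global-social-responsibility scores by gender and by class year
female = rng.normal(4.1, 0.5, 180)
male = rng.normal(3.9, 0.5, 170)
year1, year2, year3 = (rng.normal(m, 0.5, 115) for m in (3.9, 4.0, 4.1))

t, p_t = stats.ttest_ind(female, male)            # two-group comparison
f, p_f = stats.f_oneway(year1, year2, year3)      # one-way ANOVA across years

print(f"t = {t:.2f}, p = {p_t:.3f};  F = {f:.2f}, p = {p_f:.3f}")
```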

Keywords: social studies, globalization, global social responsibility, education

Procedia PDF Downloads 370
5209 Impacts on Atmospheric Mercury from Changes in Climate, Land Use, Land Cover, and Wildfires

Authors: Shiliang Wu, Huanxin Zhang, Aditya Kumar

Abstract:

There have been increasing concerns about atmospheric mercury as a toxic and bioaccumulative pollutant in the global environment. Global change, including changes in climate, land use, land cover, and wildfire activity, can have significant impacts on atmospheric mercury. In this study, we use a global chemical transport model (GEOS-Chem) to examine the potential impacts of global change on atmospheric mercury. All of these factors are found to have significant impacts on the long-term evolution of atmospheric mercury and can substantially alter the global source-receptor relationships for mercury. We also estimate the global Hg emissions from wildfires for the present day and the potential impacts of the 2000-2050 changes in climate, land use and land cover, and Hg anthropogenic emissions by combining statistical analysis with global data on vegetation type and coverage as well as fire activity. Present global Hg wildfire emissions are estimated to be 612 Mg year-1. Africa is the dominant source region (43.8% of global emissions), followed by Eurasia (31%) and South America (16.6%). We find significant perturbations to wildfire emissions of Hg in the context of global change, driven by the projected changes in climate, land use and land cover, and Hg anthropogenic emissions. 2000-2050 climate change could increase Hg emissions by 14% globally. Projected changes in land use by 2050 could decrease global Hg emissions from wildfires by 13%, mainly driven by a decline in African emissions due to significant agricultural land expansion. Future land cover changes could lead to significant increases in Hg emissions over some regions (+32% North America, +14% Africa, +13% Eurasia). Potential enrichment of terrestrial ecosystems in 2050 in response to changes in Hg anthropogenic emissions could increase Hg wildfire emissions both globally (+28%) and regionally. Our results indicate that the future evolution of climate, land use and land cover, and Hg anthropogenic emissions are all important factors affecting Hg wildfire emissions in the coming decades.

Keywords: climate change, land use, land cover, wildfires

Procedia PDF Downloads 297
5208 Adoption of Big Data by Global Chemical Industries

Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta

Abstract:

The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Despite the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science factors that influence BD in chemical industries, to help the industry move towards intelligent manufacturing quickly and reliably. The article utilizes a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.

Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science

Procedia PDF Downloads 55
5207 Studying the Spatial Aspects of Visual Attention Processing in Global Precedence Paradigm

Authors: Shreya Borthakur, Aastha Vartak

Abstract:

This behavioral experiment aimed to investigate the global precedence phenomenon in a South Asian sample and its correlation with mobile screen time. The global precedence effect refers to the tendency to process overall structure before attending to specific details. Participants completed attention tasks involving global and local stimuli with varying consistencies. The results showed a tendency towards local precedence, but no significant differences in reaction times were found between consistency levels or attention conditions. However, the correlation analysis revealed that participants with higher screen time exhibited a stronger negative correlation with local attention, suggesting that excessive screen usage may impact perceptual organization. Further research is needed to explore this relationship and understand the influence of screen time on cognitive processing.

Keywords: global precedence, visual attention, perceptual organization, screen time, cognition

Procedia PDF Downloads 38
5206 Shifting of Global Energy Security: A Comparative Analysis of Indonesia and China’s Renewable Energy Policies

Authors: Widhi Hanantyo Suryadinata

Abstract:

Efforts undertaken by Indonesia and China to shift renewable energy strategies and security on the global stage involve policy construction related to rare-mineral processing and value-adding in Indonesia, and manufacturing policy through the New Energy Vehicles (NEVs) policy in China. Both policies encompass several practical regulations and policies that can be utilized to implement Indonesia's and China's grand efforts and ideas. Policy development in Indonesia and China can be analyzed using a comparative analysis method, employing a pyramid illustration to identify policy construction phases based on the real conditions of the domestic market and the policies implemented. This approach also helps to identify the policy integration needed to advance a country's policy development phase within the pyramid. It also emphasizes the significance of integration policy in redefining renewable energy strategy and security on the global stage.

Keywords: global renewable energy security, global energy security, policy development, comparative analysis, shifting of global energy security, Indonesia, China

Procedia PDF Downloads 32
5205 Global Learning Supports Global Readiness with Projects with Purpose

Authors: Brian Bilich

Abstract:

A typical global learning program is a two-week, project-based, culturally immersive, and academically relevant experience built around a project with purpose and catered to student and business groups. Global Learning in Continuing Education at Austin Community College promotes global readiness through projects with purpose, with special attention given to balancing learning, hospitality, and travel. A recent project involved Community First! Village, a 51-acre planned community that provides affordable, permanent housing for men and women coming out of chronic homelessness. Global Learning students collaborated with residents and staff at the Community First! Village on a project to produce two-dimensional remodeling plans of residents' tiny homes, with a focus on (but not limited to) design improvements related to accessibility, increased usability of living and storage space, and aesthetic upgrades to boost psychological and emotional appeal. The goal of project-based learning in the context of Global Learning in Continuing Education at Austin Community College is twofold. First, in rapid fashion we develop a project that gives the learner a hands-on opportunity to exercise soft and technical skills, like creativity, communication, and analytical thinking. Second, by basing projects on global social-conflict issues, the project with purpose promotes the development of empathy for other people and fosters a sense of corporate social responsibility in future generations of business leadership. In the example provided above, the project informed the student group on the topic of chronic homelessness and promoted awareness of and empathy for this underserved segment of the community. Project-based global learning built around projects with purpose has the potential to cultivate global readiness by developing empathy and strengthening emotional intelligence in future generations.

Keywords: project-based learning, global learning, global readiness, globalization, international exchange, collaboration

Procedia PDF Downloads 30
5204 Intrusion Detection Based on Graph Oriented Big Data Analytics

Authors: Ahlem Abid, Farah Jemili

Abstract:

Intrusion detection has been the subject of numerous studies in industry and academia, but cyber security analysts always want greater precision and global threat analysis to secure their systems in cyberspace. To improve intrusion detection systems, visualising security events in the form of graphs and diagrams is important for improving the accuracy of alerts. In this paper, we propose an IDS approach based on cloud computing, big data techniques, and a graph-based machine learning algorithm that can detect different attacks in real time, as early as possible. We use the MAWILab intrusion detection dataset. We choose Microsoft Azure as a unified cloud environment on which to load our dataset. We implement the K2 algorithm, a graph-based machine learning algorithm, to classify attacks. Our system showed good performance thanks to the graph-based machine learning algorithm and the Spark Structured Streaming engine.
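
The abstract mentions Spark Structured Streaming on Microsoft Azure with a K2 graph algorithm for classification; the sketch below shows only a minimal Structured Streaming skeleton that filters and counts suspicious events from a socket source. The K2 classifier, the MAWILab schema, and the Azure deployment are not represented.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("ids-stream").getOrCreate()

# each incoming line is one pre-parsed event record; "value" is the default column
events = (spark.readStream.format("socket")
          .option("host", "localhost").option("port", 9999).load())

# stand-in for the classifier: flag lines already labelled anomalous upstream
alerts = events.filter(col("value").contains("anomalous"))

# running count of alerts, printed to the console sink
query = (alerts.groupBy().count()
         .writeStream.outputMode("complete").format("console").start())
query.awaitTermination()
```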

Keywords: Apache Spark Streaming, Graph, Intrusion detection, k2 algorithm, Machine Learning, MAWILab, Microsoft Azure Cloud

Procedia PDF Downloads 115
5201 FreGsd: A Framework for Global Software Requirement Engineering

Authors: Alsahli Abdulaziz Abdullah, Hameed Ullah Khan

Abstract:

Software development nowadays increasingly uses global ways of working instead of the conventional setting in which development occurs in one location. This paper aims to propose a requirement engineering framework to support the global software development environment with regard to all requirement engineering activities, from elicitation to finally managing requirement change. The global software environment is gaining an ever-better reputation in software development, with better quality resulting from developing in this environment, and at lower cost. However, the failure rate when developing in this environment is high due to inappropriate requirement development and management. This paper adds to the software engineering development environments discipline, and many developers in GSD will benefit from it.

Keywords: global software development environment, GSD, requirement engineering, FreGsd, computer engineering

Procedia PDF Downloads 507
5202 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway and presents the design of a framework for the data privacy model and a data analytics framework for real-time analysis using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, the data enjoys stronger privacy, and that a fog storage gateway has several advantages over traditional cloud storage: our results show that fog offers reduced latency/delay, lower bandwidth consumption, and lower energy usage compared with cloud storage, so fog storage helps to reduce excessive cost. The paper dwells on the system description; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. The paper also presents the major system components and their framework specifications. Lastly, the overall research system architecture is shown, along with its structure and interrelationships.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 69
5201 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties, such as structured, semi-structured, and unstructured data. Despite this complexity, if the trends and patterns that exist within the big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System (HDFS), which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis is presented of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
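
As a small illustration of the MapReduce model mentioned above (not a specific pipeline from the paper), the two scripts below count records per diagnosis code in Hadoop Streaming style; the comma-separated field layout is hypothetical. They would be run through the Hadoop Streaming jar with the usual -input/-output/-mapper/-reducer options.

```python
#!/usr/bin/env python3
"""mapper.py -- emits (diagnosis_code, 1) for each record line."""
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split(",")
    if len(fields) >= 3:
        diagnosis = fields[2]          # e.g. an ICD code column (assumed layout)
        print(f"{diagnosis}\t1")
```

```python
#!/usr/bin/env python3
"""reducer.py -- sums counts per diagnosis (input arrives sorted by key)."""
import sys

current, total = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = key, 0
    total += int(value)
if current is not None:
    print(f"{current}\t{total}")
```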

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 381
5200 Predictive Analytics in Traffic Flow Management: Integrating Temporal Dynamics and Traffic Characteristics to Estimate Travel Time

Authors: Maria Ezziani, Rabie Zine, Amine Amar, Ilhame Kissani

Abstract:

This paper introduces a predictive model for urban transportation engineering, which is vital for efficient traffic management. Utilizing comprehensive datasets and advanced statistical techniques, the model accurately forecasts travel times by considering temporal variations and traffic dynamics. Machine learning algorithms, including regression trees and neural networks, are employed to capture sequential dependencies. Results indicate significant improvements in predictive accuracy, particularly during peak hours and holidays, with the incorporation of traffic flow and speed variables. Future enhancements may integrate weather conditions and traffic incidents. The model's applications range from adaptive traffic management systems to route optimization algorithms, facilitating congestion reduction and enhancing journey reliability. Overall, this research extends beyond travel time estimation, offering insights into broader transportation planning and policy-making realms, empowering stakeholders to optimize infrastructure utilization and improve network efficiency.
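
The abstract names regression trees among the algorithms used; the sketch below trains a decision-tree regressor on simulated data with hypothetical features (hour of day, holiday flag, flow, speed) purely to illustrate the setup. The data, features, and accuracy are not the paper's.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
n = 500

# hypothetical features: hour of day, holiday flag, flow (veh/h), mean speed (km/h)
hour = rng.integers(0, 24, n)
holiday = rng.integers(0, 2, n)
flow = rng.normal(1200, 300, n)
speed = rng.normal(45, 10, n)
X = np.column_stack([hour, holiday, flow, speed])

# synthetic travel time (minutes) with peak-hour and congestion effects
peak = ((hour >= 7) & (hour <= 9)) | ((hour >= 16) & (hour <= 19))
y = 20 + 8 * peak + 0.005 * flow - 0.15 * speed + rng.normal(0, 2, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = DecisionTreeRegressor(max_depth=5, random_state=0).fit(X_tr, y_tr)
print("MAE (min):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```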

Keywords: predictive analytics, traffic flow, travel time estimation, urban transportation, machine learning, traffic management

Procedia PDF Downloads 33
5199 3D Printing: Rebounding from Global Supply Chain Disruption Due to Natural Disaster

Authors: Gurjinder Singh, Jasmeen Kaur, Mukul Dhiman

Abstract:

This paper describes the significance of 3D printing for supply chain management in scenarios where the global supply chain is disrupted. Furthermore, the development and implementation of supply chain strategies in the context of 3D printing technology are framed to make an organization's supply chain resilient to disruption caused by natural disasters.

Keywords: 3D printing, global supply chain, supply chain management, supply chain strategies

Procedia PDF Downloads 450