Search results for: data space connector
26579 An Approach for Multilayered Ecological Networks
Authors: N. F. F. Ebecken, G. C. Pereira
Abstract:
Although networks provide a powerful approach to the study of a wide variety of ecological systems, their formulation usually does not include various types of interactions, interactions that vary in space and time, or interconnected systems of networks. The emerging field of 'multilayer networks' provides a natural framework for extending ecological systems analysis to include these multiple layers of complexity, as it specifically allows for differentiation and modeling of intralayer and interlayer connectivity. The framework provides a set of concepts and tools that can be adapted and applied to ecology, facilitating research on high-dimensional, heterogeneous systems in nature. Here, ecological multilayer networks are formally defined based on a review of prior and related approaches, their application and potential are illustrated with existing data analyses, and limitations, challenges, and future applications are discussed. The integration of multilayer network theory into ecology offers a largely untapped potential to further address ecological complexity and to provide new theoretical and empirical insights into the architecture and dynamics of ecological systems.
Keywords: ecological networks, multilayered networks, sea ecology, Brazilian Coastal Area
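To make the intralayer/interlayer distinction concrete, the minimal sketch below builds a small two-layer ecological network with NetworkX, encoding each node as a (species, layer) pair. The species names, layers, and edges are hypothetical and only illustrate the data structure, not the networks analyzed in the study.

```python
import networkx as nx

# A minimal two-layer ecological network: nodes are (species, layer) pairs.
# Intralayer edges connect species within one interaction type (layer);
# interlayer edges couple the same species across layers.
G = nx.Graph()

layers = {
    "herbivory": [("plant_A", "insect_1"), ("plant_B", "insect_2")],
    "parasitism": [("insect_1", "wasp_1"), ("insect_2", "wasp_1")],
}

# Intralayer edges
for layer, edges in layers.items():
    for u, v in edges:
        G.add_edge((u, layer), (v, layer), kind="intralayer")

# Interlayer edges: the same insect participates in both layers
for sp in ["insect_1", "insect_2"]:
    G.add_edge((sp, "herbivory"), (sp, "parasitism"), kind="interlayer")

# Simple multilayer descriptors: per-node degree and overall connectivity
for node in sorted(G.nodes):
    print(node, "degree:", G.degree(node))
print("connected:", nx.is_connected(G))
```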
Procedia PDF Downloads 162
26578 Research on Urban Design Method of Ancient City Guided by Catalyst Theory
Authors: Wang Zhiwei, Wang Weiwu
Abstract:
The process of urbanization in China has entered a critical period of transformation from urban expansion and construction to delicate urban design, forming a new direction in the field of urban design. Catalyst theory has become a prominent guiding strategy in urban planning and design. In this paper, against the background of urban renewal, catalyst theory is taken as the guiding ideology to explore a method of urban design for Shouxian County. Firstly, the study briefly introduces and analyzes catalyst theory. Through field investigation, it is found that the city has a large number of idle spaces, such as abandoned factories and schools. In the design, the idle spaces in the county town are utilized and interlinked spatially, and functional interaction is carried out based on the pattern of the county town. On the one hand, the results show that catalyst theory can enhance the vitality of the linear street space with a small amount of monomer construction. On the other hand, the city can also gain cultural and economic sites without damaging the historical relics or the character of the ancient city, thereby improving the quality of life of citizens. The urban micro-transformation represented by catalyst theory can help ancient cities like Shouxian to activate the old city and realize gradual development.
Keywords: catalytic theory, urban design, China's ancient city, Renaissance
Procedia PDF Downloads 128
26577 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises. Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (quantitative and qualitative analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
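The abstract does not describe its AI pipeline in detail; as one hedged illustration of the duplicate-record reduction it reports, the sketch below matches master records by fuzzy name/city similarity using only the Python standard library. The field names, weights, and threshold are assumptions, not the authors' configuration.

```python
from difflib import SequenceMatcher
from itertools import combinations

records = [
    {"id": 1, "name": "Acme Industries Ltd", "city": "Berlin"},
    {"id": 2, "name": "ACME Industries Limited", "city": "Berlin"},
    {"id": 3, "name": "Globex Corporation", "city": "Madrid"},
]

def similarity(a: str, b: str) -> float:
    """Normalized string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_duplicates(recs, threshold=0.85):
    """Flag record pairs whose weighted name/city similarity exceeds the threshold."""
    pairs = []
    for r1, r2 in combinations(recs, 2):
        score = 0.7 * similarity(r1["name"], r2["name"]) + 0.3 * similarity(r1["city"], r2["city"])
        if score >= threshold:
            pairs.append((r1["id"], r2["id"], round(score, 3)))
    return pairs

print(likely_duplicates(records))  # e.g. [(1, 2, 0.9...)]
```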
Procedia PDF Downloads 23
26576 The Development of the Spatial and Hierarchic Urban Structure of the Ultra-Orthodox Jewish Population in Israel
Authors: Lee Cahaner, Nissim Leon
Abstract:
The segregation of populations is one of the main axes in the research of urban geography, which refers to the spatial and functional relationships between settlements. In Israel, this phenomenon has its unique expression in the spatial processes concerning the ultra-orthodox population. This population maintains a set of interactions within itself as well as with the surrounding non-orthodox population, driven by historical and contemporary motivations whose strength depends on its homogeneity and separation. Its demographic growth rate and the internal social processes that ultra-orthodox society undergoes create a new image of ultra-orthodox concentration and its location in Israeli space. The goals of the present study have been defined with the express intention of filling the scholarly vacuum noted above: firstly, to discuss the development of the Israeli ultra-Orthodox sector’s hierarchical and spatial structure as of 2015, in light of the principles and mechanisms that guide it and vis-à-vis the general population’s hierarchical locality system; secondly, to map Israel’s ultra-Orthodox population, with attention to its physical boundaries, its subdivisions (Hassidic, Lithuanian, Sephardic), and the geographical and demographic processes that have characterized it in recent years; and thirdly, to shed light on the interactions between ultra-Orthodox localities via several different parameters, e.g., migration, education, transportation, employment, consumerism, and community services. In order to understand the changes in ultra-Orthodox geographic distribution and the social processes that these changes have generated, a number of research activities were conducted during the course of this study: gathering and assembling material from earlier academic studies, newspaper advertisements, and state and private archives; in-depth interviews with major figures in the ultra-Orthodox community and others who come into contact with it; tours of the core areas of ultra-Orthodox settlement; and gathering quantitative and qualitative data from the statistical reports of governmental and other bodies. In addition, a multi-participant (2,400-respondent) quantitative survey was conducted among residents of the new ultra-Orthodox cities, designed to elucidate the attributes and spatial attitudes of the residents as a means of tracing and understanding this new settlement pattern within ultra-Orthodox space. A major portion of the quantitative and qualitative material was processed to form a system of maps that visually describe the distribution of Israel’s ultra-Orthodox population.
Keywords: migration, new cities, segregation, ultra-orthodox
Procedia PDF Downloads 405
26575 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by the EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time, while some others, such as Spain, do not consider these data as such but have introduced some specifically focused regulations on this type of data and their access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and on the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although it is true that some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 157
26574 Clouds Influence on Atmospheric Ozone from GOME-2 Satellite Measurements
Authors: S. M. Samkeyat Shohan
Abstract:
This study is mainly focused on the determination and analysis of the photolysis rate of atmospheric, specifically tropospheric, ozone as a function of cloud properties throughout the year 2007. The observational basis for ozone concentrations and cloud properties is the measurement data set of the Global Ozone Monitoring Experiment-2 (GOME-2) sensor on board the polar orbiting Metop-A satellite. Two different spectral ranges are used; ozone total columns are calculated from the wavelength window 325 – 335 nm, while cloud properties, such as cloud top height (CTH) and cloud optical thickness (COT), are derived from the absorption band of molecular oxygen centered at 761 nm. Cloud fraction (CF) is derived from measurements in the ultraviolet, visible, and near-infrared range of GOME-2. First, ozone concentrations above clouds are derived from ozone total columns, subtracting the contribution of stratospheric ozone and filtering out those satellite measurements which have thin and low clouds. Then, the values of ozone photolysis derived from observations are compared with theoretically modeled results, in the latitudinal belts 5˚N – 5˚S and 20˚N – 20˚S, as a function of CF and COT. In general, good agreement is found between the data and the model, proving both the quality of the space-borne ozone and cloud properties as well as the modeling theory of the ozone photolysis rate. The discrepancies found can, however, amount to approximately 15%. Latitudinal seasonal changes of the photolysis rate of ozone are found to be negatively correlated with changes in upper-tropospheric ozone concentrations only in the autumn and summer months within the northern and southern tropical belts, respectively. This fact points to the entangled roles of temperature and nitrogen oxides in ozone production, which are superimposed on its sole photolysis induced by thick and high clouds in the tropics.
Keywords: cloud properties, photolysis rate, stratospheric ozone, tropospheric ozone
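For readers unfamiliar with the quantity being compared, the photolysis (photodissociation) rate coefficient J commonly used in such modeling is the wavelength integral of the absorption cross-section, quantum yield, and actinic flux, with clouds entering through their effect on the actinic flux F. This is the standard textbook form, not a formula quoted from the paper:

J_{O_3} = \int \sigma(\lambda, T)\, \Phi(\lambda, T)\, F(\lambda)\, d\lambda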
Procedia PDF Downloads 215
26573 Continuous Measurement of Spatial Exposure Based on Visual Perception in Three-Dimensional Space
Authors: Nanjiang Chen
Abstract:
In the backdrop of expanding urban landscapes, accurately assessing spatial openness is critical. Traditional visibility analysis methods grapple with discretization errors and inefficiencies, creating a gap in truly capturing the human experience of space. Addressing these gaps, this paper introduces a distinct continuous visibility algorithm, a leap in measuring urban spaces from a human-centric perspective. This study presents a methodological breakthrough by applying this algorithm to urban visibility analysis. Unlike conventional approaches, this technique allows for a continuous range of visibility assessment, closely mirroring human visual perception. By eliminating the need for predefined subdivisions in ray casting, it offers a more accurate and efficient tool for urban planners and architects. The proposed algorithm not only reduces computational errors but also demonstrates faster processing capabilities, validated through a case study in Beijing's urban setting. Its key distinction lies in its potential to benefit a broad spectrum of stakeholders, ranging from urban developers to public policymakers, aiding in the creation of urban spaces that prioritize visual openness and quality of life. This advancement in urban analysis methods could lead to more inclusive, comfortable, and well-integrated urban environments, enhancing the spatial experience for communities worldwide.
Keywords: visual openness, spatial continuity, ray-tracing algorithms, urban computation
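As a point of reference for the discretization the paper seeks to eliminate, the sketch below implements the conventional approach: visibility from a viewpoint is sampled along rays at fixed angular steps against obstacle segments. The geometry, step count, and openness measure are illustrative assumptions; the paper's continuous algorithm avoids this predefined subdivision.

```python
import math

def ray_segment_distance(origin, angle, seg, max_range=100.0):
    """Distance from origin to segment seg along the ray at 'angle', or max_range if no hit."""
    (x1, y1), (x2, y2) = seg
    ox, oy = origin
    dx, dy = math.cos(angle), math.sin(angle)
    ex, ey = x2 - x1, y2 - y1
    denom = dx * ey - dy * ex
    if abs(denom) < 1e-12:                               # ray parallel to segment
        return max_range
    t = ((x1 - ox) * ey - (y1 - oy) * ex) / denom        # distance along the ray
    u = ((x1 - ox) * dy - (y1 - oy) * dx) / denom        # position along the segment
    if t >= 0 and 0.0 <= u <= 1.0:
        return min(t, max_range)
    return max_range

def discretized_openness(origin, obstacles, n_rays=360, max_range=100.0):
    """Mean visible distance over n_rays equally spaced directions (the discretized baseline)."""
    total = 0.0
    for k in range(n_rays):
        angle = 2 * math.pi * k / n_rays
        total += min(ray_segment_distance(origin, angle, seg, max_range) for seg in obstacles)
    return total / n_rays

# Hypothetical viewpoint surrounded by two building facades
obstacles = [((5, -5), (5, 5)), ((-8, 3), (2, 8))]
print(round(discretized_openness((0, 0), obstacles), 2))
```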
Procedia PDF Downloads 52
26572 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps required to be undertaken by the government towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the different issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required towards the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 345
26571 Applications of Space Technology in Flood Risk Mapping in Parts of Haryana State, India
Authors: B. S. Chaudhary
Abstract:
The severity and frequency of different disasters across the globe are increasing in recent years. India is also facing disasters in the form of drought, cyclones, earthquakes, landslides, and floods. One of the major causes of disasters in northern India is flooding. There are great losses and extensive damage to agricultural crops, property, and human and animal life. This is causing environmental imbalances in places. The annual global figures for losses due to floods run to over 2 billion dollars. India is a vast country with wide variations in climate and topography. Due to widespread and heavy rainfall during the monsoon months, floods of varying magnitude occur all over the country from June to September. The magnitude depends upon the intensity of rainfall, its duration, and also the ground conditions at the time of rainfall. Haryana, one of the agriculturally dominated northern states, is also suffering from a number of disasters such as floods, desertification, soil erosion, and land degradation. Earthquakes also occur frequently but are of small magnitude, so they are not causing much concern and damage. Most of the damage in Haryana is due to floods. Floods in Haryana have occurred in 1978, 1988, 1993, 1995, 1998, and 2010, to mention a few. The present paper deals with Remote Sensing and GIS applications in preparing flood risk maps in parts of Haryana State, India. Satellite data of various years have been used for mapping of flood-affected areas. The flooded areas have been interpreted both visually and digitally, and two classes, flooded and receded water/wet areas, have been identified for each year. These have been analyzed in a GIS environment to prepare the risk maps, which show the areas of high, moderate, and low risk depending on the frequency of flooding witnessed. The floods leave a trail of suffering in the form of unhygienic conditions due to improper sanitation, water logging, filth littered in the area, degradation of materials, and unsafe drinking water, making people prone to many types of diseases in the short and long run. Attempts have also been made to enumerate the causes of floods. Suggestions are given for mitigating the fury of floods and for proper management of issues related to evacuation and safe places nearby.
Keywords: flood mapping, GIS, Haryana, India, remote sensing, space technology
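The risk-map logic described, classifying areas by how often they were flooded across the mapped years, can be sketched with NumPy on binary flood layers. The class breaks and the toy 3x3 grids below are assumptions for illustration, not the study's actual GIS data.

```python
import numpy as np

# Binary flood extent layers for several years (1 = flooded pixel), e.g. from classified imagery.
years = {
    1993: np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]]),
    1995: np.array([[0, 1, 1], [1, 1, 0], [0, 0, 0]]),
    1998: np.array([[0, 0, 1], [0, 1, 0], [0, 0, 0]]),
    2010: np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]]),
}

frequency = np.sum(list(years.values()), axis=0)   # how many times each pixel was flooded

# Illustrative class breaks: 0 = none, 1 = low, 2-3 = moderate, 4+ = high risk
risk = np.select(
    [frequency == 0, frequency == 1, frequency <= 3, frequency >= 4],
    ["none", "low", "moderate", "high"],
)
print(frequency)
print(risk)
```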
Procedia PDF Downloads 211
26570 Genetic Algorithms for Feature Generation in the Context of Audio Classification
Authors: José A. Menezes, Giordano Cabral, Bruno T. Gomes
Abstract:
Choosing good features is an essential part of machine learning. Recent techniques aim to automate this process. For instance, feature learning intends to learn the transformation of raw data into a representation useful for machine learning tasks. In automatic audio classification tasks, this is interesting since the audio, usually complex information, needs to be transformed into a computationally convenient input to process. Another technique tries to generate features by searching a feature space. Genetic algorithms, for instance, have been used to generate audio features by combining or modifying them. We find this approach particularly interesting and, despite the undeniable advances of feature learning approaches, we wanted to take a step forward in the use of genetic algorithms to find audio features, combining them with more conventional methods, like PCA, and inserting search control mechanisms, such as constraints over a confusion matrix. This work presents the results obtained on particular audio classification problems.
Keywords: feature generation, feature learning, genetic algorithm, music information retrieval
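The abstract describes searching a feature space with a genetic algorithm guided by classifier feedback. The sketch below is a hedged, minimal version of that idea: a GA evolves binary masks that select features from a synthetic dataset, with cross-validated accuracy as fitness. The dataset, operators, and hyperparameters are assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5, random_state=0)

def fitness(mask):
    """Cross-validated accuracy of a simple classifier on the selected feature subset."""
    if mask.sum() == 0:
        return 0.0
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()

def evolve(pop_size=20, n_gen=15, mut_rate=0.05):
    pop = rng.integers(0, 2, size=(pop_size, X.shape[1]))
    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]        # keep the fitter half
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, X.shape[1])                      # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(X.shape[1]) < mut_rate               # bit-flip mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[scores.argmax()], scores.max()

best_mask, best_score = evolve()
print("selected features:", best_mask.nonzero()[0], "cv accuracy:", round(best_score, 3))
```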
Procedia PDF Downloads 438
26569 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations have structured and unstructured information in different formats, sources, and systems. Part of this information comes from ERP systems under OLTP processing that support the information system; however, at the OLAP processing level, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of operational capabilities to tackle these kinds of projects. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision making and research. The following paper presents a simple, structured methodology inspired by agile development models such as Scrum, XP, and AUP. It also draws on object-relational models, spatial data models, and the baseline of data modeling under UML and big data, in this way seeking to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization, and data mining, particularly for pattern generation and for models derived from the structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 415
26568 Computed Tomography Brain and Inpatient Falls: An Audit Evaluating the Indications and Outcomes
Authors: Zain Khan, Steve Ahn, Kathy Monypenny, James Fink
Abstract:
In Australian public hospitals, there were approximately 34,000 reported inpatient falls between 2015 and 2016. The gold standard for diagnosing intracranial injury is non-contrast enhanced brain computed tomography (CTB). Over a three-month timeframe, a total of one hundred and eighty (180) falls were documented between the hours of 4 pm and 8 am at a large metropolitan hospital. Only three (3) of the resulting scans demonstrated a positive intracranial finding. The rationale for scanning varied. The common indications included a fall with head strike, the presence of blood-thinning medication, loss of consciousness, reduced Glasgow Coma Scale (GCS), vomiting, and new neurological findings. There are several validated tools to aid in decision-making around ordering CTB scans in the acute setting, but no such accepted tool exists for the inpatient space. With further data collection, spanning a greater length of time and involving multiple centres, work can be done towards generating such a tool that can be utilized for inpatient falls.
Keywords: computed tomography, falls, inpatient, intracranial hemorrhage
Procedia PDF Downloads 178
26567 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in earlier work respond when data are mixed with noise and whether they are applicable in South Korea. Three major types of models are studied; where data are presented in the original papers, we add noise to those data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we can assess the robustness of each model and its applicability in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
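The abstract's core exercise, generating data from a published deterioration model, perturbing it with noise, and observing how the fitted model responds, can be illustrated with a simple Weibull survival model in SciPy. The parameters and noise level below are arbitrary assumptions, not values from the paper or from the models it reviews.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Generate synthetic pipe lifetimes (years) from an assumed Weibull deterioration model.
shape_true, scale_true = 2.5, 60.0
lifetimes = stats.weibull_min.rvs(shape_true, scale=scale_true, size=500, random_state=42)

# Contaminate the data with measurement noise, as in the robustness check.
noisy = np.clip(lifetimes + rng.normal(0, 5.0, size=lifetimes.size), 0.1, None)

for label, data in [("clean", lifetimes), ("noisy", noisy)]:
    shape_hat, _, scale_hat = stats.weibull_min.fit(data, floc=0)
    print(f"{label:5s}  shape={shape_hat:5.2f}  scale={scale_hat:6.1f}")
# Comparing the fitted parameters shows how sensitive the model is to the added noise.
```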
Procedia PDF Downloads 748
26566 Spin-Flip and Magnetoelectric Coupling in Acentric and Non-Polar Pb₂MnO₄
Authors: K. D. Chandrasekhar, H. C. Wu, D. J. Hsieh, B. J. Song, J. -Y. Lin, J. L. Her, L. Z. Deng, M. Gooch, C. W. Chu, H. D. Yang
Abstract:
Stress-mediated coupling of electrical and magnetic dipoles in a single-phase multiferroic is rare. Pb₂MnO₄ belongs to the multi-piezo crystal class with the space group P⁻42₁
Keywords: multiferroic, multipiezo, Pb₂MnO₄, spin-flip
Procedia PDF Downloads 240
26565 Effects of Planned Pre-laboratory Discussion on Physics Students’ Acquisition of Science Process Skills in Kontagora, Niger State
Authors: Akano Benedict Ubawuike
Abstract:
This study investigated the effects of pre-laboratory discussion on physics students’ acquisition of science process skills. The study design was quasi-experimental, and a purposive sampling technique was applied in selecting two schools in Kontagora Town for the research based on the availability of a good physics laboratory. Intact classes already grouped by the schools owing to limited laboratory space and equipment, comprising thirty (30) students, 15 in the experimental group in School A and 15 in the control group in School B, were the subjects of the research. The instruments used for data collection were the lesson prepared for pre-practical discussion and a researcher-made Science Process Skill Test (SPST); two (2) research questions and two (2) research hypotheses were developed to guide the study. The data collected were analyzed using means and t-test statistics at the 0.05 level of significance. The study revealed that pre-laboratory discussion was found to be more efficacious in enhancing students’ acquisition of science process skills. It also revealed that gender had no significant effect on students’ acquisition of science process skills. Based on the findings, it was recommended, among others, that teachers should encourage students to develop interest in practical activities by engaging them in pre-laboratory discussion and by providing instructional materials that will challenge them to be actively involved during practical lessons. It is also recommended that Ministries of Education and professional organizations like the Science Teachers' Association of Nigeria (STAN) should organize workshops, seminars, and conferences for physics teachers, and that physics concepts should be taught with practical activity so that students will do science instead of merely learning about science.
Keywords: physics, laboratory, discussion, students, acquisition, science process skills
Procedia PDF Downloads 140
26564 Charging-Vacuum Helium Mass Spectrometer Leak Detection Technology in the Application of Space Products Leak Testing and Error Control
Authors: Jijun Shi, Lichen Sun, Jianchao Zhao, Lizhi Sun, Enjun Liu, Chongwu Guo
Abstract:
Because of the consistency of pressure direction, the shorter cycle, and the high sensitivity, charging-vacuum helium mass spectrometer leak testing is the most popular leak testing technology for the seal testing of spacecraft parts, especially small and medium-sized ones. Usually, an auxiliary pump is used, and the minimum detectable leak rate can reach 5E-9 Pa•m3/s, or even better on certain occasions. Relative error is more important when evaluating the results. The choice of reference leak, the background level of helium, and the record format all affect the measured leak rate. Within the linearity range of the leak testing system, the relative error is reduced by about 10% if a reference leak with a larger leak rate is used, and the relative error is reduced markedly if the helium background is kept efficiently low, a decimal record format is used, and more stable data are recorded.
Keywords: leak testing, spacecraft parts, relative error, error control
Procedia PDF Downloads 462
26563 Bacteriophage Is a Novel Solution of Therapy Against S. aureus Having Multiple Drug Resistance
Authors: Sanjay Shukla, A. Nayak, R. K. Sharma, A. P. Singh, S. P. Tiwari
Abstract:
Excessive use of antibiotics is a major problem in the treatment of wounds and other chronic infections, and antibiotic treatment is frequently non-curative; thus, alternative treatment is necessary. Phage therapy is considered one of the most promising approaches to treating multi-drug resistant bacterial pathogens. Infections caused by Staphylococcus aureus can be efficiently controlled with phage cocktails containing different individual phage lysates that infect a majority of known pathogenic S. aureus strains. The aim of the present study was to evaluate the efficacy of a purified phage cocktail for prophylactic as well as therapeutic application in a mouse model and in large animals with chronic septic infection of wounds. A total of 150 sewage samples were collected from various livestock farms and subjected to bacteriophage isolation by the double agar layer method. Of the 150 sewage samples, 27 showed plaque formation, producing lytic activity against S. aureus in the double agar overlay method. In TEM, the recovered bacteriophage isolates showed a hexagonal structure with tail fibers. The bacteriophage (ØVS) had icosahedral symmetry with a head 52.20 nm in diameter and a long tail of 109 nm. The head and tail were held together by a connector, and the phage can be classified as a member of the family Myoviridae under the order Caudovirales. The recovered bacteriophage showed antibacterial activity against S. aureus in vitro. A cocktail of phage lysates (ØVS1, ØVS5, ØVS9, and ØVS27) was tested for in vivo antibacterial activity as well as its safety profile. The results of the mouse experiment indicated that the bacteriophage lysates were very safe and did not produce any abscess formation, which indicates their safety in a living system. The mice were also prophylactically protected against S. aureus when administered the cocktail of bacteriophage lysates just before the administration of S. aureus, which indicates that the cocktail is a good prophylactic agent. The S. aureus-inoculated mice recovered completely following bacteriophage administration, a 100% recovery that compares very well with conventional therapy. In the present study, ten chronic wound cases were treated with phage lysate, and follow-up of these cases was done regularly for up to ten days (at 0, 5, and 10 d). The results indicated that six cases out of ten showed complete recovery of wounds within 10 d. The efficacy of bacteriophage therapy was found to be 60%, which is very good compared with conventional antibiotic therapy in chronic septic wound infections. Thus, the application of lytic phage in a single dose proved to be an innovative and effective therapy for the treatment of septic chronic wounds.
Keywords: phage therapy, S. aureus, antimicrobial resistance, lytic phage, bacteriophage
Procedia PDF Downloads 121
26562 Automated Testing to Detect Instance Data Loss in Android Applications
Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai
Abstract:
Mobile applications are increasing significantly in number, each addressing the requirements of many users. However, the rapid developments and enhancements are resulting in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then, in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. However, it is difficult for the programmer to manually test this issue for all the activities. The result is data loss: the data entered by the user are not saved when there is an interruption. This issue can degrade user experience because the user needs to re-enter the information each time there is an interruption. Automated testing to detect such data loss is important to improve the user experience. This research proposes a tool, DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We have tested 395 applications and found 12 applications with the issue of data loss. This approach proved highly accurate and reliable in finding apps with this defect and can be used by Android developers to avoid such errors.
Keywords: Android, automated testing, activity, data loss
Procedia PDF Downloads 238
26561 Substation Automation, Digitization, Cyber Risk and Chain Risk Management Reliability
Authors: Serzhan Ashirov, Dana Nour, Rafat Rob, Khaled Alotaibi
Abstract:
There has been fast growth in the introduction and use of communications, information, monitoring, and sensing technologies. The new technologies are making their way into Industrial Control Systems, embedded in products, software applications, and IT services, or commissioned to enable integration and automation of increasingly global supply chains. As a result, the lines that separated the physical, digital, and cyber worlds have diminished due to the vast implementation of the new, disruptive digital technologies. The variety and increased use of these technologies introduce many cybersecurity risks affecting the cyber-resilience of the supply chain, both in terms of the product or service delivered to a customer and the members of the supply chain operation. The US Department of Energy considers the supply chain in the IR4 space to be the weakest link in cybersecurity. IR4 brought the digitization of field devices, followed by digitalization, and eventually moved through the digital transformation space with little care for the newly introduced cybersecurity risks. This paper examines the best methodologies for securing electrical substations against cybersecurity attacks arising from supply chain risks and from digitization efforts. SCADA systems are the most vulnerable part of the power system infrastructure due to digitization and due to weaknesses and vulnerabilities in supply chain security. The paper discusses in detail how to create a secure supply chain methodology, secure substations, and mitigate the risks due to digitization.
Keywords: cybersecurity, supply chain methodology, secure substation, digitization
Procedia PDF Downloads 67
26560 Big Data: Appearance and Disappearance
Authors: James Moir
Abstract:
The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to show their appearance through causal mechanisms, while 20th-century science attempted to save the appearances and relinquish causal explanations. Now 21st-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed reality model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.
Keywords: big data, appearance, disappearance, surface, epistemology
Procedia PDF Downloads 425
26559 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management
Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang
Abstract:
Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle this data's magnitude and complexity. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.
Keywords: construction supply chain management, BIM, data exchange, artificial intelligence
Procedia PDF Downloads 35
26558 Representation Data without Lost Compression Properties in Time Series: A Review
Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan
Abstract:
Uncertain data are believed to be an important issue in building up a prediction model. The main objective in time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and fit a low-dimensional model prior to a prediction task. This paper discusses the performance of a number of techniques in dealing with uncertain data, specifically those which handle the uncertain data condition by minimizing the loss of compression properties.
Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction
Procedia PDF Downloads 434
26557 Noninvasive Technique for Measurement of Heartbeat in Zebrafish Embryos Exposed to Electromagnetic Fields at 27 GHz
Authors: Sara Ignoto, Elena M. Scalisi, Carmen Sica, Martina Contino, Greta Ferruggia, Antonio Salvaggio, Santi C. Pavone, Gino Sorbello, Loreto Di Donato, Roberta Pecoraro, Maria V. Brundo
Abstract:
The new fifth-generation technology (5G), which should favor high data-rate connections (1 Gbps) and latency times lower than the current ones (<1 ms), has the characteristic of working on different frequency bands of the radio wave spectrum (700 MHz, 3.6-3.8 GHz, and 26.5-27.5 GHz), thus also exploiting higher frequencies than previous mobile radio generations (1G-4G). The higher-frequency waves, however, have a lower capacity to propagate in free space, and therefore, in order to guarantee capillary coverage of the territory for high-reliability applications, it will be necessary to install a large number of repeaters. Following the introduction of this new technology, there has been growing concern in recent years about possible harmful effects on human health, and several studies have been published using different animal models. This study aimed to observe the possible short-term effects induced by 5G millimeter waves on the heartbeat of early life stages of Danio rerio using the DanioScope software (Noldus). DanioScope is a complete toolbox for measurements on zebrafish embryos and larvae. The effect of substances can be measured on the developing zebrafish embryo by a range of parameters: earliest activity of the embryo’s tail, activity of the developing heart, speed of blood flowing through the vein, and length and diameters of body parts. Activity measurements, cardiovascular data, blood flow data, and morphometric parameters can be combined in one single tool. The obtained data are processed and provided by the software in both numerical and graphical form. The experiments were performed at 27 GHz with a non-commercial high-gain pyramidal horn antenna. According to OECD guidelines, exposure to 5G millimeter waves was tested by the fish embryo toxicity test within 96 hours post fertilization. Observations were recorded every 24 h until the end of the short-term test (96 h). The results showed an increase in the heartbeat rate of exposed embryos at 48 hpf compared with the control group, but this increase was not observed at 72-96 hpf. Nowadays, there is a scarcity of literature data on this topic, so these results could be useful for approaching new studies and also for evaluating the potential cardiotoxic effects of mobile radiofrequency.
Keywords: Danio rerio, DanioScope, cardiotoxicity, millimeter waves
Procedia PDF Downloads 198
26556 Investigating the Aerosol Load of Eastern Mediterranean Basin with Sentinel-5p Satellite
Authors: Deniz Yurtoğlu
Abstract:
Aerosols directly affect the radiative balance of the earth by absorbing and/or scattering the sun's rays reaching the atmosphere and indirectly affect the balance by acting as nuclei in cloud formation. The composition and the physical and chemical properties of aerosols vary depending on their sources and the time spent in the atmosphere. The Eastern Mediterranean Basin has a high aerosol load formed from different sources, such as anthropogenic activities, desert dust outbreaks, and the spray of sea salt, and the area is subject to atmospheric transport from other locations on the earth. This region, which includes the deserts of Africa, the Middle East, and the Mediterranean Sea, is one of the areas most affected by climate change due to its location and the chemistry of its atmosphere. This study aims to investigate the spatiotemporal variation of the aerosol load in the Eastern Mediterranean Basin between the years 2018-2022 with the help of a new pioneering satellite of ESA (European Space Agency), Sentinel-5P. The TROPOMI (TROPOspheric Monitoring Instrument) traveling on this low-Earth-orbiting satellite is a UV (ultraviolet)-sensing spectrometer with a resolution of 5.5 km x 3.5 km, which can make measurements even in a cloud-covered atmosphere. By using Absorbing Aerosol Index data produced by this spectrometer and special scripts written in the Python language that transform these data into images, it was seen that the majority of the aerosol load in the Eastern Mediterranean Basin is sourced from desert dust and anthropogenic activities. After retrieving the daily data and removing NaN values, the seasonal analyses match the normal aerosol variations expected, which are high in warm seasons and lower in cold seasons. Monthly analyses showed that over four years there was an increase in the Absorbing Aerosol Index in spring and winter of 92.27% (2019-2021) and 39.81% (2019-2022), respectively. On the other hand, in the summer and autumn seasons, decreases of 20.99% (2018-2021) and 0.94% (2018-2021), respectively, were observed. The overall variation of the mean absorbing aerosol index from TROPOMI between April 2018 and April 2022 reflects a decrease of 115.87% in the annual mean, from 0.228 to -0.036. However, when the data are analyzed by the annual mean values of the years that have data from January to December, meaning from 2019 to 2021, there was an increase of 57.82% (from 0.108 to 0.171). This result can be interpreted as the effect of climate change on the aerosol load and also, more specifically, the effect of the forest fires that occurred in the summer months of 2021.
Keywords: aerosols, eastern mediterranean basin, sentinel-5p, tropomi, aerosol index, remote sensing
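The percentage figures quoted above follow directly from the annual-mean index values given in the abstract; the two-line check below reproduces them (the larger-than-100% "decrease" reflects the mean crossing zero between 2018 and 2022, and the small differences from the quoted 115.87% and 57.82% presumably come from rounding of the reported means).

```python
# Reproducing the abstract's percentage changes from its reported annual means.
print((0.228 - (-0.036)) / 0.228 * 100)   # ~115.8 % decrease, April 2018 -> April 2022
print((0.171 - 0.108) / 0.108 * 100)      # ~58.3 % increase, 2019 -> 2021 annual means
```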
Procedia PDF Downloads 73
26555 Data Mining As A Tool For Knowledge Management: A Review
Authors: Maram Saleh
Abstract:
Knowledge has become an essential resource in today’s economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has led organizations to manage their knowledge assets and resources through all the knowledge management stages: knowledge creation, knowledge storage, knowledge sharing, and knowledge use. Research on data mining has continued to grow in recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information from their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery technique. In this paper, we identify the relationship between data mining and knowledge management and then focus on introducing some applications of data mining techniques in knowledge management for some real-life domains.
Keywords: data mining, knowledge management, knowledge discovery, knowledge creation
Procedia PDF Downloads 213
26554 Assessing the Effectiveness of Warehousing Facility Management: The Case of Mantrac Ghana Limited
Authors: Kuhorfah Emmanuel Mawuli
Abstract:
Generally, for firms to enhance the operational efficiency of their logistics, it is imperative to assess the logistics function. The cost of logistics conventionally represents a key consideration in the pricing decisions of firms, which suggests that cost efficiency in logistics can go a long way toward improving margins. Warehousing, which is a key part of logistics operations, has the prospect of influencing operational efficiency in logistics management as well as customer value, but this potential has often not been recognized. There is a paucity of research that evaluates the efficiency of warehouses; indeed, limited research has been conducted to examine potential barriers to effective warehousing management. Due to this paucity of research, there is limited knowledge of how to address the obstacles associated with warehousing management. In order for warehousing management to become profitable, there is a need to integrate, balance, and manage the economic inputs and outputs of the entire warehouse operations, something that many firms tend to ignore. Management of warehousing is not solely related to storage functions. Instead, effective warehousing management requires such practices as the maximum possible mechanization and automation of operations, optimal use of the space and capacity of storage facilities, organization through a 'continuous flow' of goods, a planned system of storage operations, and safety of goods. For example, utilization of the warehouse floor space is important, as it is a good way to evaluate the storage operation and the items picked per hour. In the setting of Mantrac Ghana, not much knowledge exists regarding the management of the warehouses, and the researcher has personally observed many gaps in the management of the warehouse facilities in the case organization. It is important, therefore, to assess the warehouse facility management of the case company with the objective of identifying weaknesses for improvement. The study employs an in-depth qualitative research approach using interviews as the mode of data collection. Respondents in the study mainly comprised warehouse facility managers in the studied company. A total of 10 participants were selected for the study using a purposive sampling strategy. Results emanating from the study demonstrate limited warehousing effectiveness in the case company. Findings further reveal that the major barriers to effective warehousing facility management comprise poor layout, poor picking optimization, labour costs, and inaccurate orders; policy implications of the study findings are finally outlined.
Keywords: assessing, warehousing, facility, management
Procedia PDF Downloads 74
26553 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data
Authors: Murat Yazici
Abstract:
Anomalies are irregularities found in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within the statistics field since the 1800s. Over time, a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies. It is the process of grouping data into clusters whose members are as similar as possible while the clusters themselves are as dissimilar from each other as possible. Many of the traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, is reported for clustering datasets with categorical values. It is a form of soft clustering: each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated data sets using the FKM clustering algorithm. A significant aspect of the study is that the FKM clustering algorithm allows anomalies to be determined together with their degree of abnormality, in contrast to numerous anomaly detection algorithms. According to the results, the FKM clustering algorithm showed good performance in the anomaly detection of data containing both a single anomaly and more than one anomaly.
Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data
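As a hedged sketch of how fuzzy k-modes can yield a degree of abnormality, the code below runs a minimal FKM loop on toy categorical records and scores each record by one minus its largest cluster membership. The toy data, the fuzzifier value, and that particular anomaly score are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def matching_distance(record, mode):
    """Simple matching dissimilarity: number of attributes that differ."""
    return sum(r != m for r, m in zip(record, mode))

def fuzzy_k_modes(data, k=2, m=1.5, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    modes = [data[i] for i in rng.choice(len(data), size=k, replace=False)]
    for _ in range(n_iter):
        # Membership update: w_il = 1 / sum_h (d_l / d_h)^(1/(m-1))
        W = np.zeros((len(data), k))
        for i, x in enumerate(data):
            d = np.array([matching_distance(x, z) for z in modes], dtype=float)
            if np.any(d == 0):
                W[i] = (d == 0) / (d == 0).sum()
            else:
                ratios = (d[:, None] / d[None, :]) ** (1.0 / (m - 1))
                W[i] = 1.0 / ratios.sum(axis=1)
        # Mode update: per attribute, keep the category with the largest weighted support
        for j in range(k):
            new_mode = []
            for a in range(len(data[0])):
                support = {}
                for i, x in enumerate(data):
                    support[x[a]] = support.get(x[a], 0.0) + W[i, j] ** m
                new_mode.append(max(support, key=support.get))
            modes[j] = tuple(new_mode)
    return W, modes

data = [("red", "small", "round")] * 4 + [("blue", "small", "round")] * 4 + [("green", "huge", "flat")]
W, modes = fuzzy_k_modes(data)
abnormality = 1 - W.max(axis=1)   # illustrative anomaly score: weak attachment to every cluster
for rec, score in zip(data, abnormality):
    print(rec, round(score, 3))
```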
Procedia PDF Downloads 62
26552 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encyption Scheme
Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara
Abstract:
This paper describes the problem of building secure computational services for encrypted information in the Cloud, computing without decrypting the encrypted data. It therefore addresses the aspiration for a computational encryption model that could enhance the security of big data in terms of privacy or confidentiality, availability, and integrity of the data and the user's security. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, with detailed theoretical mathematical concepts for the fully homomorphic encryption models. This contribution enhances the full implementation of big data analytics based on a cryptographic security algorithm.
Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme
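The core property the paper relies on, computing on ciphertexts without decryption, can be illustrated with a textbook Paillier scheme. Note the hedges: Paillier is only additively (partially) homomorphic, the key sizes below are insecurely small, and this is not the fully homomorphic scheme the paper builds on; it is just a runnable illustration of "compute without decrypting".

```python
from math import gcd
from random import randrange

# Textbook Paillier with toy-sized primes (insecure; for illustration only).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)      # lcm(p-1, q-1)
mu = pow(lam, -1, n)                              # valid because g = n + 1

def encrypt(m):
    r = randrange(2, n)
    while gcd(r, n) != 1:
        r = randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

c1, c2 = encrypt(1234), encrypt(4321)
c_sum = (c1 * c2) % n2            # homomorphic addition: multiply the ciphertexts
print(decrypt(c_sum))             # 5555, obtained without decrypting c1 or c2 individually
```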
Procedia PDF Downloads 488
26551 Challenges of Strategies for Improving Sustainability in Urban Historical Context in Developing Countries: The Case of Shiraz Bein Al-Haramein
Authors: Amir Hossein Ashari, Sedighe Erfan Manesh
Abstract:
One of the problems in developing countries is renovating the historical context and introducing behaviors appropriate to modern life into such a context. This study was conducted using field and library methods in 2012. Similar cases carried out in Iran and other developing countries were compared to unveil the strengths and weaknesses of these projects. At present, in the historical context of Shiraz, the area between the two religious shrines of Shahcheragh (Ahmad ibn Musa) and Astaneh (Sayed Alaa al-Din Hossein), which are significant places in religious, cultural, social, and economic terms, is full of historic places and is called Bein Al-Haramein. Unfortunately, some of these places have worn out and are not appropriate for common uses. The basic strategy for Bein Al-Haramein was to improve the social development of Shiraz, to enhance the vitality and dynamism of the historical context of Bein Al-Haramein, and to create tourist attractions in order to boost the city's economic and social stability. To this end, the project includes the huge Bein Al-Haramein Commercial Complex, which is now under construction. To construct the complex, officials have decided to demolish places of historical value, which can lead to irreparable consequences. Iranian urban design has always been based on the three elements of bazaars, mosques, and government facilities, with bazaars being the organic connector of the other elements. Therefore, the best strategy in the above case is to provide a commercial connection between the two poles. Although this strategy is included in the project, lack of attention to renovation principles in this area and complete destruction of the context will lead to irreversible damage and will destroy its cultural and historical identity. In the urban planning of this project, some important issues have been neglected, including preserving valuable buildings and special old features of the city, rebuilding worn buildings and context to attract the trust and confidence of the people, developing new models according to changes, improving the structural position of the old context with minimal degradation, attracting partnerships of residents and protecting their rights, and finally using the potential facilities of the old context. The best strategy for achieving sustainability in Bein Al-Haramein can be the one used in the area between the Santa Maria Novella and Santa Maria del Fiore churches, where, while protecting the historic context and constructions, old buildings were renovated and given different commercial and service uses, making them sustainable and dynamic places. Similarly, in Bein Al-Haramein, renovating old constructions and monuments and giving them different commercial and other uses can help improve the economic and social sustainability of the area.
Keywords: Bein Al-Haramein, sustainability, historical context
Procedia PDF Downloads 446
26550 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach
Authors: Sarisa Pinkham, Kanyarat Bussaban
Abstract:
The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
Keywords: daily rainfall, image processing, approximation, pixel value data
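The reported RMSE of 3.343 is the usual root-mean-square error between approximated and observed daily rainfall; a minimal computation with assumed toy values looks like this.

```python
import numpy as np

observed = np.array([12.0, 0.0, 5.5, 30.2, 8.1])       # hypothetical station rainfall (mm)
approximated = np.array([10.5, 0.0, 7.0, 27.9, 9.4])   # hypothetical pixel-value estimates (mm)

rmse = np.sqrt(np.mean((approximated - observed) ** 2))
print(round(rmse, 3))
```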
Procedia PDF Downloads 391