Search results for: well data integration
24673 The Development and Validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers
Authors: Ian Phil Canlas, Mageswary Karpudewan, Joyce Magtolis, Rosario Canlas
Abstract:
This study reported the development and validation of the Awareness to Disaster Risk Reduction Questionnaire for Teachers (ADRRQT). The questionnaire is a combination of Likert-scale and open-ended questions grouped into two parts: the first part included questions on general awareness of disaster risk reduction, whereas the second part comprised questions on the integration of disaster risk reduction into the teaching process. The entire process of developing and validating the ADRRQT is described in this study. Statistical and qualitative findings revealed that the ADRRQT is valid and reliable and has the potential to measure awareness of disaster risk reduction among stakeholders in the field of teaching. It also shows the potential to be adopted in other fields.
Keywords: awareness, development, disaster risk reduction, questionnaire, validation
Procedia PDF Downloads 228

24672 Optimizing Data Transfer and Processing in Multi-Cloud Environments for Big Data Workloads
Authors: Gaurav Kumar Sinha
Abstract:
In an era defined by the proliferation of data and the utilization of cloud computing environments, the efficient transfer and processing of big data workloads across multi-cloud platforms have emerged as critical challenges. This research paper embarks on a comprehensive exploration of the complexities associated with managing and optimizing big data in a multi-cloud ecosystem. The foundation of this study is rooted in the recognition that modern enterprises increasingly rely on multiple cloud providers to meet diverse business needs, enhance redundancy, and reduce vendor lock-in. As a consequence, managing data across these heterogeneous cloud environments has become intricate, necessitating innovative approaches to ensure data integrity, security, and performance. The primary objective of this research is to investigate strategies and techniques for enhancing the efficiency of data transfer and processing in multi-cloud scenarios. It recognizes that big data workloads are characterized by their sheer volume, variety, velocity, and complexity, making traditional data management solutions insufficient for harnessing the full potential of multi-cloud architectures. The study commences by elucidating the challenges posed by multi-cloud environments in the context of big data. These challenges encompass data fragmentation, latency, security concerns, and cost optimization. To address these challenges, the research explores a range of methodologies and solutions. One of the key areas of focus is data transfer optimization. The paper delves into techniques for minimizing data movement latency, optimizing bandwidth utilization, and ensuring secure data transmission between different cloud providers. It evaluates the applicability of dedicated data transfer protocols, intelligent data routing algorithms, and edge computing approaches in reducing transfer times. Furthermore, the study examines strategies for efficient data processing across multi-cloud environments.
It acknowledges that big data processing requires distributed and parallel computing capabilities that span across cloud boundaries. The research investigates containerization and orchestration technologies, serverless computing models, and interoperability standards that facilitate seamless data processing workflows. Security and data governance are paramount concerns in multi-cloud environments. The paper explores methods for ensuring data security, access control, and compliance with regulatory frameworks. It considers encryption techniques, identity and access management, and auditing mechanisms as essential components of a robust multi-cloud data security strategy. The research also evaluates cost optimization strategies, recognizing that the dynamic nature of multi-cloud pricing models can impact the overall cost of data transfer and processing. It examines approaches for workload placement, resource allocation, and predictive cost modeling to minimize operational expenses while maximizing performance. Moreover, this study provides insights into real-world case studies and best practices adopted by organizations that have successfully navigated the challenges of multi-cloud big data management. It presents a comparative analysis of various multi-cloud management platforms and tools available in the market.
Keywords: multi-cloud environments, big data workloads, data transfer optimization, data processing strategies
Procedia PDF Downloads 67

24671 Documents Emotions Classification Model Based on TF-IDF Weighting Measure
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Emotion classification of text documents is applied to reveal whether a document expresses a particular emotion of its writer. As different supervised methods have previously been used for emotion classification of documents, in this research we present a novel model that supports the classification algorithms, producing more accurate results with the support of the TF-IDF measure. Different experiments have been applied to reveal the applicability of the proposed model; the model succeeds in raising the accuracy percentage according to the determined metrics (precision, recall, and F-measure) by refining the lexicon, integrating lexicons from different perspectives, and applying the TF-IDF weighting measure over the classifying features. The proposed model has also been compared with other research to prove its competence in raising the results' accuracy.
Keywords: emotion detection, TF-IDF, WEKA tool, classification algorithms
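As an illustration of the TF-IDF weighting applied over the classifying features, a minimal sketch is given below. It is the generic term-frequency times inverse-document-frequency computation, not the authors' WEKA pipeline, and the sample documents are invented:

```python
import math
from collections import Counter

def tfidf(docs):
    """TF-IDF weights for a list of tokenized documents:
    tf(t, d) * log(N / df(t))."""
    n = len(docs)
    df = Counter()                      # document frequency per term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return weights

# Invented toy corpus: a term that is frequent in one document but rare
# across the corpus gets a high weight, which sharpens emotion features.
docs = [["happy", "joy", "happy"], ["sad", "tears"], ["happy", "sad"]]
weights = tfidf(docs)
```

A repeated, corpus-rare term ("happy" in the first document) thus outweighs the same term in a document where it appears only once, which is the effect that lifts the classifier's accuracy.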
Procedia PDF Downloads 484

24670 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study
Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos
Abstract:
This article presents a method to analyze the use of indoor spaces based on data analytics obtained from in-built digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights on space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet that the research uses to analyze information on real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable solution to analyze building interior spaces without incorporating external data collection systems such as sensors. The methodology is applied to a real case study of coliving: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify IoT devices and to assess, clean, and analyze collected data based on the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled up for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in the IoT of each building. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
Keywords: in-place devices, IoT, human-centred data-analytics, spatial design
Procedia PDF Downloads 197

24669 A Unique Multi-Class Support Vector Machine Algorithm Using MapReduce
Authors: Aditi Viswanathan, Shree Ranjani, Aruna Govada
Abstract:
With data sizes constantly expanding, and with classical machine learning algorithms that analyze such data requiring larger and larger amounts of computation time and storage space, the need to distribute computation and memory requirements among several computers has become apparent. Although substantial work has been done in developing distributed binary SVM algorithms and multi-class SVM algorithms individually, the field of multi-class distributed SVMs remains largely unexplored. This research seeks to develop an algorithm that implements the Support Vector Machine over a multi-class data set and is efficient in a distributed environment. For this, we recursively choose the best binary split of a set of classes using a greedy technique, much like the divide-and-conquer approach. Our algorithm has shown better computation time during the testing phase than the traditional sequential SVM methods (One vs. One, One vs. Rest) and outperforms them as the size of the data set grows. This approach also classifies the data with higher accuracy than the traditional multi-class algorithms.
Keywords: distributed algorithm, MapReduce, multi-class, support vector machine
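The recursive best-binary-split construction can be sketched as follows. This is an illustrative skeleton under stated assumptions: `score` is a hypothetical function rating a candidate partition (the paper's greedy criterion is not given here), and in the full algorithm a distributed binary SVM would be trained at each internal node of the resulting tree:

```python
def build_split_tree(classes, score):
    """Recursively pick the best binary split of a set of class labels.

    At each node every proper bipartition of `classes` is scored and the
    best one is kept (divide-and-conquer). Exhaustive enumeration is fine
    for a sketch; a greedy heuristic would replace it at scale.
    """
    if len(classes) == 1:
        return {"leaf": classes[0]}
    best = None
    # Enumerate each bipartition once: the first class always stays on
    # the right, so complements are not revisited.
    for i in range(1, 2 ** (len(classes) - 1)):
        bits = bin(i)[2:].zfill(len(classes))
        left = [c for b, c in zip(bits, classes) if b == "1"]
        right = [c for c in classes if c not in left]
        s = score(left, right)
        if best is None or s > best[0]:
            best = (s, left, right)
    _, left, right = best
    return {
        "groups": (left, right),
        "left": build_split_tree(left, score),
        "right": build_split_tree(right, score),
    }

# Hypothetical score preferring balanced splits (a stand-in criterion).
balance = lambda l, r: -abs(len(l) - len(r))
tree = build_split_tree(["a", "b", "c", "d"], balance)
```

At test time a sample descends one root-to-leaf path, so only a logarithmic number of binary classifiers is evaluated, which is consistent with the better testing-phase time the abstract reports.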
Procedia PDF Downloads 401

24668 Information Management Approach in the Prediction of Acute Appendicitis
Authors: Ahmad Shahin, Walid Moudani, Ali Bekraki
Abstract:
This research aims at presenting a predictive data mining model for the accurate diagnosis of acute appendicitis in patients, for the purpose of maximizing health service quality, minimizing morbidity/mortality, and reducing cost. Acute appendicitis is the most common disease requiring timely, accurate diagnosis and surgical intervention. Although the treatment of acute appendicitis is simple and straightforward, its diagnosis is still difficult because no single sign, symptom, laboratory, or image examination accurately confirms the diagnosis of acute appendicitis in all cases. This contributes to increasing morbidity and negative appendectomy. In this study, the authors propose to generate an accurate model for predicting patients with acute appendicitis based, firstly, on a segmentation technique associated with the ABC algorithm to segment the patients; secondly, on applying fuzzy logic to process the massive volume of heterogeneous and noisy data (age, sex, fever, white blood cell, neutrophilia, CRP, urine, ultrasound, CT, appendectomy, etc.) in order to express knowledge and analyze the relationships among data in a comprehensive manner; and thirdly, on applying a dynamic programming technique to reduce the number of data attributes. The proposed model is evaluated against a set of benchmark techniques and on a set of benchmark classification problems of osteoporosis, diabetes, and heart disease obtained from the UCI data and other data sources.
Keywords: healthcare management, acute appendicitis, data mining, classification, decision tree
Procedia PDF Downloads 350

24667 Methodology for the Multi-Objective Analysis of Data Sets in Freight Delivery
Authors: Dale Dzemydiene, Aurelija Burinskiene, Arunas Miliauskas, Kristina Ciziuniene
Abstract:
Data flows and the purpose of reporting the data differ and depend on business needs. Different parameters are reported and transferred regularly during freight delivery. These business practices form a dataset constructed for each time point that contains all the information required for freight moving decisions. As a significant amount of these data is used for various purposes, an integrating methodological approach must be developed to respond to the indicated problem. The proposed methodology contains several steps: (1) collecting context data sets and data validation; (2) multi-objective analysis for optimizing freight transfer services. For data validation, the study involves Grubbs outlier analysis, particularly for data cleaning and for identifying the statistical significance of data reporting event cases. The Grubbs test is often used as it measures one extreme value at a time against the boundaries of the standard normal distribution. In the study area, the test has not been widely applied by other authors, except where the Grubbs test for outlier detection was used to identify outliers in fuel consumption data. In this study, the authors applied the method with a confidence level of 99%. For the multi-objective analysis, the authors select forms of genetic algorithm construction that offer better possibilities of extracting the best solution. For freight delivery management, genetic algorithm structures are a more effective technique; accordingly, an adaptable genetic algorithm is applied to describe the process of choosing an effective transportation corridor. In this study, multi-objective genetic algorithm methods are used to optimize the data evaluation and select the appropriate transport corridor.
The authors suggest a methodology for the multi-objective analysis, which evaluates collected context data sets and uses this evaluation to determine a delivery corridor for freight transfer service in the multi-modal transportation network. In the multi-objective analysis, the authors include safety components, the number of accidents per year, and freight delivery time in the multi-modal transportation network. The proposed methodology has practical value in the management of multi-modal transportation processes.
Keywords: multi-objective, analysis, data flow, freight delivery, methodology
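The Grubbs statistic used in the validation step can be sketched as follows. The readings below are invented fuel-consumption-style values, not the study's data, and the critical value from the t-distribution (needed for the 99% decision) is left out for brevity:

```python
import statistics

def grubbs_statistic(data):
    """Grubbs test statistic G = max |x_i - mean| / s, the largest
    absolute deviation from the sample mean in standard deviations.
    Returns the statistic and the index of the most extreme value."""
    mean = statistics.mean(data)
    sd = statistics.stdev(data)
    g, idx = max((abs(x - mean) / sd, i) for i, x in enumerate(data))
    return g, idx

# Invented readings with one suspicious value; in practice G is then
# compared against a tabulated critical value for this sample size at
# the chosen confidence level (99% in the study).
readings = [30.1, 29.8, 30.4, 30.0, 29.9, 45.0, 30.2]
g, idx = grubbs_statistic(readings)
```

Because the test flags one extreme value at a time, cleaning is iterative: remove the flagged reading, recompute, and repeat until G falls below the critical value.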
Procedia PDF Downloads 180

24666 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis
Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith
Abstract:
Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on the utilisation of better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes in healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of hardware and software utilised by frontline healthcare workers. Methods and subjects: Based on standard Patient Related Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians’ perspectives of their hospital’s Information Technology (IT). The survey was distributed via the institution’s intranet to all contracted doctors, and the survey’s qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, with 51% from senior doctors (consultant grades) and the rest from junior grades; the largest group of respondents (52%) came from medicine specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the commonest opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians after two decades of so-called progress towards a paperless healthcare system.
Wider use of the survey would provide further insights and potentially optimise the focus of development and delivery, improving the quality and effectiveness of IT for clinicians and their patients.
Keywords: information technology, electronic patient records, digitisation, paperless healthcare
Procedia PDF Downloads 92

24665 Minimization of Denial of Services Attacks in Vehicular Adhoc Networking by Applying Different Constraints
Authors: Amjad Khan
Abstract:
The security of vehicular ad hoc networking is of great importance as it involves serious life threats. Thus, to provide secure communication amongst vehicles on the road, conventional security systems are not enough. It is necessary to prevent network resources from wastage and protect them against malicious nodes, so as to ensure data bandwidth availability to the legitimate nodes of the network. This work provides a non-conventional security system by introducing constraints to minimize denial of services (DoS), especially of data and bandwidth. The data packets received by a node in the network pass through a number of tests, and if any test fails, the node drops those data packets and does not forward them any further. Also, if a node claims to be the nearest node for forwarding emergency messages, the sender can effectively identify whether the claim is true or false by using these constraints. Consequently, the DoS attack is minimized by the instant availability of data without wasting network resources.
Keywords: black hole attack, grey hole attack, intransient traffic tempering, networking
Procedia PDF Downloads 284

24664 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain carries out the investigation with two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses the issues in a two-phase framework. First, we utilize the raw trajectories to a greater extent by building a VLA table and applying data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information that surrounds it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.
Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 127

24663 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya
Authors: Abdalla Abdelnabi, Yousf Abushalah
Abstract:
The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic data, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation of the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. Seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon traps. The well data are used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie's formula; the overall average water saturation is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwest and south parts of the field. The gas-water contact is found at 4860 ft using the resistivity log. The net isopach map, with both the trapezoidal and pyramidal rules, is used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 billion standard cubic feet (BSCF) and 630 BSCF, respectively.
Keywords: 3D seismic data, well logging, petrel, kingdom suite
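The two calculations the abstract names, Archie water saturation and volumetric gas in place, can be sketched as below. The Archie exponents are generic textbook defaults and the resistivities and gas formation volume factor are invented for illustration, not SM Field values:

```python
def archie_sw(rw, rt, phi, a=1.0, m=2.0, n=2.0):
    """Archie water saturation Sw = ((a * Rw) / (phi^m * Rt))^(1/n).
    a, m, n are the tortuosity, cementation, and saturation exponents
    (generic defaults here, not calibrated field values)."""
    return ((a * rw) / (phi ** m * rt)) ** (1.0 / n)

def ogip_scf(area_acres, h_ft, phi, sw, bg):
    """Volumetric original gas in place in standard cubic feet:
    G = 43560 * A[acres] * h[ft] * phi * (1 - Sw) / Bg."""
    return 43560.0 * area_acres * h_ft * phi * (1.0 - sw) / bg

# Illustrative inputs: the abstract's 24% porosity and 25% water
# saturation; Rw, Rt, and Bg are assumed values for the example.
sw = archie_sw(rw=0.05, rt=20.0, phi=0.24)
gas = ogip_scf(area_acres=1.0, h_ft=1.0, phi=0.24, sw=0.25, bg=0.004)
```

Scaling the unit-acre, unit-thickness figure by the bulk volume from the net isopach map is the volumetric workflow behind the reported 890 BSCF.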
Procedia PDF Downloads 150

24662 Analysis of Spatial and Temporal Data Using Remote Sensing Technology
Authors: Kapil Pandey, Vishnu Goyal
Abstract:
Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are combined with time-series analysis, significant results are obtained in change detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time-series satellite imagery of Uttarakhand state during 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main land use classes to study. Land use/land cover classes for several years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land use change index was generated, and graphical models were used to present the changes.
Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing
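The maximum likelihood supervised classification step can be sketched for a single band as follows. Real imagery is multispectral, with a mean vector and covariance matrix per class; the one-band Gaussian version below and its training statistics are invented for illustration:

```python
import math

def ml_classify(pixel, class_stats):
    """Single-band maximum likelihood classification: assign the pixel
    value to the class whose Gaussian log-likelihood is highest."""
    best, best_ll = None, -math.inf
    for name, (mu, sigma) in class_stats.items():
        # log of the normal density, dropping the constant term
        ll = -math.log(sigma) - (pixel - mu) ** 2 / (2 * sigma ** 2)
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical per-class training statistics (mean, std dev).
stats = {"forest": (40.0, 5.0), "urban": (120.0, 15.0), "water": (10.0, 3.0)}
labels = [ml_classify(p, stats) for p in (38.0, 110.0, 9.0)]
```

Change detection then compares the per-pixel labels of the 1990 and 2010 classified maps, and the land use change index summarizes how much area moved between classes.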
Procedia PDF Downloads 433

24661 An Empirical Investigation of the Challenges of Secure Edge Computing Adoption in Organizations
Authors: Hailye Tekleselassie
Abstract:
Edge computing is a distributed computing framework that brings enterprise applications closer to data sources such as IoT devices or local edge servers, and security incidents could hinder the adoption of such new technologies. This investigation examined the awareness of workers in technology and communications organizations and of computer users who rely on cloud services. Surveys were used to achieve these objectives, and questions about trust were a key part of them. Problems like data privacy, integrity, and availability are the factors affecting an organization's acceptance of cloud services.
Keywords: IoT, data, security, edge computing
Procedia PDF Downloads 83

24660 The Youth Employment Peculiarities in Post-Soviet Georgia
Authors: M. Lobzhanidze, N. Damenia
Abstract:
The article analyzes the current structural changes in the economy of Georgia and the economy's liberalization and integration processes. In accordance with this analysis, the peculiarities and problems of youth employment are revealed. The paper studies the Georgian labor market and its contradictions. Based on the analysis of materials, the socio-economic losses caused by long-term and mass unemployment of young people are revealed, and the objective and subjective circumstances of obtaining higher education are studied. Youth employment and unemployment rates are analyzed, and, based on the research, the factors that increase unemployment are identified. The analysis of youth employment shows that the share of unemployment in the economically active population has increased in the younger age group, which demonstrates the labour market's high requirements for workforce quality. It is also highlighted that young people gravitate towards highly paid jobs. The following research methods are applied in this paper: statistical methods (selection, grouping, observation, trend, etc.) and qualitative research (in-depth interviews), as well as analysis, induction, and comparison. The article draws on data from the National Statistics Office of Georgia and the Ministry of Agriculture of Georgia, policy documents of the Parliament of Georgia, scientific papers by Georgian and foreign scientists, analytical reports, publications, and EU research materials on similar issues. The work assesses the employment problems of students and graduates in light of the state development strategy and its priorities, and defines measures to overcome the challenges. The article describes the mechanisms of state regulation of youth employment and ways of improving this regulatory base.
As for major findings, it should be highlighted that the main problems are a lack of experience and the incompatibility of youth qualifications with the requirements of the labor market. Accordingly, it is concluded that the unemployment rate of young people in Georgia is increasing.
Keywords: migration of youth, youth employment, migration management, youth employment and unemployment
Procedia PDF Downloads 148

24659 Visualization as a Psychotherapeutic Mind-Body Intervention through Reducing Stress and Depression among Breast Cancer Patients in Kolkata
Authors: Prathama Guha Chaudhuri, Arunima Datta, Ashis Mukhopadhyay
Abstract:
Background: Visualization (guided imagery) is a set of techniques that induce relaxation and help people create positive mental images in order to reduce stress. It is relatively inexpensive and can even be practised by bed-bound people. Studies have shown visualization to be an effective tool to improve cancer patients’ anxiety, depression, and quality of life. The images commonly used with cancer patients in the developed world are those involving the individual’s body and its strengths. Since breast cancer patients in India are more family-oriented, and their main concerns are often the stigma of having cancer and the subsequent isolation of their families, including their children, we reasoned that positive images involving acceptance and integration within family and society would be more effective for them. Method: Data were collected from 119 breast cancer patients on chemotherapy who were willing to undergo psychotherapy and had no history of psychiatric illness. Their baseline stress, anxiety, depression, and quality of life were assessed using validated tools. The participants were then randomly divided into three groups: a) those who received visualization therapy with standard imagery involving the body and its strengths (sVT), b) those who received visualization therapy using indigenous family-oriented imagery (mVT), and c) a control group who received supportive therapy. There were six sessions spread over two months for each group. The psychological outcome variables were measured post-intervention, and appropriate statistical analyses were done. Results: Both forms of visualization therapy were more effective than supportive therapy alone in reducing patients’ depression and anxiety and improving their quality of life. Modified VT proved significantly more effective in improving patients’ anxiety and quality of life.
Conclusion: Visualization is a valuable therapeutic option for reducing psychological distress and improving the quality of life of breast cancer patients. To be more effective, the images used need to be modified according to the sociocultural background and individual needs of the patients.
Keywords: breast cancer, visualization therapy, quality of life, anxiety, depression
Procedia PDF Downloads 264

24658 Multi Tier Data Collection and Estimation, Utilizing Queue Model in Wireless Sensor Networks
Authors: Amirhossein Mohajerzadeh, Abolghasem Mohajerzadeh
Abstract:
In this paper, a target parameter is estimated with desirable precision in hierarchical wireless sensor networks (WSNs), while the proposed algorithm also tries to prolong network lifetime as much as possible using an efficient data collecting algorithm. The target parameter's distribution function is considered unknown. Sensor nodes sense the environment and send the data to the base station, called the fusion center (FC), using a hierarchical data collecting algorithm; the FC reconstructs the underlying phenomenon based on the collected data. Considering the aggregation level x, the goal is to provide the essential infrastructure for finding the best value of the aggregation level in order to prolong network lifetime as much as possible while desirable accuracy is guaranteed (the required sample size fully depends on the desired precision). First, the sample size calculation algorithm is discussed; second, the average queue length based on the M/M[x]/1/K queue model is determined and used for the energy consumption calculation. Nodes can decrease transmission cost by aggregating incoming data. Furthermore, the performance of the new algorithm is evaluated in terms of lifetime and estimation accuracy.
Keywords: aggregation, estimation, queuing, wireless sensor network
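The two quantities the abstract names can be sketched as below. The batch-service M/M[x]/1/K model is simplified here to the plain M/M/1/K mean number in system, so this is an illustrative approximation rather than the paper's derivation, and the z, sigma, and error values are invented:

```python
import math

def required_sample_size(z, sigma, error):
    """Sample size for estimating a mean to within +/- error at the
    confidence level implied by z (e.g. z = 2.576 for 99%):
    n = ceil((z * sigma / error)^2)."""
    return math.ceil((z * sigma / error) ** 2)

def mm1k_mean_length(rho, k):
    """Mean number in system for an M/M/1/K queue (rho != 1), used
    here as a simplification of the batch-service M/M[x]/1/K model."""
    num = rho * (1 - (k + 1) * rho ** k + k * rho ** (k + 1))
    den = (1 - rho) * (1 - rho ** (k + 1))
    return num / den

n = required_sample_size(z=2.576, sigma=10.0, error=2.0)   # 99% level
avg_len = mm1k_mean_length(rho=0.5, k=3)
```

The mean queue length feeds the per-node energy estimate, and the required sample size fixes how much data the FC must collect, which together determine the best aggregation level.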
Procedia PDF Downloads 186

24657 Inversion of Electrical Resistivity Data: A Review
Authors: Shrey Sharma, Gunjan Kumar Verma
Abstract:
High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of past and ongoing developments of the method. It includes reviews of the procedures used for the acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas, where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
Keywords: inversion, limitations, optimization, resistivity
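The linearized least-squares idea behind 2-D resistivity inversion can be illustrated on a one-parameter toy problem. The exponential forward model below merely stands in for a real resistivity response; the damped scalar update m ← m + JᵀR / (JᵀJ + λ) is the one-parameter form of the usual (JᵀJ + λI)⁻¹Jᵀr step:

```python
import math

def damped_least_squares(xs, ds, forward, jac, m0, damping=0.1, iters=20):
    """Damped (linearized) least-squares inversion for one model
    parameter m: iterate m <- m + (J^T r) / (J^T J + damping), where
    r is the data residual and J the sensitivity (Jacobian)."""
    m = m0
    for _ in range(iters):
        r = [d - forward(m, x) for d, x in zip(ds, xs)]
        j = [jac(m, x) for x in xs]
        m += sum(ji * ri for ji, ri in zip(j, r)) / \
             (sum(ji * ji for ji in j) + damping)
    return m

# Toy stand-in for a resistivity response: d = exp(-m * x), true m = 1.5.
f = lambda m, x: math.exp(-m * x)
df = lambda m, x: -x * math.exp(-m * x)      # sensitivity df/dm
xs = [0.2, 0.5, 1.0, 1.5]
data = [f(1.5, x) for x in xs]               # noise-free synthetic data
m_est = damped_least_squares(xs, data, f, df, m0=0.5)
```

The damping term is what makes the local optimization stable when JᵀJ is small; in real 2-D codes it becomes the regularization matrix that trades data fit against model smoothness.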
Procedia PDF Downloads 365

24656 Exploring the Correlation between Population Distribution and Urban Heat Island under Urban Data: Taking Shenzhen Urban Heat Island as an Example
Authors: Wang Yang
Abstract:
Shenzhen is a modern city shaped by China's reform and opening-up policy, and the development of its urban morphology has been guided by the administration of the Chinese government. The city's planning paradigm is primarily affected by spatial structure and human behavior. The urban agglomeration is divided into several groups and centers, and under this paradigm the city's own laws of development tend to be neglected. With the continuous development of the internet, big data technology has been introduced in China, and data mining and data analysis have become important tools in municipal research. Data mining has been utilized to improve the cleaning of data such as business data, traffic data, and population data. Prior to data mining, government data were collected by traditional means and then analyzed using city-relationship research, delaying the timeliness of urban analysis, especially for the contemporary city, where data update speeds are very fast and internet-based. The city's points of interest (POI) serve as a data source affecting city design, while satellite remote sensing is used as a reference object; city analysis is conducted in both directions, the administrative paradigm of government is broken, and urban research is restored. Therefore, the use of data mining in urban analysis is very important. The satellite remote sensing data of Shenzhen in July 2018 were measured by the MODIS sensor and can be utilized to perform land surface temperature inversion and to analyze the city's heat island distribution. This article acquired and classified data from Shenzhen by using web crawler technology. Data on the Shenzhen heat island and points of interest were simulated and analyzed in a GIS platform to discover the main features of the distribution of functionally equivalent areas. Shenzhen extends in an east-west direction.
The city's main streets follow the direction of city development, so the functional areas of the city are likewise distributed along the east-west axis. The urban heat island can be expressed as a heat map over these functional areas, and regional POIs correspond to it. The results clearly show that the distribution of the urban heat island and the distribution of urban POIs are in one-to-one correspondence. The urban heat island is primarily influenced by the properties of the underlying surface rather than by the urban climate. Taking urban POIs as the object of analysis, the distribution of POIs is closely connected with population aggregation, so the distribution of the population corresponds with the distribution of the urban heat island.
Keywords: POI, satellite remote sensing, population distribution, urban heat island thermal map
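The overlay analysis described above ultimately reduces to asking how strongly POI density and land surface temperature co-vary across the city grid. A minimal sketch of that correlation step, with invented grid-cell values; the function and numbers are illustrative, not taken from the paper's GIS workflow:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative grid cells: POI count and land surface temperature (deg C) per cell.
poi_counts = [12, 85, 40, 3, 60, 22]
lst_values = [29.1, 33.8, 31.5, 28.4, 32.6, 30.0]

r = pearson(poi_counts, lst_values)
print(f"POI density vs LST correlation: r = {r:.2f}")
```

A strong positive r over many cells is what a one-to-one correspondence between POI clusters and heat-island hot spots would look like numerically.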
Procedia PDF Downloads 104
24655 Towards a Model of Support in the Areas of Services of Educational Assistance and Tutoring in Middle Education in Mexico
Authors: Margarita Zavala, Julio Rolón, Gabriel Chavira, José González, Jorge Orozco, Roberto Pichardo
Abstract:
Adolescence is a critical stage in the formation of every human being, and it is generally at this stage that middle school is attended. In 2006, Mexico incorporated a "mentoring" space in middle schools to assist students in their integration and participation in school life. In public middle schools, it is sometimes difficult to stay aware of situations that affect students, given their number and the traditional management of records; schools thereby lose the opportunity to provide timely, preventive support. Providing this support requires knowing the students by detecting the relevant information that has the greatest impact on their learning process. This research examines whether it is possible to identify relevant student information that signals when a student is at risk, and then proposes a model to manage such information properly.
Keywords: adolescence, mentoring, middle school students, mentoring system support
Procedia PDF Downloads 420
24654 Project Objective Structure Model: An Integrated, Systematic and Balanced Approach in Order to Achieve Project Objectives
Authors: Mohammad Reza Oftadeh
Abstract:
The purpose of this article is to describe the project objective structure (POS) concept, which was developed from research activities and experience in project management, the Balanced Scorecard (BSC), and the European Foundation for Quality Management Excellence Model (EFQM Excellence Model). The paper defines a balanced, systematic, and integrated measurement approach for meeting project objectives and project strategic goals based on a process-oriented model. POS is proposed as a means of measuring project performance across the project life cycle. By using the POS model, the project manager can ensure that the objectives stated in the project charter are achieved. The concept helps project managers implement integrated and balanced monitoring and control of project work.
Keywords: project objectives, project performance management, PMBOK, key performance indicators, integration management
Procedia PDF Downloads 378
24653 A Proposal of Ontology about Brazilian Government Transparency Portal
Authors: Estela Mayra de Moura Vianna, Thiago José Tavares Ávila, Bruno Morais Silva, Diego Henrique Bezerra, Paulo Henrique Gomes Silva, Alan Pedro da Silva
Abstract:
The Brazilian Federal Constitution defines access to information as a crucial right of the citizen, and the Law on Access to Public Information regulates this right. Accordingly, the Fiscal Responsibility Act of 2000, amended in 2009 by the "Law of Transparency", began demanding wider disclosure of public accounts to society, including through electronic media for public access. Public entities have therefore created "Transparency Portals", which aim to gather a diversity of data and information. In general, however, this information is still published in formats that do not make the data easy for citizens to understand and that could be made more readily available, especially for audit purposes. In this context, a proposed ontology for the Brazilian Transparency Portal can play a key role in making these data more accessible. This study aims to identify and implement, as an ontology, the data model of the Transparency Portal ecosystem, with emphasis on activities that use these data in applications such as audits, press work, and social control of government.
Keywords: audit, government transparency, ontology, public sector
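To make the data-model idea concrete, here is a toy sketch of transparency-portal entities expressed as subject-predicate-object triples, the basic unit of an RDF/OWL ontology. All class, property, and instance names here are invented for illustration; real work would use dedicated ontology tooling rather than plain Python:

```python
# A toy triple store. Names such as Expense, paidBy, and MinistryOfHealth
# are illustrative and not taken from the proposed ontology.
triples = {
    ("Expense", "rdf:type", "owl:Class"),
    ("Agency", "rdf:type", "owl:Class"),
    ("paidBy", "rdfs:domain", "Expense"),
    ("paidBy", "rdfs:range", "Agency"),
    ("expense42", "rdf:type", "Expense"),
    ("expense42", "paidBy", "MinistryOfHealth"),
    ("expense42", "amountBRL", "15000.00"),
}

def query(s=None, p=None, o=None):
    """Return all triples matching the given pattern (None = wildcard)."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# An auditor's question: which agency paid expense42?
print(query(s="expense42", p="paidBy"))
```

Modelling portal records as typed triples is what lets audit, press, and social-control applications ask structured questions instead of scraping tables.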
Procedia PDF Downloads 506
24652 Nigeria's Terrorist Rehabilitation and Reintegration Policy: A Victimological Perspective
Authors: Ujene Ikem Godspower
Abstract:
Acts of terror perpetrated by state or non-state actors are a social ill that impinges on the collective well-being of society. There is therefore a need for social reparations, meant to ensure the healing of the social wounds left by the atrocities committed by errant individuals under different guises. To bring about social closure and effectively repair the damage done by anomic behavior, society must ensure that justice is served and that those whose rights and privileges have been denied and battered receive the succour they deserve. With regard to the ongoing terrorism in the Northeast, moves to rehabilitate and reintegrate Boko Haram members have begun with the establishment of Operation Safe Corridor and a proposed bill to establish a "National Agency for the Education, Rehabilitation, De-radicalisation and Integration of Repentant Insurgents in Nigeria", about all of which Nigerians have expressed mixed feelings. Some argue that the endeavour lacks ethical decency and justice and insults human reasoning. Terrorism and counterterrorism in Nigeria have been enmeshed in gross human rights violations by both the military and the terrorists, raising concern over Nigeria's ability to implement deradicalisation and reintegration fairly and justly. On the other hand, there is the challenge of whether community dwellers who are victims of terrorism and counterterrorism can forgive and welcome back their recent tormentors, given even the slightest sense of injustice in the process of reintegrating and rehabilitating terrorists. Although such efforts have been implemented in other climes, the Nigerian case poses a unique challenge and commands the keen interest of stakeholders and the international community for the reasons above.
It is therefore pertinent to assess the communities' level of involvement in the cycle of reintegration; hence the objective of this paper. Methodologically, as part of my larger PhD thesis, this study will explore three local governments (Michika in Adamawa, Chibok in Borno, and Yunusari in Yobe), selected on the basis of the intensity of terrorist attacks. Twenty-five in-depth interviews will be conducted in these study locations, featuring religious leaders, community (traditional) leaders, internally displaced persons, CSO management officials, and reintegrated ex-Boko Haram insurgents. The data generated from fieldwork will be analysed using the NVivo 12 software package, which will help to code the data and create themes based on the study objectives. The data will further be content-analysed, employing verbatim quotations where necessary. Ethically, the study will observe the basic ethical principles for research of this nature, adhering strictly to the principles of voluntary participation, anonymity, and confidentiality.
Keywords: Boko Haram, reintegration, rehabilitation, terrorism, victimology
Procedia PDF Downloads 246
24651 Collective Efficacy and Rural Migration in Urban China—Social Determinants on Urbanization, Social Integration and Civic Engagement
Authors: Ziwei Qi
Abstract:
This paper focuses on urbanization, rural migration, and neighborhood collective efficacy in urban China. It discusses urbanization and migration trends and policies in China, the various mechanisms through which social structures affect economic action, and the consequent social disequilibrium due to urbanization. Both positive and negative propositions on urbanization are highlighted. The primary methodology is theoretical application with empirical implications for urbanization in developing countries. Western sociological theories, including theories from urban criminology and sociology such as social disorganization, social capital, and collective efficacy, are applied and analyzed to examine the market society in the Chinese economic and cultural setting.
Keywords: collective efficacy, civic engagement, rural migration, urbanization
Procedia PDF Downloads 331
24650 Design and Development of Data Mining Application for Medical Centers in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
Data mining is the extraction of information from a large database, which helps in predicting trends or behavior and thereby helps management make knowledge-driven decisions. One principal problem of most hospitals in rural areas is their reliance on paper file management for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, while the nurse or attendant searches through voluminous files before the patient's file can be retrieved; this delay may put the patient at risk. The application is to be designed using a structured systems analysis and design method, which supports a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system, making it easy to retrieve a patient's record, increasing data security, providing access to clinical records for decision-making, and reducing the time it takes for a patient to be attended to.
Keywords: data mining, medical record system, systems programming, computing
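The time savings described above come down to replacing a linear search through paper files with an indexed lookup. A minimal sketch of that idea, with hypothetical field names and record IDs:

```python
# Minimal sketch of indexed record retrieval replacing a linear file search.
# Patient IDs, names, and fields are hypothetical.
records = [
    {"patient_id": "P001", "name": "A. Bello", "last_visit": "2023-04-02"},
    {"patient_id": "P002", "name": "C. Okoro", "last_visit": "2023-05-17"},
    {"patient_id": "P003", "name": "F. Musa", "last_visit": "2023-06-01"},
]

# Build the index once: O(n). Each lookup afterwards is O(1) on average,
# instead of scanning every file on every patient visit.
index = {rec["patient_id"]: rec for rec in records}

def retrieve(patient_id):
    """Return the patient's record, or None if it does not exist."""
    return index.get(patient_id)

print(retrieve("P002")["name"])
```

In a real deployment the index would live in a database rather than memory, but the retrieval principle is the same.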
Procedia PDF Downloads 209
24649 A Comprehensive Framework to Ensure Data Security in Cloud Computing: Analysis, Solutions, and Approaches
Authors: Loh Fu Quan, Fong Zi Heng, Burra Venkata Durga Kumar
Abstract:
Cloud computing has completely transformed the way many businesses operate. Traditionally, a business's confidential data is stored on computers located within its premises, so a lot of business capital goes towards maintaining computing resources and hiring IT teams to manage them. The advent of cloud computing changes everything: instead of purchasing and managing their own infrastructure, many businesses have shifted towards working in the cloud with the help of a cloud service provider (CSP), leading to cost savings. However, it also introduces security risks. This research paper focuses on the security risks that arise during data migration and user authentication in cloud computing. To address these risks, the paper provides a comprehensive framework that includes Transport Layer Security (TLS), user authentication, security tokens, and multi-level data encryption. The framework aims to prevent unauthorized access to cloud resources and data leakage, ensuring the confidentiality of sensitive information. Cloud service providers can use it to strengthen the security of their clouds and instil confidence in their users.
Keywords: cloud computing, cloud security, cloud security issues, cloud security framework
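Of the framework components listed above, the security-token idea can be sketched compactly: an HMAC-signed, expiring token in the style of a simplified JWT. This is an illustration under assumed key handling and claim names, not the paper's actual framework:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret-key"  # illustrative; a real CSP would use managed keys

def issue_token(user, ttl=3600):
    """Issue a signed token carrying the user and an expiry timestamp."""
    payload = json.dumps({"user": user, "exp": int(time.time()) + ttl}).encode()
    body = base64.urlsafe_b64encode(payload)
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token):
    """Return the payload if the signature is valid and unexpired, else None."""
    body, _, sig = token.rpartition(".")
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: token was tampered with
    payload = json.loads(base64.urlsafe_b64decode(body))
    return payload if payload["exp"] > time.time() else None

tok = issue_token("alice")
print(verify_token(tok)["user"])  # a valid token round-trips
tampered = tok[:-1] + ("0" if tok[-1] != "0" else "1")
print(verify_token(tampered))  # tampered signature -> None
```

The constant-time comparison (`hmac.compare_digest`) matters here: a naive `==` would leak timing information an attacker could exploit.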
Procedia PDF Downloads 120
24648 Using AI for Analysing Political Leaders
Authors: Shuai Zhao, Shalendra D. Sharma, Jin Xu
Abstract:
This research uses advanced machine learning models to examine a number of hypotheses regarding political executives. Specifically, it analyses the impact these powerful leaders have on economic growth, using leaders' data from the Archigos database spanning 1835 to the end of 2015. The data is processed with AutoGluon, an automated machine learning (AutoML) framework developed by Amazon, which can automatically extract features from the data and train multiple classifiers on it. The study uses a linear regression model and a classification model to establish the relationship between leaders and economic growth (GDP-per-capita growth) and to clarify, from a machine learning perspective, the relationship between leaders' characteristics and economic growth. Our work may serve as a model, or a signal, for collaboration between statistics and artificial intelligence (AI) that can light the way for political researchers and economists.
Keywords: comparative politics, political executives, leaders' characteristics, artificial intelligence
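The regression step can be illustrated without AutoGluon itself. Below is a minimal ordinary-least-squares sketch relating a single, invented leader characteristic to GDP-per-capita growth; the feature and the numbers are illustrative only, whereas the study trains multiple models on Archigos features:

```python
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x (single feature)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Illustrative data: leader's years in office vs. GDP-per-capita growth (%).
years_in_office = [2, 4, 6, 8, 10]
gdp_growth = [1.1, 1.9, 3.2, 3.8, 5.0]

a, b = ols_fit(years_in_office, gdp_growth)
print(f"growth = {a:.2f} + {b:.2f} * years_in_office")
```

An AutoML framework automates exactly this kind of fit (plus feature extraction and model selection) across many candidate models.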
Procedia PDF Downloads 86
24647 A Study on the Wind Energy Produced in the Building Skin Using Piezoelectricity
Authors: Sara Mota Carmo
Abstract:
Nowadays, there is increasing demand for buildings to be energetically autonomous through energy-generation systems based on renewable sources, in line with the concept of a net zero energy building (NZEB). In this context, the present study examines the integration of wind energy through piezoelectricity applied to the building skin. As the methodology, a reduced-scale prototype of a building was developed and tested in a wind tunnel, with the four façades monitored by recording the energy produced by each. The applied wind intensities varied between 2 m/s and 8 m/s, and the four façades were compared with each other regarding the energy produced according to wind intensity and orientation to the wind. The results showed that the system alone could not generate enough energy to cover the needs of family residential buildings. However, piezoelectricity is an expanding field and can be a promising path for a wind energy system in architecture as a complement to other renewable energy sources.
Keywords: adaptive building skin, kinetic façade, wind energy in architecture, NZEB
Procedia PDF Downloads 76
24646 Data Quality on Regular Immunization Programme at Birkod District: Somali Region, Ethiopia
Authors: Eyob Seife, Tesfalem Teshome, Bereket Seyoum, Behailu Getachew, Yohans Demis
Abstract:
Developing countries continue to face preventable communicable diseases, including vaccine-preventable diseases; the Expanded Programme on Immunization (EPI) was established by the World Health Organization in 1974 to control them. The use of health data is crucial in decision-making, but ensuring data quality remains challenging, with technical, contextual, behavioral, and organizational factors all contributing to poor data quality. This study aimed to assess the accuracy ratio, timeliness, and quality index of regular immunization programme data in the Birkod district of the Somali Region, Ethiopia. The study used a quantitative cross-sectional design conducted in September 2022 (Gregorian calendar) using WHO-recommended data quality self-assessment tools. The accuracy ratio and timeliness of reports on regular immunization programmes were assessed for two health centers and three health posts in the district over one fiscal year, and the quality index was assessed at the district level and at the health facilities by trained assessors. The study found poor accuracy ratios and timeliness of reports at all health units, including zero values. Over-reporting was observed at most facilities, particularly at the health-post level, with health centers showing relatively better accuracy ratios than health posts. The quality index assessment revealed poor quality at all levels. The study recommends that responsible bodies at different levels improve data quality through various approaches, such as building the capacity of health professionals and strengthening the quality index components. It highlights the need for attention to data quality in general, at the health-post level in particular, and for improving the quality index at all levels.
Keywords: Birkod District, data quality, quality index, regular immunization programme, Somali Region-Ethiopia
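In WHO-style data quality self-assessments, the accuracy ratio discussed above is a verification factor: the value recounted from source documents divided by the value in the upward report. A minimal sketch with invented facility numbers:

```python
def accuracy_ratio(recounted, reported):
    """Verification factor: recounted source value / reported value.
    1.0 = consistent; > 1.0 = under-reporting; < 1.0 = over-reporting."""
    if reported == 0:
        return None  # a zero report cannot be verified by division
    return recounted / reported

# Illustrative monthly doses: (recounted from tally sheets, reported upward).
facilities = {
    "Health centre A": (118, 120),
    "Health centre B": (95, 90),
    "Health post C": (30, 52),  # strong over-reporting
}

for name, (recount, report) in facilities.items():
    print(f"{name}: accuracy ratio = {accuracy_ratio(recount, report):.2f}")
```

A ratio well below 1.0, as sketched for the health post, is the pattern of over-reporting the study observed.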
Procedia PDF Downloads 90
24645 The Results of Longitudinal Water Quality Monitoring of the Brandywine River, Chester County, Pennsylvania by High School Students
Authors: Dina L. DiSantis
Abstract:
Strengthening a sense of responsibility while relating global sustainability concepts such as water quality and pollution to a local water system can be achieved by teaching students to conduct and interpret water quality monitoring tests. When students conduct their own research, they become better stewards of the environment, and providing outdoor, place-based learning opportunities helps connect them to the natural world. By conducting stream studies and collecting data, students come to understand the natural environment as a place where everything is connected. Students have been collecting physical, chemical, and biological data along the West and East Branches of the Brandywine River in Pennsylvania for over ten years. The stream studies are part of the Advanced Placement environmental science and aquatic science courses offered as electives to juniors and seniors at the Downingtown High School West Campus in Downingtown, Pennsylvania. Physical data collected include temperature, turbidity, width, depth, velocity, and volume of flow (discharge). The chemical tests conducted are dissolved oxygen, carbon dioxide, pH, nitrates, alkalinity, and phosphates. Macroinvertebrates are collected with a kick net, identified, and then released. Students collect the data from several locations while traveling by canoe. In the classroom, students prepare a water quality data analysis and interpretation report based on their collected data. A summary of the results of the longitudinal water quality data collection by students, as well as the strengths and weaknesses of student data collection, will be presented.
Keywords: place-based, student data collection, sustainability, water quality monitoring
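The physical measurements listed above combine into discharge in the standard way: cross-sectional area times velocity. A minimal sketch using a rectangular-channel approximation, with invented field values rather than actual Brandywine data:

```python
def discharge(width_m, mean_depth_m, velocity_m_s):
    """Approximate stream discharge Q = A * v, treating the cross-section
    as a rectangle (width * mean depth). Returns cubic metres per second."""
    area_m2 = width_m * mean_depth_m
    return area_m2 * velocity_m_s

# Illustrative field measurements from one sampling site.
q = discharge(width_m=14.0, mean_depth_m=0.6, velocity_m_s=0.5)
print(f"Discharge: {q:.2f} m^3/s")
```

Real channels are not rectangular, so field protocols usually average several depth measurements across the transect before multiplying by velocity.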
Procedia PDF Downloads 156
24644 Content and Language Integrated Learning: English and Art History
Authors: Craig Mertens
Abstract:
Teaching art history, or any other academic subject, to EFL students can be done successfully. A course called Western Images was created to teach Japanese students art history using only English in the classroom, with an approach known as Content and Language Integrated Learning (CLIL) as its basis. The purpose of this paper is to state the reasons why learning about art history is important, go through the process of creating content for the course, and suggest multiple tasks that help students practice the critical thinking skills used in analyzing and drawing conclusions about works of art from Western culture. Brown's (1995) six elements of a language curriculum serve as a guide for this paper: needs analysis, goals and objectives, assessment, materials, teaching methods and tasks, and evaluation of the course. The goal here is to inspire debate and discussion regarding CLIL and its pros and cons, and to question the current curricula of university language courses.
Keywords: art history, EFL, content and language integrated learning, critical thinking
Procedia PDF Downloads 597