Search results for: link data
25096 Enabling Quantitative Urban Sustainability Assessment with Big Data
Authors: Changfeng Fu
Abstract:
Sustainable urban development is widely accepted as common sense in modern urban planning and design. However, the measurement and assessment of urban sustainability, especially quantitative assessment, have long been an issue troubling planning and design professionals. This paper presents ongoing research on the principles and technologies for developing quantitative urban sustainability assessment principles and techniques, which aim to integrate indicators, geospatial and geo-referenced data, and assessment techniques into a single mechanism. It is based on the principles and techniques of geospatial analysis with GIS and on statistical analysis methods. Decision-making technologies and methods such as AHP and SMART are also adopted to derive overall assessment conclusions. The possible interfaces and presentation of data and quantitative assessment results are also described. This research is based on the knowledge, situations and data sources of the UK, but it is potentially adaptable to other countries or regions. The implementation potential of the mechanism is also discussed.
Keywords: urban sustainability assessment, quantitative analysis, sustainability indicator, geospatial data, big data
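Where the abstract mentions AHP for deriving overall assessment conclusions, the core step is turning pairwise indicator comparisons into weights. Below is a minimal Python sketch of that step, assuming a standard eigenvector-based AHP; the 3x3 judgement matrix (e.g. environmental vs social vs economic indicators) is purely illustrative and not taken from the paper.

```python
# Hedged sketch of an AHP priority-vector calculation (not the paper's implementation).
import numpy as np

# Pairwise judgements: A[i, j] = how much more important indicator i is than j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()                      # normalized priority vector

# Consistency check (random index RI = 0.58 for a 3x3 matrix).
lam_max = eigvals.real[principal]
ci = (lam_max - 3) / (3 - 1)
print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / 0.58, 3))
```

The resulting weights could then be combined with indicator scores computed from the geospatial data to produce an overall sustainability score.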
Procedia PDF Downloads 359

25095 Users’ Information Disclosure Determinants in Social Networking Sites: A Systematic Literature Review
Authors: Wajdan Al Malwi, Karen Renaud, Lewis Mackenzie
Abstract:
The privacy paradox describes a phenomenon whereby there is no connection between stated privacy concerns and privacy behaviours. We need to understand the underlying reasons for this paradox if we are to help users preserve their privacy more effectively. In particular, the Social Networking Site (SNS) domain offers a rich area of investigation due to the risks of unwise information disclosure decisions. Our study thus aims to untangle the complicated nature and underlying mechanisms of online privacy-related decisions in SNSs. In this paper, we report on the findings of a Systematic Literature Review (SLR) that revealed a number of factors that are likely to influence online privacy decisions. Our deductive analysis approach was informed by Communication Privacy Management (CPM) theory. We uncovered a lack of clarity around privacy attitudes and their link to behaviours, which makes it challenging to design privacy-protecting SNS platforms and to craft legislation to ensure that users’ privacy is preserved.
Keywords: privacy paradox, self-disclosure, privacy attitude, privacy behavior, social networking sites
Procedia PDF Downloads 158

25094 X-Ray Crystallographic Studies on BPSL2418 from Burkholderia pseudomallei
Authors: Mona Alharbi
Abstract:
Melioidosis has emerged as a lethal disease. Unfortunately, the molecular mechanisms of virulence and pathogenicity of Burkholderia pseudomallei remain unknown. However, proteomics research has selected putative targets in B. pseudomallei that might play roles in its virulence. The putative protein BPSL2418 has been predicted to be a free methionine sulfoxide reductase, and interestingly, there is a link between the level of methionine sulfoxide in pathogen tissues and virulence. Therefore, in this work, we describe the cloning, expression, purification, and crystallization of BPSL2418 and the solution of its 3D structure using X-ray crystallography. We also aimed to identify the substrate-bound and reduced forms of the enzyme to understand the role of BPSL2418. The gene encoding BPSL2418 from B. pseudomallei was amplified by PCR, recloned into the pETBlue-1 vector and transformed into E. coli Tuner DE3 pLacI. BPSL2418 was overexpressed in E. coli Tuner DE3 pLacI and induced with 300 μM IPTG for 4 h at 37 °C. BPSL2418 was then purified to better than 95% purity. The pure BPSL2418 was crystallized with PEG 4000 and PEG 6000 as precipitants under several conditions. Diffraction data were collected to 1.2 Å resolution. The crystals belonged to space group P2 21 21 with unit-cell parameters a = 42.24 Å, b = 53.48 Å, c = 60.54 Å, α = β = γ = 90°. The MES-bound BPSL2418 structure was solved by molecular replacement with the known structure 3KSF using the PHASER program. The structure is composed of six antiparallel β-strands, four α-helices and two loops. BPSL2418 shows high homology with the GAF-domain fRMsr enzymes, which suggests that BPSL2418 might act as a methionine sulfoxide reductase. The amino acid alignment of the fRMsrs, including BPSL2418, shows that the three cysteines thought to catalyze the reduction are fully conserved. BPSL2418 contains the three conserved cysteines (Cys⁷⁵, Cys⁸⁵ and Cys¹⁰⁹). The active site comprises the six antiparallel β-strands and the two loops where the disulfide bond forms between Cys⁷⁵ and Cys¹⁰⁹. X-ray structures of the free-methionine-sulfoxide-bound and native forms of BPSL2418 were solved to increase the understanding of the BPSL2418 catalytic mechanism.
Keywords: X-ray crystallography, BPSL2418, Burkholderia pseudomallei, melioidosis
Procedia PDF Downloads 248

25093 Implementation in Python of a Method to Transform One-Dimensional Signals in Graphs
Authors: Luis Andrey Fajardo Fajardo
Abstract:
We are immersed in complex systems. The human brain, galaxies and snowflakes are examples of complex systems. An area of interest within complex systems is chaos theory. This revolutionary field of science offers ways of study different from determinism and reductionism. Here, in conjunction with nonlinear DSP, chaos theory offers valuable techniques that establish a link between time series and complexity theory in terms of complex networks, so that signals can be studied from the perspective of graph theory. Recently, a method to transform time series into graphs has been proposed, but no suitable Python implementation had been developed for signals extracted from chaotic or complex systems. That is why we propose a Python implementation of an existing method to transform one-dimensional chaotic signals from the time domain to the graph domain, together with measures that may reveal information not extracted in the time domain.
Keywords: Python, complex systems, graph theory, dynamical systems
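The abstract does not name the specific transformation. One widely used method of this kind is the natural visibility graph (Lacasa et al.); the sketch below assumes that is the intended mapping and applies it to a chaotic logistic-map series, so it is an illustration rather than the authors' implementation.

```python
# Minimal sketch of a natural visibility graph: samples (i, y_i) and (j, y_j)
# are connected if every intermediate sample lies strictly below the straight
# line joining them.
import networkx as nx
import numpy as np

def natural_visibility_graph(series):
    g = nx.Graph()
    n = len(series)
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[i] + (series[j] - series[i]) * (k - i) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                g.add_edge(i, j)
    return g

# Example signal: logistic map in its chaotic regime (r = 4).
x = np.empty(200)
x[0] = 0.4
for t in range(1, len(x)):
    x[t] = 4.0 * x[t - 1] * (1.0 - x[t - 1])

g = natural_visibility_graph(x)
print(nx.density(g), nx.average_clustering(g))   # graph-domain measures
```

Graph-domain quantities such as density, clustering or degree distribution are then the "measures that may reveal information not extracted in the time domain".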
Procedia PDF Downloads 511

25092 Development of Generalized Correlation for Liquid Thermal Conductivity of N-Alkane and Olefin
Authors: A. Ishag Mohamed, A. A. Rabah
Abstract:
The objective of this research is to develop a generalized correlation for the prediction of the thermal conductivity of n-alkanes and alkenes. There is little research and a lack of correlations for the thermal conductivity of liquids in the open literature. The available experimental data were collected, covering the groups of n-alkanes and alkenes. The data were assumed to correlate with temperature using the Filippov correlation. Nonparametric regression with the Grace algorithm was used to develop the generalized correlation model. A spreadsheet program based on Microsoft Excel was used to plot the data and calculate the values of the coefficients. The results obtained were compared with the data found in Perry's Chemical Engineers' Handbook. The experimental data correlated with temperature over the range 273.15 to 673.15 K, with R² = 0.99. The developed correlation reproduced experimental data that were not included in the regression with an absolute average percent deviation (AAPD) of less than 7%. Thus the spreadsheet was quite accurate and produced reliable data.
Keywords: N-Alkanes, N-Alkenes, nonparametric, regression
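A small Python sketch of the evaluation metrics quoted above (R² and AAPD). The functional form of the Filippov correlation is not given in the abstract, so a generic quadratic fit in temperature stands in for it, and the data points are invented for illustration only.

```python
# Hedged illustration of computing R^2 and AAPD for a thermal-conductivity fit.
import numpy as np

T = np.array([273.15, 323.15, 373.15, 423.15, 473.15])   # K (illustrative)
k_exp = np.array([0.145, 0.135, 0.125, 0.116, 0.108])     # W/(m K) (illustrative)

coeffs = np.polyfit(T, k_exp, deg=2)     # stand-in model: k(T) = a*T^2 + b*T + c
k_pred = np.polyval(coeffs, T)

ss_res = np.sum((k_exp - k_pred) ** 2)
ss_tot = np.sum((k_exp - k_exp.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

aapd = 100.0 * np.mean(np.abs((k_pred - k_exp) / k_exp))  # absolute average percent deviation
print(f"R^2 = {r2:.3f}, AAPD = {aapd:.2f} %")
```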
Procedia PDF Downloads 654

25091 Implementation of MPPT Algorithm for Grid Connected PV Module with IC and P&O Method
Authors: Arvind Kumar, Manoj Kumar, Dattatraya H. Nagaraj, Amanpreet Singh, Jayanthi Prattapati
Abstract:
In recent years, the use of renewable energy resources instead of pollutant fossil fuels and other forms has increased. Photovoltaic generation is becoming increasingly important as a renewable resource since it involves no fuel costs, pollution, maintenance or noise emission compared with other alternatives used in power applications. In this paper, the Perturb and Observe and Incremental Conductance methods are used to improve energy conversion efficiency under different environmental conditions. PI controllers are used to easily control the DC-link voltage and the active and reactive currents. The whole system is simulated under standard climatic conditions (1000 W/m², 25 °C) in MATLAB, and the irradiance is varied from 1000 W/m² to 300 W/m². The use of PI controllers makes it easy to directly control the power of the grid-connected PV system. Finally, the validity of the system is verified through simulations in the MATLAB/Simulink environment.
Keywords: incremental conductance algorithm, modeling of PV panel, perturb and observe algorithm, photovoltaic system and simulation results
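For readers unfamiliar with the first of the two MPPT methods, here is a minimal sketch of the Perturb and Observe decision logic. The paper implements this in MATLAB/Simulink; the Python below is only an illustration, and the variable names and step size are assumptions rather than the paper's settings.

```python
# Hedged sketch of one Perturb and Observe (P&O) MPPT update step.
def perturb_and_observe(v, p, v_prev, p_prev, v_ref, step=0.01):
    """Return the updated PV voltage reference from present and previous
    measurements of panel voltage v and power p."""
    dv = v - v_prev
    dp = p - p_prev
    if dp == 0:
        return v_ref                          # at (or near) the maximum power point
    if dp > 0:
        v_ref += step if dv > 0 else -step    # keep perturbing in the same direction
    else:
        v_ref -= step if dv > 0 else -step    # reverse the perturbation
    return v_ref

# Usage inside a simulation loop (v_k, i_k come from the PV panel model):
# v_ref = perturb_and_observe(v_k, v_k * i_k, v_prev, p_prev, v_ref)
```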
Procedia PDF Downloads 509

25090 Survey on Arabic Sentiment Analysis in Twitter
Authors: Sarah O. Alhumoud, Mawaheb I. Altuwaijri, Tarfa M. Albuhairi, Wejdan M. Alohaideb
Abstract:
Large-scale data stream analysis has lately become one of the most important business and research priorities. Social networks like Twitter and other micro-blogging platforms hold an enormous amount of data that is large in volume, velocity and variety. Extracting valuable information and trends from these data would aid better understanding and decision-making. Multiple analysis techniques have been deployed for English content. However, one of the languages that produces a large amount of data over social networks, yet is least analyzed, is Arabic. This paper is a survey of the research efforts to analyze Arabic content on Twitter, focusing on the tools and methods used to extract sentiment from Arabic content on Twitter.
Keywords: big data, social networks, sentiment analysis, Twitter
Procedia PDF Downloads 580

25089 Estimating Current Suicide Rates Using Google Trends
Authors: Ladislav Kristoufek, Helen Susannah Moat, Tobias Preis
Abstract:
Data on the number of people who have committed suicide tends to be reported with a substantial time lag of around two years. We examine whether online activity measured by Google searches can help us improve estimates of the number of suicide occurrences in England before official figures are released. Specifically, we analyse how data on the number of Google searches for the terms “depression” and “suicide” relate to the number of suicides between 2004 and 2013. We find that estimates drawing on Google data are significantly better than estimates using previous suicide data alone. We show that a greater number of searches for the term “depression” is related to fewer suicides, whereas a greater number of searches for the term “suicide” is related to more suicides. Data on suicide-related search behaviour can be used to improve current estimates of the number of suicide occurrences.
Keywords: nowcasting, search data, Google Trends, official statistics
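The nowcasting idea described above amounts to regressing current counts on lagged official figures plus search volumes. The sketch below illustrates that comparison in Python; all data, column names and coefficients are synthetic placeholders, not the study's data or model.

```python
# Hedged sketch: compare a lag-only baseline with a model augmented by search volumes.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "suicides_lag1": rng.poisson(90, 120),       # previous period's official count
    "searches_depression": rng.random(120),       # normalized Google Trends volume
    "searches_suicide": rng.random(120),
})
# Target built to be consistent with the reported signs: more "depression"
# searches -> fewer suicides, more "suicide" searches -> more suicides.
y = (df["suicides_lag1"] - 10 * df["searches_depression"]
     + 12 * df["searches_suicide"] + rng.normal(0, 3, 120))

baseline = LinearRegression().fit(df[["suicides_lag1"]], y)
augmented = LinearRegression().fit(df, y)
print("R^2, lagged figures only :", round(baseline.score(df[["suicides_lag1"]], y), 3))
print("R^2, plus search volumes :", round(augmented.score(df, y), 3))
```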
Procedia PDF Downloads 360

25088 On the Network Packet Loss Tolerance of SVM Based Activity Recognition
Authors: Gamze Uslu, Sebnem Baydere, Alper K. Demir
Abstract:
In this study, the data loss tolerance of a Support Vector Machine (SVM) based activity recognition model, and its multi-activity classification performance when data are received over a lossy wireless sensor network, are examined. Initially, the classification algorithm we use is evaluated in terms of resilience to random data loss with 3D acceleration sensor data for sitting, lying, walking and standing actions. The results show that the proposed classification method can recognize these activities successfully despite high data loss. Secondly, the effect of differentiated quality of service performance on activity recognition success is measured with activity data acquired from a multi-hop wireless sensor network, which introduces high data loss. The effect of the number of nodes on reliability and multi-activity classification success is demonstrated in a simulation environment. To the best of our knowledge, the effect of data loss in a wireless sensor network on the activity detection success rate of an SVM based classification algorithm has not been studied before.
Keywords: activity recognition, support vector machines, acceleration sensor, wireless sensor networks, packet loss
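The evaluation pattern described above can be sketched as: train an SVM on accelerometer feature windows, then score it while random packet loss removes a growing fraction of the test stream. The Python below shows only that pattern; the data are synthetic placeholders, so the absolute accuracies are meaningless and not the paper's results.

```python
# Hedged sketch of testing an SVM classifier under simulated random packet loss.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))        # e.g. per-window statistics of 3-axis acceleration
y = rng.integers(0, 4, size=2000)     # sitting, lying, walking, standing (placeholder labels)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)

for loss_rate in (0.0, 0.2, 0.5, 0.8):
    keep = rng.random(len(X_te)) >= loss_rate          # packets that survived the network
    acc = accuracy_score(y_te[keep], clf.predict(X_te[keep]))
    print(f"packet loss {loss_rate:.0%}: accuracy {acc:.2f}")
```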
Procedia PDF Downloads 477

25087 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company
Authors: Rahma Saleh Hussein Al Balushi
Abstract:
Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indicators of achieving this vision is obtaining the Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOPs). Hence, a documented SOP for the Geographical Information System (GIS) data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements; risk evaluation; and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate and up-to-date data. Furthermore, owing to this process, GIS has been, and is, ready to be integrated with other systems, as well as to act as the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOCs) for excavation permits and to scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made based on GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data system updates, data release/reporting and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in Geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, it is concluded that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record and manage asset data and information within the OETC network. This control extends to other applications and systems integrated with, or related to, GIS systems.
Keywords: asset management ISO 55001, standard procedures process, governance, CMMS
Procedia PDF Downloads 125

25086 Modal Analysis of Small Frames using High Order Timoshenko Beams
Authors: Chadi Azoury, Assad Kallassy, Pierre Rahme
Abstract:
In this paper, we consider the modal analysis of small frames. First, we construct the 3D model using H8 elements and find the natural frequencies of the frame, focusing our attention on the modes in the XY plane. Second, we construct the 2D model (plane stress model) using Q4 elements. We conclude that the results of the two models are very close to each other. We then formulate the stiffness matrix and the mass matrix of the 3-noded Timoshenko beam, which is well suited for thick and short beams such as those in our case. Finally, we model the corners, where the horizontal and vertical bars meet, with a special matrix. The results of our new model (a 3-noded Timoshenko beam for the horizontal and vertical bars and a special element for the corners based on the Q4 elements) are very satisfactory when performing the modal analysis.
Keywords: corner element, high-order Timoshenko beam, Guyan reduction, modal analysis of frames, rigid link, shear locking, short beams
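Once the global stiffness matrix K and mass matrix M are assembled (from the Timoshenko beam elements and the corner elements), the natural frequencies follow from the generalized eigenvalue problem K x = ω² M x. The sketch below shows that final step in Python; the small K and M are placeholders, not the paper's assembled matrices.

```python
# Hedged sketch of extracting natural frequencies from assembled K and M matrices.
import numpy as np
from scipy.linalg import eigh

K = np.array([[ 4.0, -2.0,  0.0],
              [-2.0,  4.0, -2.0],
              [ 0.0, -2.0,  2.0]]) * 1e6   # N/m (illustrative stiffness)
M = np.diag([2.0, 2.0, 1.0])               # kg (illustrative lumped masses)

eigvals, eigvecs = eigh(K, M)               # solves K x = lambda M x, lambda = omega^2
frequencies_hz = np.sqrt(eigvals) / (2 * np.pi)
print("natural frequencies [Hz]:", np.round(frequencies_hz, 1))
```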
Procedia PDF Downloads 320

25085 Effects of Data Correlation in a Sparse-View Compressive Sensing Based Image Reconstruction
Authors: Sajid Abas, Jon Pyo Hong, Jung-Ryun Le, Seungryong Cho
Abstract:
Computed tomography and laminography are heavily investigated in a compressive sensing based image reconstruction framework to reduce the dose to patients as well as to radiosensitive devices such as multilayer microelectronic circuit boards. Nowadays, researchers are actively working on optimizing compressive sensing based iterative image reconstruction algorithms to obtain better-quality images. However, the effects of the sampled data's properties on the quality of the reconstructed image, particularly under insufficiently sampled data conditions, have not been explored in computed laminography. In this paper, we investigate the effects of two data properties, i.e., sampling density and data incoherence, on the reconstructed image obtained by conventional computed laminography and by a recently proposed method called the spherical sinusoidal scanning scheme. We have found that, in a compressive sensing based image reconstruction framework, the image quality mainly depends upon the data incoherence when the data are uniformly sampled.
Keywords: computed tomography, computed laminography, compressive sensing, low-dose
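How sampling density affects a compressive-sensing reconstruction can be illustrated with a toy sparse-recovery problem: recover a sparse 1D signal from random incoherent measurements and watch the error fall as the number of measurements grows. This is only a stand-in for the CT/laminography reconstructions discussed above, not the authors' algorithm.

```python
# Hedged toy example of sampling density vs recovery error with orthogonal matching pursuit.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(1)
n, sparsity = 256, 8
x_true = np.zeros(n)
x_true[rng.choice(n, sparsity, replace=False)] = rng.normal(size=sparsity)

for m in (32, 64, 128):                        # number of measurements (sampling density)
    A = rng.normal(size=(m, n)) / np.sqrt(m)    # random (incoherent) sensing matrix
    y = A @ x_true
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=sparsity, fit_intercept=False).fit(A, y)
    err = np.linalg.norm(omp.coef_ - x_true) / np.linalg.norm(x_true)
    print(f"{m:3d} measurements: relative error {err:.3f}")
```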
Procedia PDF Downloads 464

25084 Bank Failures: A Question of Leadership
Authors: Alison L. Miles
Abstract:
Almost all major financial institutions in the world suffered losses due to the financial crisis of 2007, but the extent varied widely. The causes of the crash of 2007 are well documented and predominantly focus on the role and complexity of the financial markets. The dominant theme of the literature suggests the causes of the crash were a combination of globalization, financial sector innovation, moribund regulation and short-termism. While these arguments are undoubtedly true, they do not tell the whole story. A key weakness in the current analysis is the lack of consideration of those leading the banks before and during times of crisis. The purpose of this study is to examine the possible link between the leadership styles and characteristics of the CEO, CFO and chairman and the financial institutions that failed or needed recapitalization. As such, it contributes to the literature and debate on international financial crises and systemic risk, and also to the debate on risk management and regulatory reform in the banking sector. In order to first test the proposition (P1) that there are prevalent leadership characteristics or traits in financial institutions, an initial study was conducted using a sample of the top 65 largest global banks and financial institutions according to the Banker Top 1000 Banks 2014. Secondary data from publicly available and official documents, annual reports, treasury and parliamentary reports, together with a selection of press articles and analyst meeting transcripts, were collected longitudinally for the period 1998 to 2013. A computer-aided keyword search was used to identify the leadership styles and characteristics of the chairman, CEO and CFO. The results were then compared with the leadership models to form a picture of leadership in the sector during the research period. As this produced separate results that needed combining, the SPSS data editor was used to aggregate the results across the studies using the variables ‘leadership style’ and ‘company financial performance’, together with the size of the company. In order to test the proposition (P2) that there was a prevalent leadership style in the banks that failed, and the proposition (P3) that this differed from the style in those that did not, further quantitative analysis was carried out on the leadership styles of the chair, CEO and CFO of banks that needed recapitalization, were taken over, or required government bail-out assistance during 2007-8. These included Lehman Bros, Merrill Lynch, Royal Bank of Scotland, HBOS, Barclays, Northern Rock, Fortis and Allied Irish. The findings show that although regulatory reform has been a key mechanism of control of behavior in the banking sector, consideration of the leadership characteristics of those running the board is a key factor. They add weight to the argument that if each crisis is met with the same pattern of popular fury at the financiers, increased regulation, followed by a return to business as usual, the cycle of failure will always be repeated; they also show that, viewed through a different lens, new paradigms can be formed and future clashes avoided.
Keywords: banking, financial crisis, leadership, risk
Procedia PDF Downloads 318

25083 Fuzzy Wavelet Model to Forecast the Exchange Rate of IDR/USD
Authors: Tri Wijayanti Septiarini, Agus Maman Abadi, Muhammad Rifki Taufik
Abstract:
The IDR/USD exchange rate can be used as an indicator for analyzing the Indonesian economy. The exchange rate is an important factor because it has a large effect on the Indonesian economy overall, so exchange rate data need to be analyzed. The IDR/USD exchange rate data are decomposed into frequency and time components, which can help the government monitor the Indonesian economy. This method is very effective at identifying the case, gives highly accurate results and has a simple structure. In this paper, the exchange rate data used are weekly data from December 17, 2010 to November 11, 2014.
Keywords: exchange rate, fuzzy Mamdani, discrete wavelet transforms, fuzzy wavelet
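The frequency-time decomposition mentioned above is typically done with a discrete wavelet transform. The sketch below shows that step in Python on a synthetic weekly series; the wavelet choice ('db4', 3 levels) and the series itself are assumptions for illustration, not the paper's settings or data.

```python
# Hedged sketch of decomposing a weekly exchange-rate series with a DWT.
import numpy as np
import pywt

rng = np.random.default_rng(2)
rate = 9000 + np.cumsum(rng.normal(0, 25, size=205))    # synthetic weekly IDR/USD

coeffs = pywt.wavedec(rate, wavelet="db4", level=3)      # [cA3, cD3, cD2, cD1]
approx, details = coeffs[0], coeffs[1:]
print("approximation length:", len(approx))
print("detail lengths      :", [len(d) for d in details])

# The approximation (trend) and detail (frequency) coefficients would then feed
# the fuzzy (Mamdani) model that produces the forecast.
reconstructed = pywt.waverec(coeffs, wavelet="db4")
print("max reconstruction error:", np.max(np.abs(reconstructed[:len(rate)] - rate)))
```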
Procedia PDF Downloads 574

25082 Humanising Digital Healthcare to Build Capacity by Harnessing the Power of Patient Data
Authors: Durhane Wong-Rieger, Kawaldip Sehmi, Nicola Bedlington, Nicole Boice, Tamás Bereczky
Abstract:
Patient-generated health data should be seen as the expression of the experience of patients, including outcomes reflecting the impact a treatment or service had on their physical health and wellness. We discuss how the healthcare system can reach a place where digital is a determinant of health, where data are generated by patients, respected, and acknowledged as their contribution to science. We also explore the biggest barriers facing this. The International Experience Exchange with Patient Organisation's Position Paper is based on a global patient survey conducted in Q3 2021 that received 304 responses. Results were discussed and validated by 15 patient experts and supplemented with literature research; the results reported here are a subset of this work. Our research showed that patient communities want to influence how their data are generated, shared, and used. Our study concludes that a reasonable framework is needed to protect the integrity of patient data, minimise abuse, and build trust. The results also demonstrated a need for patient communities to have more influence and control over how health data are generated, shared, and used, and they clearly highlight that the community feels there is a lack of clear policies on data sharing.
Keywords: digital health, equitable access, humanise healthcare, patient data
Procedia PDF Downloads 84

25081 Use of Machine Learning in Data Quality Assessment
Authors: Bruno Pinto Vieira, Marco Antonio Calijorne Soares, Armando Sérgio de Aguiar Filho
Abstract:
Nowadays, a massive amount of information is produced by different data sources, including mobile devices and transactional systems. In this scenario, concerns arise about how to establish and maintain data quality, which is now treated as a product to be defined, measured, analyzed, and improved to meet the needs of consumers, who use these data in decision-making and company strategies. Information of low quality can lead to issues that consume time and money, such as missed business opportunities, inadequate decisions, and bad risk management actions. The step of identifying, evaluating, and selecting data sources of adequate quality for a given need has become a costly task for users, since the sources do not provide information about their quality. Traditional data quality control methods are based on user experience or business rules, which limits performance and slows down the process with less than desirable accuracy. Using advanced machine learning algorithms, it is possible to take advantage of computational resources to overcome these challenges and add value to companies and users. In this study, machine learning is applied to data quality analysis on different datasets, seeking to compare the performance of the techniques according to the dimensions of quality assessment. As a result, we were able to create a ranking of the approaches used, as well as a system that is able to carry out data quality assessment automatically.
Keywords: machine learning, data quality, quality dimension, quality assessment
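Quality dimensions of the kind the abstract refers to (completeness, uniqueness, validity, and so on) can be scored per dataset and then used as features or labels for a learning model. The Python sketch below shows such per-dimension scoring on a toy table; the column names, thresholds and data are illustrative assumptions, not taken from the study.

```python
# Hedged sketch of scoring simple data-quality dimensions on a tabular dataset.
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4, None],
    "email": ["a@x.com", "b@x.com", "b@x.com", "not-an-email", None],
    "age": [34, 29, 29, 210, 41],
})

completeness = 1.0 - df.isna().mean().mean()                  # share of non-missing cells
uniqueness = df["customer_id"].dropna().is_unique             # key should not repeat
validity_email = df["email"].dropna().str.contains("@").mean()
validity_age = df["age"].between(0, 120).mean()               # plausible range check

print(dict(completeness=round(completeness, 2),
           uniqueness=uniqueness,
           validity_email=round(validity_email, 2),
           validity_age=round(validity_age, 2)))
# These per-dimension scores could then serve as inputs to a classifier that
# flags low-quality sources automatically.
```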
Procedia PDF Downloads 150

25080 Belonging without Believing: Life Narratives of Six Social Generations of Members of the Apostolic Society
Authors: Frederique A. Demeijer
Abstract:
This article addresses the religious beliefs of members of the Apostolic Society, a Dutch religious community wherein the oldest living members were raised with very different beliefs than those upheld today. Currently, the Apostolic Society is the largest liberal religious community in the Netherlands, consisting of roughly 15,000 members. It is characterized by its close-knit community life and the importance of its apostle: the spiritual leader who writes a weekly letter around which the Sunday morning service is centered. The society sees itself as ‘religious-humanistic’, inspired by its Judeo-Christian roots without being dogmatic. Only a century earlier, the beliefs of the religious community revolved more strongly around the Bible, with the apostle as a link to Christ. The community also believed in the return of the Lord, resonating with the millenarian roots of the community in 1830. Thus, the oldest living members have experienced fundamental changes in beliefs and rituals, yet remained members. This article reveals how members experience(d) their religious beliefs and feelings of belonging to the community, how these may or may not have changed over time, and what role the Apostolic Society played in their lives. The article presents a qualitative research approach based on two main pillars. First, life narrative interviews were conducted, to work inductively and allow different interview topics to emerge. Second, it uses generational theory in three ways: 1) to select respondents; 2) to guide the interview methodology, by being sensitive to differences in socio-historical context and events experienced during the formative years of interviewees of different social generations; and 3) to analyze and contextualize the qualitative interview data. The data were gathered from 27 respondents belonging to six social generations. All interviews were recorded, transcribed, coded, and analyzed using the Atlas.ti software program. First, the older generations talk about growing up with the Apostolic Society being absolutely central in their daily and spiritual lives. They spent most of their time with fellow members and dedicated their free time to Apostolic activities. The central beliefs of the Apostolic Society were clear and strongly upheld, and they experienced a strong sense of belonging. Although they now see the set of central beliefs as more individually interpretable and are relieved not to have to spend all that time on Apostolic activities anymore, they still regularly attend services and speak longingly of the past with its strong belief and belonging. Second, the younger generations speak of growing up with a non-dogmatic, religious-humanist set of beliefs, but still with a very strong sense of belonging to the religious community. They now attend services irregularly and talk about belonging, but not as strongly as the older generations do. Third, across the generations, members spend more time outside the Apostolic Society than within it. The way they speak about their religious beliefs is fluid and differs as much within generations as between them: for example, there is no central view on what God is. It seems the experience of members of the Apostolic Society across different generations can now be characterized as belonging without believing.
Keywords: generational theory, individual religious experiences, life narrative history interviews, qualitative research design
Procedia PDF Downloads 113

25079 Exploring Data Leakage in EEG Based Brain-Computer Interfaces: Overfitting Challenges
Authors: Khalida Douibi, Rodrigo Balp, Solène Le Bars
Abstract:
In the medical field, applications related to human experiments are frequently associated with small sample sizes, which makes the training of machine learning models quite sensitive and therefore not very robust or generalizable. This is notably the case in Brain-Computer Interface (BCI) studies, where the sample size rarely exceeds 20 subjects or a small number of trials. To address this problem, several resampling approaches are often used during the data preparation phase, which is a critical step in a data science analysis process. One naive approach usually applied by data scientists consists of transforming the entire database before the resampling phase. However, this can cause the model's performance to be incorrectly estimated when making predictions on unseen data. In this paper, we explore the effect of data leakage observed during our BCI experiments for device control through the real-time classification of SSVEPs (Steady-State Visually Evoked Potentials). We also study potential ways to ensure optimal validation of the classifiers during the calibration phase to avoid overfitting. The results show that the scaling step is crucial for some algorithms and should be applied after the resampling phase to avoid data leakage and improve results.
Keywords: data leakage, data science, machine learning, SSVEP, BCI, overfitting
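The leakage issue described above can be made concrete with a small sketch: fitting the scaler on the whole dataset before cross-validation leaks test statistics into training, whereas putting the scaler inside a pipeline refits it on the training folds only. The data below are synthetic placeholders, not SSVEP recordings, and the classifier choice is an assumption.

```python
# Hedged sketch of "scale before resampling" (leaky) vs "scale inside the pipeline" (clean).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 32))         # small sample, as in typical BCI studies
y = rng.integers(0, 2, size=60)

# Leaky: the scaler sees every sample (including future test folds) before CV.
X_leaky = StandardScaler().fit_transform(X)
leaky_scores = cross_val_score(SVC(), X_leaky, y, cv=5)

# Correct: scaling is refit inside each training fold.
clean_scores = cross_val_score(make_pipeline(StandardScaler(), SVC()), X, y, cv=5)

print("leaky CV accuracy :", leaky_scores.mean())
print("clean CV accuracy :", clean_scores.mean())
```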
Procedia PDF Downloads 153

25078 Damping Function and Dynamic Simulation of GUPFC Using IC-HS Algorithm
Authors: Galu Papy Yuma
Abstract:
This paper presents a new dynamic simulation of a power system consisting of four machines equipped with the Generalized Unified Power Flow Controller (GUPFC) to improve power system stability. The GUPFC consists of one shunt converter and two series converters based on voltage source converters, together with a DC-link capacitor, installed in the power system. MATLAB/Simulink is used to build the dynamic simulation of the GUPFC, where the power system is simulated in order to investigate the impact of the controller on power system oscillation damping and to show the reliability of the simulation program. The Improved Chaotic Harmony Search (IC-HS) algorithm is used to tune the controller parameters for the lead-lag compensation design. The results obtained by simulation show that the four-machine power system is suitable for stability analysis. The use of the GUPFC and the IC-HS algorithm provides excellent capability for fast damping of power system oscillations and greatly improves the dynamic stability of the power system.
Keywords: GUPFC, IC-HS algorithm, MATLAB/Simulink, damping oscillation
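The tuning step rests on harmony search. The sketch below shows only the basic harmony search loop, without the chaotic improvements of IC-HS, and with a stand-in quadratic objective; in the paper the objective would instead score a candidate set of lead-lag compensator parameters by the resulting oscillation damping in the Simulink model.

```python
# Hedged sketch of a plain harmony search optimizer (not the paper's IC-HS variant).
import numpy as np

def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    rng = np.random.default_rng(0)
    lo, hi = np.array(bounds, dtype=float).T
    hm = rng.uniform(lo, hi, size=(hms, len(bounds)))        # harmony memory
    fit = np.apply_along_axis(objective, 1, hm)
    for _ in range(iters):
        new = np.empty(len(bounds))
        for d in range(len(bounds)):
            if rng.random() < hmcr:                           # memory consideration
                new[d] = hm[rng.integers(hms), d]
                if rng.random() < par:                        # pitch adjustment
                    new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
            else:                                             # random selection
                new[d] = rng.uniform(lo[d], hi[d])
        new = np.clip(new, lo, hi)
        f = objective(new)
        worst = np.argmax(fit)
        if f < fit[worst]:                                    # replace the worst harmony
            hm[worst], fit[worst] = new, f
    best = np.argmin(fit)
    return hm[best], fit[best]

# Stand-in objective; a real run would evaluate the damping achieved by the compensator.
params, cost = harmony_search(lambda x: np.sum((x - 0.7) ** 2), bounds=[(0, 1)] * 3)
print(params, cost)
```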
Procedia PDF Downloads 449

25077 Nuclear Decay Data Evaluation for 217Po
Authors: S. S. Nafee, A. M. Al-Ramady, S. A. Shaheen
Abstract:
Evaluated nuclear decay data for the 217Po nuclide are presented in this work. These data include recommended values for the half-life T1/2 and for the α-, β⁻- and γ-ray emission energies and probabilities. Decay data from the 221Rn α decay and the 217Bi β⁻ decay are presented. Q(α) has been updated based on the recently published work of the Atomic Mass Evaluation, AME2012. In addition, the log ft values were calculated using the LOGFT program from the ENSDF evaluation package. Moreover, the total internal conversion coefficients have been calculated using the BrIcc program. Meanwhile, recommended values for the multipolarities have been assigned based on recent measurements, which yield a better intensity balance for the 254 keV and 264 keV gamma transitions.
Keywords: nuclear decay data evaluation, mass evaluation, total conversion coefficients, atomic mass evaluation
Procedia PDF Downloads 433

25076 Geographic Information System Using Google Fusion Table Technology for the Delivery of Disease Data Information
Authors: I. Nyoman Mahayasa Adiputra
Abstract:
Data in the field of health can be useful for data analysis; one example of health data is disease data. Disease data are usually plotted geographically according to the area where the data were collected, in this case the city of Denpasar, Bali. Disease data reports are still published in tabular form, and disease information has not yet been mapped in GIS form. In this research, disease information for Denpasar city is digitized in the form of a geographic information system, with the district as the smallest administrative unit. Denpasar City consists of four districts: North Denpasar, East Denpasar, West Denpasar and South Denpasar. In this research, we use Google Fusion Table technology for the map digitization process; this technology offers benefits both for the administrator and for the recipient of the information. On the administrator side, disease data input can be done easily and quickly. On the information-receiving side, the resulting GIS application can be published as a website-based application so that it can be accessed anywhere and at any time. In general, the results obtained in this study are divided into two parts: (1) geolocation of Denpasar and all of its districts, where the process of digitizing the map of Denpasar city produces a polygon geolocation for each district; these results can be utilized in subsequent GIS studies that use the same administrative areas. (2) Dengue fever mapping for 2014 and 2015, where the disease data used are dengue fever case data from 2014 and 2015, taken from the Denpasar Health Department profile reports of 2015 and 2016. This mapping can be useful for analyzing the spread of dengue hemorrhagic fever in the city of Denpasar.
Keywords: geographic information system, Google Fusion Table technology, delivery of disease data information, Denpasar city
Procedia PDF Downloads 132

25075 Inclusive Practices in Health Sciences: Equity Proofing Higher Education Programs
Authors: Mitzi S. Brammer
Abstract:
Given that the cultural make-up of programs of study in institutions of higher learning is becoming increasingly diverse, much has been written about cultural diversity from a university-level perspective. However, there are few data on specific programs and how they address inclusive practices when teaching and working with marginalized populations. This research study aimed to discover the baseline knowledge and attitudes of health sciences faculty, instructional staff, and students related to inclusive teaching/learning and interactions. Quantitative data were collected via an anonymous online survey (one designed for students and another designed for faculty/instructional staff) using a web-based program called Qualtrics. Quantitative data were analyzed for faculty/instructional staff and students, respectively, using descriptive and comparative statistics (t-tests). Additionally, some participants voluntarily engaged in a focus group discussion in which qualitative data were collected around these same variables. Collecting qualitative data to triangulate the quantitative data added trustworthiness to the overall findings. The research team analyzed the collected data, compared identified categories and trends between faculty/staff and students, and reported the results as well as implications for future study and professional practice.
Keywords: inclusion, higher education, pedagogy, equity, diversity
Procedia PDF Downloads 68

25074 Annotation Ontology for Semantic Web Development
Authors: Hadeel Al Obaidy, Amani Al Heela
Abstract:
The main purpose of this paper is to examine the concept of the semantic web and the role that ontology and semantic annotation play in the development of semantic web services. The paper focuses on semantic web infrastructure, illustrating how ontology and annotation work to provide the learning capabilities for building content semantically. To improve the productivity and quality of software, the paper applies approaches, notations and techniques offered by software engineering. It proposes a conceptual model for developing semantic web services for the infrastructure of a web information retrieval system for digital libraries. The developed system uses ontology and annotation to build a knowledge-based system that defines and links the meaning of web content in order to retrieve information for users' queries. The results are made more relevant through keyword and ontology rule expansion, which more accurately satisfies the requested information. The accuracy of the results is enhanced since the semantically analyzed query works with the conceptual architecture of the proposed system.
Keywords: semantic web services, software engineering, semantic library, knowledge representation, ontology
Procedia PDF Downloads 174

25073 An Analysis of Sequential Pattern Mining on Databases Using Approximate Sequential Patterns
Authors: J. Suneetha, Vijayalaxmi
Abstract:
Sequential pattern mining involves applying data mining methods to large data repositories to extract usage patterns. Sequential pattern mining methodologies are used to analyze the data and identify patterns. The patterns have been used to implement efficient systems that can make recommendations based on previously observed patterns, make predictions, improve the usability of systems, detect events, and in general help in making strategic product decisions. In this paper, we analyze the performance of approximate sequential pattern mining, defined as identifying patterns approximately shared by many sequences. Approximate sequential patterns can effectively summarize and represent databases by identifying the underlying trends in the data. We conduct an extensive and systematic performance study over synthetic and real data. The results demonstrate that ApproxMAP is effective and scalable in mining large sequence databases with long patterns.
Keywords: multiple data, performance analysis, sequential pattern, sequence database scalability
Procedia PDF Downloads 346

25072 Video-On-Demand QoE Evaluation across Different Age-Groups and Its Significance for Network Capacity
Authors: Mujtaba Roshan, John A. Schormans
Abstract:
Quality of Experience (QoE) drives churn in the broadband networks industry, and good QoE plays a large part in the retention of customers. QoE is known to be affected by the Quality of Service (QoS) factors packet loss probability (PLP), delay and delay jitter caused by the network. Earlier results have shown that the relationship between these QoS factors and QoE is non-linear and may vary from application to application. We use the network emulator Netem as the basis for experimentation and evaluate how QoE varies as we change the emulated QoS metrics. Focusing on Video-on-Demand, we discovered that the reported QoE may differ widely for users of different age groups, and that the most demanding age group (the youngest) can require an order of magnitude lower PLP to achieve the same QoE than is required by the most widely studied age group of users. We then used a bottleneck TCP model to evaluate the capacity cost of achieving an order of magnitude decrease in PLP, and found that (almost always) a 3-fold increase in link capacity was required.
Keywords: network capacity, packet loss probability, quality of experience, quality of service
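The roughly 3-fold figure is at least consistent with a classic back-of-envelope: the Mathis et al. relation gives per-flow TCP throughput proportional to MSS/(RTT·√p), so cutting the loss probability p by 10x raises the throughput that saturating flows will push by √10 ≈ 3.2x, and the bottleneck capacity must grow by about the same factor. The sketch below is only this rough calculation, not the bottleneck TCP model actually used in the paper, and the MSS/RTT values are assumptions.

```python
# Hedged back-of-envelope: PLP vs required link capacity via the Mathis TCP relation.
import math

mss_bits = 1460 * 8
rtt = 0.05                       # 50 ms round-trip time (assumed)

def tcp_throughput(plp):
    """Approximate steady-state TCP throughput in bit/s for loss probability plp."""
    return (mss_bits / rtt) * math.sqrt(1.5 / plp)

for plp in (1e-2, 1e-3):
    print(f"PLP={plp:.0e}: ~{tcp_throughput(plp) / 1e6:.1f} Mbit/s per flow")

print("capacity ratio for a 10x lower PLP:",
      round(tcp_throughput(1e-3) / tcp_throughput(1e-2), 2))   # ~3.16
```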
Procedia PDF Downloads 274

25071 The Impact of Artificial Intelligence on Construction Engineering
Authors: Mina Fawzy Ishak Gad Elsaid
Abstract:
There is a strong link between technology and development. Engineering as a profession is a call to service and society; perhaps, next to soldiers, engineers are patriots. However, unlike soldiers, they always remain employees of society under all circumstances. Despite the construction profession's role in society, there appears to be a lack of respect for it, as some projects fail. This paper focuses on the need to improve development engineering performance in developing countries, using engineering education in Nigerian universities as a tool for discussion. A purposeful survey, interviews and focus group discussions were conducted with one hundred and twenty (120) prominent companies in Nigeria. The subject is approached through a large number of projects in which the companies have been involved from the planning stage, some of which have been completed and have even reached the maintenance and monitoring stage. It was found that certain factors beyond the control of engineers are hindering the full development and success of the construction sector in developing countries. The main culprit is corruption, and its eradication will put the country on a stable path to developing construction and combating poverty.
Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management
Procedia PDF Downloads 46

25070 The Impact of Artificial Intelligence on Construction Engineering
Authors: Haneen Joseph Habib Yeldoka
Abstract:
There is a strong link between technology and development. Engineering as a profession is a call to service and society; perhaps, next to soldiers, engineers are patriots. However, unlike soldiers, they always remain employees of society under all circumstances. Despite the construction profession's role in society, there appears to be a lack of respect for it, as some projects fail. This paper focuses on the need to improve development engineering performance in developing countries, using engineering education in Nigerian universities as a tool for discussion. A purposeful survey, interviews and focus group discussions were conducted with one hundred and twenty (120) prominent companies in Nigeria. The subject is approached through a large number of projects in which the companies have been involved from the planning stage, some of which have been completed and have even reached the maintenance and monitoring stage. It was found that certain factors beyond the control of engineers are hindering the full development and success of the construction sector in developing countries. The main culprit is corruption, and its eradication will put the country on a stable path to developing construction and combating poverty.
Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management
Procedia PDF Downloads 42

25069 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System
Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal
Abstract:
Knowledge management consists of acquiring and representing knowledge relevant to a domain, a task or a specific organization in order to facilitate access, reuse and evolution. This usually means building, maintaining and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is to say, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on a technique for integrating heterogeneous data in the medical field by creating a data warehouse, a technique for extracting knowledge from medical data by choosing a data mining method, and finally a technique for exploiting that knowledge in a case-based reasoning system.
Keywords: data warehouse, data mining, knowledge discovery in databases (KDD), medical knowledge management, Bayesian networks
Procedia PDF Downloads 396

25068 Mean Shift-Based Preprocessing Methodology for Improved 3D Buildings Reconstruction
Authors: Nikolaos Vassilas, Theocharis Tsenoglou, Djamchid Ghazanfarpour
Abstract:
In this work, we explore the capability of the mean shift algorithm as a powerful preprocessing tool for improving the quality of spatial data acquired from airborne scanners over densely built urban areas. On the one hand, high-resolution image data corrupted by noise caused by lossy compression techniques are appropriately smoothed while preserving the optical edges; on the other, low-resolution LiDAR data in the form of a normalized Digital Surface Model (nDSM) are upsampled through the joint mean shift algorithm. Experiments on both the edge-preserving smoothing and the upsampling capabilities using synthetic RGB-z data show that the mean shift algorithm is superior to bilateral filtering as well as to other classical smoothing and upsampling algorithms. Application of the proposed methodology to the 3D reconstruction of buildings in a pilot region of Athens, Greece, results in a significant visual improvement of the 3D building block model.
Keywords: 3D buildings reconstruction, data fusion, data upsampling, mean shift
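The edge-preserving behaviour of mean shift filtering in the joint spatial-range domain can be seen even in one dimension: each sample is repeatedly replaced by the mean of its neighbours that are close both in position and in value, so noise is smoothed while a step (an "edge") survives. The sketch below is a simplified 1D illustration with assumed bandwidths, not the paper's 2D image/nDSM pipeline.

```python
# Hedged 1D sketch of joint spatial-range mean shift filtering (edge-preserving smoothing).
import numpy as np

def mean_shift_filter_1d(signal, spatial_bw=5, range_bw=0.3, iters=5):
    pos = np.arange(len(signal), dtype=float)
    out = signal.astype(float).copy()
    for _ in range(iters):
        new = np.empty_like(out)
        for i in range(len(out)):
            # neighbours close both in position and in value (this preserves edges)
            mask = (np.abs(pos - pos[i]) <= spatial_bw) & (np.abs(out - out[i]) <= range_bw)
            new[i] = out[mask].mean()
        out = new
    return out

rng = np.random.default_rng(3)
clean = np.concatenate([np.zeros(50), np.ones(50)])     # a step "edge"
noisy = clean + rng.normal(0, 0.1, size=100)
smoothed = mean_shift_filter_1d(noisy)
print("RMSE before:", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE after :", np.sqrt(np.mean((smoothed - clean) ** 2)))
```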
Procedia PDF Downloads 316

25067 Strengthening Islamic Banking Customer Behavioral Intention through Value and Commitment
Authors: Mornay Roberts-Lombard
Abstract:
Consumers’ perceptions of value are crucial to securing their future commitment and behavioral intentions. As a result, service providers, such as Islamic banks, must provide their customers with products and services that are regarded as valuable, stimulating, collaborative, and competent. The value provided to customers must therefore meet or surpass their expectations, which can drive customers’ commitment (affective and calculative) and eventually favorably impact their future behavioral intentions. Consequently, Islamic banks in South Africa, as a growing African market, need to obtain a better understanding of the variables that impact Islamic banking customers’ value perceptions and how these impact their future behavioral intentions. Furthermore, it is necessary to investigate how customers’ perceived value impacts their affective and calculative commitment, and how the latter impact their future behavioral intentions. The purpose of this study is to bridge these gaps in knowledge, as the competitiveness of the Islamic banking industry in South Africa requires a deeper understanding of the aforementioned relationships. The study was exploratory and quantitative in nature, and data were collected from 250 Islamic banking customers using self-administered questionnaires. These banking customers resided in the Gauteng province of South Africa. Exploratory factor analysis, Pearson’s coefficient analysis, and multiple regression analysis were applied to test the hypotheses developed for the study. This research will aid Islamic banks in the country in potentially strengthening customers’ future commitment (affective and calculative) and positively impacting their future behavioral intentions. The findings of the study established that service quality has a significant and positive impact on perceived value. Moreover, it was determined that perceived value has a favorable and considerable impact on affective and calculative commitment, while calculative commitment has a beneficial impact on behavioral intention. The research informs Islamic banks of the importance of service engagement in driving customer perceived value, which stimulates the future affective and calculative commitment of Islamic bank customers in an emerging market context. Finally, the study proposes guidelines for Islamic banks to develop an enhanced understanding of the factors that impact the perceived value-commitment-behavioral intention link in a competitive Islamic banking market in South Africa.
Keywords: perceived value, affective commitment, calculative commitment, behavioural intention
Procedia PDF Downloads 80