Search results for: database approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15171

14781 Linking Business Process Models and System Models Based on Business Process Modelling

Authors: Faisal A. Aburub

Abstract:

Organizations today need to invest in software in order to run their businesses, and to meet the organizations’ objectives, the software should be aligned with their business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram from a role activity diagram (RAD) model. The approach comprises four steps: create a business process model using RAD, identify computerized activities, identify entities in the sequence diagram, and identify messages in the sequence diagram. The new approach has been validated using the student registration process at the University of Petra as a case study. Further research is required to validate the new approach in different domains.
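The four steps above can be sketched in code; the RAD representation, role names and registration activities below are invented placeholders rather than the paper's actual notation.

```python
# Hypothetical sketch of the four-step mapping; the RAD structure, roles and
# activities are illustrative, not taken from the paper.

def synthesize_sequence_diagram(rad):
    """Derive sequence-diagram entities and messages from a RAD model.

    `rad` is a list of interactions: (sending_role, activity, receiving_role,
    computerized) tuples. Only computerized activities are carried over.
    """
    # Step 2: keep only the activities identified as computerized
    computerized = [i for i in rad if i[3]]
    # Step 3: each role taking part in a computerized activity becomes an entity
    entities = []
    for sender, _, receiver, _ in computerized:
        for role in (sender, receiver):
            if role not in entities:
                entities.append(role)
    # Step 4: each remaining interaction becomes a message between two entities
    messages = [(s, r, act) for s, act, r, _ in computerized]
    return entities, messages

# Illustrative registration-style RAD fragment (invented example data)
rad = [
    ("Student", "submit registration form", "Registrar", True),
    ("Registrar", "stamp paper form", "Archive", False),   # manual, dropped
    ("Registrar", "confirm enrolment", "Student", True),
]
entities, messages = synthesize_sequence_diagram(rad)
```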

Keywords: business process modelling, system models, role activity diagrams, sequence diagrams

Procedia PDF Downloads 384
14780 Towards a Goal-Question-Metric Based Approach to Assess Social Sustainability of Software Systems

Authors: Rahma Amri, Narjès Bellamine Ben Saoud

Abstract:

Sustainable development, or sustainability, is one of the most urgent issues in current debate across almost all domains. In particular, the significant way software pervades our lives should place it at the center of sustainability concerns. The social aspects of sustainability have not been well studied in the context of software systems; this is still an immature research field that needs more attention from the research community. This paper presents a Goal-Question-Metric based approach to assess the social sustainability of software systems. The approach is based on a generic social sustainability model taken from the social sciences.
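The Goal-Question-Metric structure underlying the approach can be illustrated with a small sketch; the goal, questions and metrics shown are invented placeholders, not the authors' actual social sustainability model.

```python
# Minimal sketch of a GQM tree: a goal is refined into questions, each
# answered by measurable metrics. All names below are invented examples.

def collect_metrics(gqm):
    """Flatten a GQM tree into the list of metrics that must be measured."""
    metrics = []
    for question in gqm["questions"]:
        metrics.extend(question["metrics"])
    return metrics

gqm = {
    "goal": "Assess social sustainability of a software system",
    "questions": [
        {"text": "Does the system support its user community?",
         "metrics": ["user satisfaction score", "accessibility compliance"]},
        {"text": "Does the project treat contributors fairly?",
         "metrics": ["contributor turnover rate"]},
    ],
}
metrics = collect_metrics(gqm)
```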

Keywords: software assessment approach, social sustainability, goal-question-metric paradigm, software project metrics

Procedia PDF Downloads 394
14779 Website Appeal’s Impact on Brand Outcomes: The Mediated Effect of Emotional Attractiveness in the Relationship between Consistent Image and Brand Value

Authors: Salvador Treviño-Martinez, Christian Reich-Lopez

Abstract:

This paper investigates the relationship between website appeal and brand value outcomes (brand attraction, brand loyalty, brand relationship, and brand experience), considering the mediating effect of emotional attractiveness. Data were collected from 221 customers of a quick-service restaurant in Culiacan, Mexico, using an online survey distributed via WhatsApp after the clients had navigated the restaurant's website. The study employed PLS-SEM to test the proposed hypotheses, using 5,000 bootstrap subsamples to obtain the results. The findings indicate that consistent image, a key component of website appeal, has a statistically significant direct and mediated effect (through emotional attractiveness) on the aforementioned brand outcomes. The study's limitations include the convenience sampling method and the single company client database used for the sample composition. This research contributes to the branding and website quality literature by testing nine hypotheses using the Stimuli-Organism-Response theoretical approach in an underexplored context: quick-service restaurants in Latin America.
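The bootstrapping step can be illustrated with a minimal sketch: resample the data with replacement many times and read a percentile interval off the resulting distribution. The data and the statistic (a simple mean) are invented for illustration; the study itself bootstraps PLS-SEM path coefficients, not means.

```python
import random

def bootstrap(data, stat, n_subsamples=5000, seed=42):
    """Percentile 95% confidence interval for `stat` via bootstrap resampling."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_subsamples):
        # resample with replacement, same size as the original data
        resample = [rng.choice(data) for _ in data]
        estimates.append(stat(resample))
    estimates.sort()
    lo = estimates[int(0.025 * n_subsamples)]
    hi = estimates[int(0.975 * n_subsamples)]
    return lo, hi

# invented survey-style scores
scores = [4.1, 3.8, 4.5, 4.0, 3.9, 4.2, 4.4, 3.7]
lo, hi = bootstrap(scores, lambda xs: sum(xs) / len(xs))
```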

Keywords: website appeal, branding, emotional attractiveness, consistent image, website quality

Procedia PDF Downloads 93
14778 Adopting Flocks of Birds Approach to Predator for Anomalies Detection on Industrial Control Systems

Authors: M. Okeke, A. Blyth

Abstract:

Industrial Control Systems (ICS) such as Supervisory Control and Data Acquisition (SCADA) systems can be found in many critical infrastructures, from nuclear management to utilities, medical equipment, power, waste, and engine management on ships and planes. The role SCADA plays in critical infrastructure has resulted in a call to secure it: many lives depend on these systems for daily activities, and attack vectors are becoming more sophisticated. Hence, the security of ICS is vital, as a malfunction might result in huge risk. This paper describes how applying the Prey Predator (PP) approach observed in flocks of birds could enhance the detection of malicious activities on ICS. The PP approach explains how these animals detect predators in groups or flocks by following some simple rules. They are not necessarily very intelligent animals, but their way of solving complex problems such as detection, through cooperation, coordination and communication, is worth emulating. This paper emulates the flocking behaviour seen in birds to detect predators, adopting a six-nearest-bird rule for detecting any predator. Local and global bests are based on individual detection as well as group detection. The PP algorithm was designed following a MapReduce methodology with a Split Detection Convergence (SDC) approach.
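The six-nearest-bird rule can be sketched roughly as follows: each "bird" (a sensor reading) compares itself with its six nearest neighbours and flags a "predator" (an anomaly) when it deviates too far from them. The readings, threshold and distance measure are invented, and the MapReduce/SDC machinery of the full algorithm is omitted.

```python
# Hedged sketch of the six-nearest-neighbour idea; values and threshold are
# illustrative, not the paper's actual parameters.

def local_detect(readings, index, k=6, threshold=3.0):
    """Individual detection: is readings[index] anomalous vs. its k nearest?"""
    me = readings[index]
    others = [r for i, r in enumerate(readings) if i != index]
    neighbours = sorted(others, key=lambda r: abs(r - me))[:k]
    mean = sum(neighbours) / len(neighbours)
    return abs(me - mean) > threshold

def global_detect(readings, k=6, threshold=3.0):
    """Group detection: indices flagged by the individual (local) rule."""
    return [i for i in range(len(readings))
            if local_detect(readings, i, k, threshold)]

# invented sensor readings with one injected anomaly
readings = [10.0, 10.2, 9.9, 10.1, 50.0, 10.0, 9.8, 10.3]
anomalies = global_detect(readings)
```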

Keywords: artificial life, industrial control system (ICS), IDS, prey predator (PP), SCADA, SDC

Procedia PDF Downloads 301
14777 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
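The RMSE quoted above is the standard root-mean-square error between estimated and measured values; a minimal sketch, with invented rainfall values:

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between two equal-length series."""
    n = len(predicted)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# e.g. pixel-value estimates vs. gauge measurements (mm of daily rain,
# invented numbers for illustration)
estimates = [12.0, 0.0, 5.5, 30.0]
measured = [10.0, 1.0, 5.0, 33.0]
error = rmse(estimates, measured)
```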

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 387
14776 Developmental Social Work: A Derailed Post-Apartheid Development Approach in South Africa

Authors: P. Mbecke

Abstract:

Developmental social welfare, implemented through developmental social work, is applauded internationally as an approach that facilitates social development theory and practice. However, twenty-two years into democracy, there is no tangible evidence that the much-desired developmental social welfare approach has assisted the post-apartheid macroeconomic policy frameworks in addressing poverty and inequality; hence the derailment of the post-apartheid development approach in South Africa. Based on implementation research theory and a literature review, this paper recognizes social work as a principal role-player in social development. It recommends the redesign and implementation of an effective developmental social welfare approach with specific strategies, programs, activities and sufficient resources, aligned with and appropriate for delivering on the promises of the government’s macroeconomic policy frameworks. Such an approach should be implemented by skilled and dedicated developmental social workers in order to achieve transformation in South Africa.

Keywords: apartheid, developmental social welfare, developmental social work, inequality, poverty alleviation, social development, South Africa

Procedia PDF Downloads 365
14775 Combined Proteomic and Metabolomic Analysis Approaches to Investigate the Modification in the Proteome and Metabolome of in vitro Models Treated with Gold Nanoparticles (AuNPs)

Authors: H. Chassaigne, S. Gioria, J. Lobo Vicente, D. Carpi, P. Barboro, G. Tomasi, A. Kinsner-Ovaskainen, F. Rossi

Abstract:

Emerging approaches in the area of exposure to nanomaterials and assessment of human health effects combine the use of in vitro systems and analytical techniques to study the perturbation of the proteome and/or the metabolome. We investigated modifications in the cytoplasmic compartment of the Balb/3T3 cell line exposed to gold nanoparticles. On one hand, the proteomic approach is quite standardized, even if it requires precautions when dealing with in vitro systems. On the other hand, metabolomic analysis is challenging due to the chemical diversity of cellular metabolites, which complicates data elaboration and interpretation. Differentially expressed proteins were found to cover a range of functions, including stress response, cell metabolism, cell growth and cytoskeleton organization. In addition, de-regulated metabolites were annotated using the HMDB database. The "omics" fields hold huge promise for studying the interaction of nanoparticles with biological systems. Combining proteomics and metabolomics data is possible, however challenging.

Keywords: data processing, gold nanoparticles, in vitro systems, metabolomics, proteomics

Procedia PDF Downloads 503
14774 Comparative Analysis of Dissimilarity Detection between Binary Images Based on Equivalency and Non-Equivalency of Image Inversion

Authors: Adnan A. Y. Mustafa

Abstract:

Image matching is a fundamental problem that arises frequently in many aspects of robot and computer vision. It can become a time-consuming process when matching images against a database consisting of hundreds of images, especially if the images are large. One approach to reducing the time complexity of the matching process is to reduce the search space in a pre-matching stage by quickly removing dissimilar images. The Probabilistic Matching Model for Binary Images (PMMBI) showed that dissimilarity detection between binary images can be accomplished quickly by random pixel mapping and is size invariant. The model is based on the gamma binary similarity distance, which recognizes an image and its inverse as containing the same scene and hence considers them to be the same image. However, in many applications, an image and its inverse are treated not as the same but rather as dissimilar. In this paper, we present a comparative analysis of dissimilarity detection between PMMBI based on the gamma binary similarity distance and a modified PMMBI model based on a similarity distance that does distinguish an image from its inverse, treating them as dissimilar.
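The random-pixel-mapping idea, combined with the inverse-sensitive distance discussed above, can be sketched as follows. The images, sample size and seed are illustrative; the distance here simply counts mismatched sampled pixels, so an image and its inverse come out maximally dissimilar, unlike under the gamma binary similarity distance.

```python
import random

# Hedged sketch: estimate dissimilarity from a random sample of pixel
# positions instead of comparing every pixel.

def sampled_distance(img_a, img_b, n_samples=100, seed=7):
    """Fraction of randomly mapped pixel pairs that disagree (0 = identical)."""
    rng = random.Random(seed)
    h, w = len(img_a), len(img_a[0])
    mismatches = 0
    for _ in range(n_samples):
        y, x = rng.randrange(h), rng.randrange(w)
        if img_a[y][x] != img_b[y][x]:
            mismatches += 1
    return mismatches / n_samples

img = [[0, 0, 1, 1]] * 4                      # tiny invented binary image
inverse = [[1 - p for p in row] for row in img]
```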

Keywords: binary image, dissimilarity detection, probabilistic matching model for binary images, image mapping

Procedia PDF Downloads 153
14773 An Exploration of Policy-related Documents on District Heating and Cooling in Flanders: A Slow and Bottom-up Process

Authors: Isaura Bonneux

Abstract:

District heating and cooling (DHC) is increasingly recognized as a viable path towards sustainable heating and cooling. While some countries like Sweden and Denmark have a longstanding tradition of DHC, Belgium is lagging behind. The northern part of Belgium, Flanders, had only 95 heating networks in total in July 2023. Nevertheless, it is increasingly exploring possibilities to enhance the scope of DHC. DHC is a complex energy system, requiring a lot of collaboration between various stakeholders on various levels. It is therefore of interest to look closer at policy-related documents at the Flemish (regional) level, as these policies set the scene for DHC development in the Flemish region. This kind of analysis has not been undertaken so far. This paper asks the following research question: “Who talks about DHC, and in which way and context is DHC discussed in Flemish policy-related documents?” To answer this question, the Overton policy database was used to search and retrieve relevant policy-related documents. Overton retrieves data from governments, think tanks, NGOs, and IGOs. In total, out of the 244 original results, 117 documents between 2009 and 2023 were analyzed. Every selected document included theme keywords, the policymaking department(s), date, and document type. These elements were used for quantitative data description and visualization. Further, qualitative content analysis revealed patterns and main themes regarding DHC in Flanders. Four main conclusions can be drawn. First, it is obvious from the timeframe that DHC is a new topic in Flanders that still receives limited attention; 2014, 2016 and 2017 were the years with the most documents, yet even this peak amounts to only 12 documents. In addition, many documents mentioned DHC without much depth and painted it as a future scenario with a lot of uncertainty around it. The largest part of the issuing government departments had a link to either energy or climate (e.g. the Flemish Environmental Agency) or policy (e.g. the Socio-Economic Council of Flanders). Second, DHC is mentioned most within an ‘Environment and Sustainability’ context, followed by ‘General Policy and Regulation’. This is intuitive, as DHC is perceived as a sustainable heating and cooling technique and this analysis comprises policy-related documents. Third, Flanders seems mostly interested in using waste or residual heat as a heating source for DHC. The harbours and waste incineration plants are identified as potential and promising supply sources. This approach tries to reconcile environmental and economic incentives. Last, local councils are assigned a central role, and the initiative is mostly left to them. The policy documents and policy advice demonstrate that Flanders opts for a bottom-up organization. As DHC is very dependent on local conditions, this seems a logical step. Nevertheless, it can impede smaller councils from creating DHC networks and slow down the systematic and fast implementation of DHC throughout Flanders.

Keywords: district heating and cooling, flanders, overton database, policy analysis

Procedia PDF Downloads 44
14772 A Consensus Approach to the Formulation of a School ICT Policy: A Q-Methodology Case Study

Authors: Thiru Vandeyar

Abstract:

This study sets out to explore how teachers’ beliefs and attitudes about ICT policy influence a consensus approach to the formulation of a school ICT policy. This case study proposes Q-methodology as an innovative method to facilitate a school’s capacity to develop policy reflecting teacher beliefs and attitudes. Q-methodology is used as a constructivist approach to the formulation of an ICT policy. Data capture mixed Q-methodology and qualitative principles, and data were analyzed by means of document, content and cluster analysis methods. The findings were threefold: first, teachers’ beliefs and attitudes about ICT policy influenced a consensus approach by including teachers as policy decision-makers. Second, given the opportunity, teachers have the inherent ability to deconstruct and critically engage with policy statements according to their own professional beliefs and attitudes. And third, an inclusive approach to policy formulation may inform the practice of school leaders and policymakers alike on how schools may develop their own policy.

Keywords: ICT, policy, teacher beliefs, consensus

Procedia PDF Downloads 509
14771 Content Based Instruction: An Interdisciplinary Approach in Promoting English Language Competence

Authors: Sanjeeb Kumar Mohanty

Abstract:

Content Based Instruction (CBI) in English Language Teaching (ELT) primarily supports learners of English as a Second Language (ESL). At the same time, it fosters a multidisciplinary style of learning by promoting a collaborative learning style. It is an approach to teaching ESL that attempts to combine language with interdisciplinary learning to improve language proficiency and facilitate content learning. Hence, the basic purpose of CBI is that language should be taught in conjunction with academic subject matter. It helps in establishing the content as well as developing language competency. This study aims to demonstrate the potential value of an interdisciplinary approach in promoting English Language Learning (ELL) by teaching writing skills to a small group of learners and discussing the findings with teachers from various disciplines in a workshop. The teachers who receive this orientation then use the same approach collaboratively in their classes. The input from the learners as well as the teachers will hopefully raise positive awareness of the vast benefits that Content Based Instruction can offer in advancing the language competence of learners.

Keywords: content based instruction, interdisciplinary approach, writing skills, collaborative approach

Procedia PDF Downloads 276
14770 Efficiency and Equity in Italian Secondary School

Authors: Giorgia Zotti

Abstract:

This research comprehensively investigates the multifaceted interplay among school performance, individual backgrounds, and regional disparities within the landscape of Italian secondary education. Leveraging data gleaned from the INVALSI 2021-2022 database, the analysis meticulously scrutinizes two fundamental distributions of educational achievements: the standardized Invalsi test scores and official grades in Italian and Mathematics, focusing specifically on final-year secondary school students in Italy. Applying a comprehensive methodology, the study initially employs Data Envelopment Analysis (DEA) to assess school performance. This involves constructing a production function encompassing inputs (hours spent at school) and outputs (Invalsi scores in Italian and Mathematics, along with official grades in Italian and Math). The DEA approach is applied in both of its versions, traditional and conditional; the latter incorporates environmental variables such as school type, size, demographics, technological resources, and socio-economic indicators. Additionally, the analysis delves into regional disparities by leveraging the Theil index, providing insights into disparities within and between regions. Moreover, within the framework of inequality-of-opportunity theory, the study quantifies the inequality of opportunity in students' educational achievements. The methodology applied is the ex-ante version of the parametric approach, considering diverse circumstances such as parental education and occupation, gender, school region, birthplace, and language spoken at home. A Shapley decomposition is then applied to understand how much each circumstance affects the outcomes. The outcomes of this comprehensive investigation unveil pivotal determinants of school performance, notably highlighting the influence of school type (Liceo) and socioeconomic status. The research also unveils regional disparities, elucidating instances where specific schools outperform others in official grades compared to Invalsi scores, shedding light on the intricate nature of regional educational inequalities. Furthermore, it highlights greater inequality of opportunity within the distribution of Invalsi test scores than within official grades, underscoring pronounced disparities at the student level. This analysis provides insights for policymakers, educators, and stakeholders, fostering a nuanced understanding of the complexities within Italian secondary education.
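The Theil index used above to gauge disparities can be computed with a short sketch; the score lists are invented, and only the overall index (not its within/between-region decomposition) is shown.

```python
import math

# The Theil index is zero under perfect equality and grows with inequality;
# it decomposes additively into within-group and between-group terms, though
# only the overall index is computed here. Scores below are invented.

def theil(values):
    """Theil T index of a list of positive values."""
    mean = sum(values) / len(values)
    return sum((v / mean) * math.log(v / mean) for v in values) / len(values)

equal_scores = [200.0, 200.0, 200.0]    # perfect equality
spread_scores = [120.0, 200.0, 280.0]   # some dispersion
```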

Keywords: inequality, education, efficiency, DEA approach

Procedia PDF Downloads 75
14769 Terrorism: Definition, History and Different Approaches in the Analysis of Terrorism Phenomenon

Authors: Shabnam Dadparvar, Laijin Shen, Farzad Ravanbod

Abstract:

Nowadays, the political phenomenon of terrorism is considered an effective factor in political, social, and economic changes. It has displaced recognized political phenomena such as revolutions, wars (total war among two or more political units with distinct identities in the form of national states), coups d’état, insurgencies, etc., and has challenged political life at all levels (sub-national, national, and international political groups). In this paper, using a descriptive-analytical method, the authors explain the spread of this political phenomenon across the world and its definition and types, and analyze different approaches to understanding it. The authors believe that the logical-rational approach is the best way to explain and understand this phenomenon.

Keywords: logical approach, psychological-social approach, religious approach, terrorism

Procedia PDF Downloads 334
14768 SPBAC: A Semantic Policy-Based Access Control for Database Query

Authors: Aaron Zhang, Alimire Kahaer, Gerald Weber, Nalin Arachchilage

Abstract:

Access control is an essential safeguard for the security of enterprise data; it controls users’ access to information resources and ensures the confidentiality and integrity of information resources [1]. Research shows that the more common types of access control have shortcomings [2]. To improve on existing access control, we have studied current technologies in the field of data security, investigated previous data access control policies and their problems in depth, identified existing deficiencies, and proposed SPBAC, a new extension structure. The SPBAC extension proposed in this paper aims to combine Policy-Based Access Control (PBAC) with semantics to provide logically connected, real-time data access by establishing associations between enterprise data through semantics. Our design combines policies with linked data through semantics to create a "semantic link", so that access control is no longer per-database: users in each role are granted access based on an instance policy. We improve the SPBAC implementation by constructing policies and defined attributes through the XACML specification, extending the original XACML model. Alongside these design solutions, we plan to study the feasibility and subsequent implementation of this work at a later stage.
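The "semantic link" idea can be sketched as follows: access is decided per data instance from the user's role plus semantic links between resources, rather than per database. The roles, resources and linking rule are invented placeholders, and a real SPBAC deployment would express the policy in XACML rather than Python.

```python
# Hedged sketch of instance-policy evaluation over semantically linked data.
# All names here are illustrative examples, not part of SPBAC itself.

SEMANTIC_LINKS = {          # resource -> semantically linked resources
    "orders": {"customers"},
    "customers": set(),
}

ROLE_GRANTS = {"analyst": {"orders"}, "admin": {"orders", "customers"}}

def allowed(role, resource, follow_links=True):
    """Grant access directly, or via a semantic link from a granted resource."""
    granted = ROLE_GRANTS.get(role, set())
    if resource in granted:
        return True
    if follow_links:
        # a grant on one resource extends along its semantic links
        return any(resource in SEMANTIC_LINKS.get(g, set()) for g in granted)
    return False
```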

Keywords: access control, semantic policy-based access control, semantic link, access control model, instance policy, XACML

Procedia PDF Downloads 90
14767 The Constructivist Approach to Teaching Second Language Writing

Authors: Andreea Cervatiuc

Abstract:

This study focuses on teaching second language writing through a constructivist approach. Unlike traditional approaches to teaching second language writing, which were product-oriented and emphasized surface features of writing, such as spelling and grammar, the constructivist approach to teaching second language writing is process-oriented and fosters discovery of meaning, creativity, collaboration, and writing for an audience. Educators who take a constructivist approach to teaching second language writing create communities of writers in their classrooms, emphasize that the goal of writing is to share ideas with others, and engage their students in collaborative, creative, and authentic writing activities, such as writing conferences, group story writing, finish the story, and chain writing. The constructivist approach to teaching second language writing combines a focus on genres, scaffolding, and treating writing as a process. Through constructivist writing, students co-create knowledge and engage in meaningful dialogue with various texts and their peers. The findings of this study can have implications for applied linguists, teachers, and language learners.

Keywords: constructivist second language, writing genres, process writing, scaffolding

Procedia PDF Downloads 10
14766 Hindi Speech Synthesis by Concatenation of Recognized Hand Written Devnagri Script Using Support Vector Machines Classifier

Authors: Saurabh Farkya, Govinda Surampudi

Abstract:

Optical Character Recognition (OCR) is one of the current major research areas. This paper focuses on the recognition of Devanagari script and its sound generation, and consists of two parts: first, optical character recognition of handwritten Devanagari script; second, speech synthesis of the recognized text. The paper presents an implementation of support vector machines for Devanagari script recognition. The support vector machine was trained with multi-domain features: transform-domain and spatial (structural) domain features. The transform domain includes the wavelet features of the character; the structural domain consists of distance profile and gradient features. Segmentation of the text document was done at three levels: line segmentation, word segmentation, and character segmentation. Pre-processing of the characters was done with the help of various operations: Otsu's algorithm, erosion, dilation, filtration and thinning techniques. The algorithm was tested on a self-prepared database, a collection of various handwriting samples. Further, Unicode was used to convert the recognized Devanagari text into an understandable computer document. The document so obtained is an array of codes, which was used to generate digitized text and to synthesize Hindi speech. Phonemes from the self-prepared database were used to generate speech for the scanned document using a concatenation technique.
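The distance-profile feature mentioned above can be sketched on a toy binary glyph: for each row, the distance from the left edge to the first ink pixel. The 5x5 image is invented; real inputs would be segmented, pre-processed character images.

```python
# Hedged sketch of a left-edge distance profile; the glyph is an invented
# example, not an actual Devanagari character image.

def distance_profile(img):
    """Left-edge distance profile of a binary image (rows of 0/1 ints)."""
    width = len(img[0])
    profile = []
    for row in img:
        try:
            profile.append(row.index(1))   # column of the first ink pixel
        except ValueError:                 # blank row: no ink pixel at all
            profile.append(width)
    return profile

glyph = [
    [0, 0, 1, 0, 0],
    [0, 1, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 0, 0, 0],
]
features = distance_profile(glyph)
```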

Keywords: Optical Character Recognition (OCR), Text to Speech (TTS), Support Vector Machines (SVM), Library of Support Vector Machines (LIBSVM)

Procedia PDF Downloads 499
14765 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game is based on thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses to their customers. In this research work, we perform deep, real-time analysis of every soccer match around the world; our work is distinguished from others by its focus on particular seasons, teams and partial analytics. Our contributions are presented in the platform called “Analyst Masters.” First, we introduce the various sources of information available for soccer analysis for teams around the world, which helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, before and in-play, are considered in image format versus time, including the halftime. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules once a soccer match has reached enough stability. For example, our “8-minute rule” implies that if Team A scores a goal and can maintain the result for at least 8 minutes, then a stable match would end in their favor. We could also make accurate pre-match predictions of scoring less or more than 2.5 goals. We use Gradient Boosted Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage that checks properties such as bettors’ and punters’ behavior and its statistical data before issuing the prediction. The proposed method was trained on 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can make over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches from 2012 onwards and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
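The LBP encoding step can be sketched for a single pixel: each interior pixel is encoded by comparing it with its 8 neighbours, giving a byte that summarizes local texture. The 3x3 patch is invented, and in the paper the "image" is a rendering of match statistics over time rather than a photograph.

```python
# Hedged sketch of a basic 8-neighbour LBP code; the patch values are
# invented, and real LBP variants differ in neighbour ordering and radius.

def lbp_code(patch):
    """LBP code of the centre pixel of a 3x3 patch (clockwise from top-left)."""
    centre = patch[1][1]
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2),
               (2, 2), (2, 1), (2, 0), (1, 0)]
    code = 0
    for bit, (y, x) in enumerate(offsets):
        if patch[y][x] >= centre:   # neighbour at least as bright as centre
            code |= 1 << bit
    return code

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)
```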

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 238
14764 Reforming Corporate Criminal Liability in English Law: Lessons and Experiences from Canada

Authors: John Kong Shan Ho

Abstract:

In June 2022, the Law Commission of England and Wales published an options paper examining how the law on corporate criminal liability could be reformed under the English system. The paper merely details options for reform and does not seek to make recommendations. However, it rules out the “respondeat superior” approach of the US and the “corporate culture” approach of Australia as reform options. On balance, the Law Commission's preferred reform option is the “senior officer” approach currently adopted in Canada. This article is written against this background and argues that, owing to similarities between the English and Canadian systems, the latter's approach is the more suitable model for the former to adopt for reform in this area.

Keywords: corporate criminal liability, identification principle, directing mind and will, England, Canada

Procedia PDF Downloads 105
14763 Relationship Between Health Coverage and Emergency Disease Burden

Authors: Karim Hajjar, Luis Lillo, Diego Martinez, Manuel Hermosilla, Nicholas Risko

Abstract:

Objectives: This study examines the relationship between universal health coverage (UHC) and the burden of emergency diseases at a global level. Methods: Data on Disability-Adjusted Life Years (DALYs) from emergency conditions were extracted from the Institute for Health Metrics and Evaluation (IHME) database for the years 2015 and 2019. Data on UHC, measured using two variables, (1) coverage of essential health services and (2) the proportion of the population spending more than 10% of household income on out-of-pocket health care expenditure, were extracted from the World Bank database for years preceding our outcome of interest. Linear regression was performed, analyzing the effect of the UHC variables on the DALYs of emergency diseases while controlling for other variables. Results: A total of 133 countries were included. 44.4% of the analyzed countries had a coverage of essential health services index of at least 70/100, and 35.3% had at least 10% of their population spending more than 10% of household income on healthcare. For every point increase in the coverage of essential health services index, there was a 13-point reduction in DALYs from emergency medical diseases (95% CI -16, -11). Conversely, for every percent decrease in the population with large household expenditure on healthcare, there was a 0.48 increase in DALYs from emergency medical diseases (95% CI -5.6, 4.7). Conclusions: After adjusting for multiple variables, an increase in coverage of essential health services was significantly associated with an improvement in DALYs for emergency conditions. There was, however, no association between catastrophic health expenditure and DALYs.
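The regression step can be illustrated with a minimal bivariate ordinary-least-squares sketch; the six country-level data points are invented, and the study's actual model controls for additional variables, which a simple bivariate fit cannot do.

```python
# Hedged sketch of OLS fitting DALYs against a coverage index; data invented.

def ols_slope(x, y):
    """Intercept and slope of the least-squares line y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

coverage_index = [40.0, 50.0, 60.0, 70.0, 80.0, 90.0]
dalys = [1400.0, 1250.0, 1150.0, 1000.0, 880.0, 750.0]
intercept, slope = ols_slope(coverage_index, dalys)
# a negative slope would mirror the study's direction of effect
```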

Keywords: emergency medicine, universal healthcare, global health, health economics

Procedia PDF Downloads 92
14762 Improve B-Tree Index’s Performance Using Lock-Free Hash Table

Authors: Zhanfeng Ma, Zhiping Xiong, Hu Yin, Zhengwei She, Aditya P. Gurajada, Tianlun Chen, Ying Li

Abstract:

Many RDBMS vendors use B-tree indexes to achieve high performance for point queries and range queries, and some also employ hash indexes to further enhance performance, as a hash table is more efficient for point queries. However, maintaining a separate hash index carries extra overheads: a hash mapping for all data records must always be maintained, which consumes more memory, and locking, logging and other mechanisms are needed to guarantee ACID, which affects the concurrency and scalability of the system. To relieve these overheads, the Hash Cached B-tree (HCB) index is proposed in this paper, consisting of a standard disk-based B-tree index and an additional in-memory lock-free hash table. Initially, only the B-tree index is constructed for all data records; the hash table is built on the fly based on the runtime workload, and only data records accessed by point queries are indexed in the hash table, which helps reduce the memory footprint. Changes to the hash table are made using compare-and-swap (CAS) without locking or logging, which improves concurrency and avoids contention. The hash table is also optimized to be cache-conscious. The HCB index is implemented in the SAP ASE database; compared with the standard B-tree index, early experiments and customer adoptions show significant performance improvement. This paper provides an overview of the design of the HCB index and reports the experimental results.
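The hash-cached lookup path can be sketched as follows: point queries consult an in-memory hash table first and fall back to the (here simulated) B-tree, caching the key on the way out so the table grows only with keys that point queries actually touch. This is a single-threaded illustration; the lock-free CAS updates of the real HCB index cannot be expressed in plain Python.

```python
# Hedged sketch of the HCB fast path; a plain dict stands in for both the
# disk-based B-tree and the lock-free hash table.

class HashCachedBTree:
    def __init__(self, btree):
        self._btree = btree      # stand-in for the disk-based B-tree
        self._cache = {}         # built on the fly from the point-query workload
        self.btree_lookups = 0   # counts slow-path traversals

    def point_query(self, key):
        if key in self._cache:          # fast path: in-memory hash hit
            return self._cache[key]
        self.btree_lookups += 1         # slow path: B-tree traversal
        value = self._btree.get(key)
        if value is not None:
            self._cache[key] = value    # cache only records actually accessed
        return value

idx = HashCachedBTree({"k1": "v1", "k2": "v2"})
```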

Keywords: B-tree, compare-and-swap, lock-free hash table, point queries, range queries, SAP ASE database

Procedia PDF Downloads 286
14761 Ways for University to Conduct Research Evaluation: Based on National Research University Higher School of Economics Example

Authors: Svetlana Petrikova, Alexander Yu Kostinskiy

Abstract:

Management of research evaluation at the Higher School of Economics (HSE) originates from the HSE Academic Fund, created in 2004 to facilitate and support academic research and to present its results to the international academic community. To motivate applicants, science projects went through a competitive selection process evaluated by a group of experts. The rapid development of HSE, the number of projects submitted to each Academic Fund competition and the need to coordinate expert evaluation resulted in the founding of the Office for Research Evaluation in 2013. The Office's primary objective is the management of research evaluation of science projects. The standards for conducting the evaluation are defined as follows: - The exercise of the process approach and the unification of the functioning of the department. - The uniformity of the regulatory, organizational and methodological framework. - The development of a proper on-line evaluation system. - The broad involvement of external Russian and international experts, and the renouncement of the use of the university's own employees. - The development of an algorithm to match experts with science projects. - The methodical use of open and closed international and Russian databases to extend the expert database. - The transparency of evaluation results: free access to assessments while keeping experts' confidentiality. The management of research evaluation of projects is based on a single standard, organization and financing. The standard way of conducting research evaluation at HSE is based upon the Regulations on basic principles for research evaluation at HSE. These Regulations have been developed since the establishment of the Office for Research Evaluation and are based on conventional corporate standards for regulatory document management. The management system of research evaluation is implemented on the basis of the process approach.
The process approach means organizing work as a process, that is, an aggregation of interrelated and interacting activities that transform inputs into outputs. The inputs are, firstly, the client asking for the assessment to be conducted and defining the conditions for organizing and carrying it out, and secondly, the applicant with an application appropriate for the competition; the output is the assessment delivered to the client. In exercising the process approach, the main parties or subjects of the assessment are determined and the way they interact is established. The parties to an expert assessment are: - The Ordering Party: the department of the university taking the decision to subject a project to expert assessment; - The Providing Party: the department of the university authorized by the Ordering Party to provide such assessment; - The Performing Party: the legal and natural entities that have expertise in the area of research evaluation. Experts assess projects in accordance with the criteria and forms of expert opinion approved by the Ordering Party. The objects of assessment are generally applications or HSE competition project reports. Assessments are mainly deployed for internal needs, i.e., most ordering parties are HSE branches and departments, but assessments can also be conducted for external clients. The financing of research evaluation at HSE is based on the established corporate culture and traditions of HSE.

Keywords: expert assessment, management of research evaluation, process approach, research evaluation

Procedia PDF Downloads 253
14760 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories

Authors: Mojtaba Taheri, Saied Reza Ameli

Abstract:

In this research, we calculate the effect of membership in international climate change treaties for 20 developed countries, selected on the basis of the Human Development Index (HDI), and compare this effect with the pollutant-reduction process described by Environmental Kuznets Curve (EKC) theory. For this purpose, data on real GDP per capita at constant 2010 prices are taken from the World Development Indicators (WDI) database. The ecological footprint (ECOFP) is the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions; it is measured in global hectares (gha), and the data are retrieved from the Global Ecological Footprint (2021) database. We proceed step by step, performing several series of targeted statistical regressions and examining the effects of different control variables. The energy consumption structure (ECS) is counted as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy production (EP) refers to the total production of primary energy by all energy-producing enterprises in a country at a specific time; it is a comprehensive indicator of a country's energy production capacity, and its 2021 data, like the energy consumption structure, are obtained from the EIA. Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent is also based on stock market value as a ratio to GDP; it is taken from the WDI (2021 version). Trade openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP, for which we use the WDI data (2021 version). Urbanization (URB) is defined as the share of the urban population in the total population; for these data, we also used the WDI (2021 version).
Descriptive statistics for all the investigated variables are presented in the results section. Among the theories of sustainable development, the Environmental Kuznets Curve (EKC) is the most significant over the period of study. In this research, we use more than fourteen targeted statistical regressions to isolate the net effects of each approach and examine the results.
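The EKC test underlying these regressions can be sketched as follows. The data are synthetic and the coefficients invented; in the EKC specification, an inverted-U relationship between income and environmental pressure shows up as a positive linear and a negative quadratic coefficient.

```python
import numpy as np

# Hedged sketch of an EKC regression: environmental indicator (here a
# stand-in for ecological footprint) on GDP per capita and its square.
# Synthetic data, not the WDI/EIA/Global Footprint series used in the paper.
rng = np.random.default_rng(1)
gdp = rng.uniform(1, 50, 200)                       # GDP per capita, thousands
footprint = 2 + 0.8 * gdp - 0.012 * gdp**2 + rng.normal(0, 0.5, 200)

X = np.column_stack([np.ones_like(gdp), gdp, gdp**2])
b0, b1, b2 = np.linalg.lstsq(X, footprint, rcond=None)[0]
turning_point = -b1 / (2 * b2)    # income level where pollution peaks
print(b2 < 0)                     # negative quadratic term: inverted U
```

The paper's full specification would add the ECS, EP, FND, TRD and URB controls, plus a treaty-membership indicator, as further columns of `X`.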

Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty

Procedia PDF Downloads 71
14759 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the dimensionality of the feature space of the pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image against the training set, a cosine-similarity-based Bayesian classification is used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
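The PCA-then-FDA pipeline can be illustrated on toy data. This is not the authors' exact method (their eigen-filter construction and Bayesian matching are omitted, and a simple nearest-mean rule replaces the cosine-similarity classifier); it only shows the two projection stages for a two-class case.

```python
import numpy as np

# Toy sketch: PCA via SVD for dimensionality reduction, then a two-class
# Fisher discriminant direction. "Images" are faked as random vectors with
# class-dependent means; real inputs would be vectorized face images.
rng = np.random.default_rng(2)
n, dim = 30, 64
X0 = rng.normal(0, 1, (n, dim)) + 3.0     # expression class 0, shifted mean
X1 = rng.normal(0, 1, (n, dim)) - 3.0     # expression class 1
X = np.vstack([X0, X1])
y = np.array([0] * n + [1] * n)

# PCA: project centered data onto the top principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Xc @ Vt[:5].T                         # 5-dimensional feature space

# Fisher discriminant: w = Sw^-1 (m1 - m0) maximizes between-class scatter
# relative to within-class (intra-class) scatter.
m0, m1 = P[y == 0].mean(axis=0), P[y == 1].mean(axis=0)
Sw = np.cov(P[y == 0].T) + np.cov(P[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
Z = P @ w                                 # 1-D Fisher projection

# w points from class 0 toward class 1, so threshold at the midpoint.
t = (Z[y == 0].mean() + Z[y == 1].mean()) / 2
pred = (Z > t).astype(int)
print((pred == y).mean())                 # training accuracy on toy data
```

For more than two classes, FDA yields up to (number of classes - 1) discriminant directions from a generalized eigenproblem instead of a single vector.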

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 425
14758 Applications of Analytical Probabilistic Approach in Urban Stormwater Modeling in New Zealand

Authors: Asaad Y. Shamseldin

Abstract:

The analytical probabilistic approach is an innovative approach to urban stormwater modeling. It can provide information about the long-term performance of a stormwater management facility without being computationally demanding. This paper explores the application of the analytical probabilistic approach in New Zealand. The paper presents the results of a case study aimed at developing an objective way of identifying what constitutes a rainfall storm event and at estimating the corresponding statistical properties of storms, using two selected automatic rainfall stations located in the Auckland region of New Zealand. Storm identification and the estimation of storm statistical properties are regarded as the first step in the development of analytical probabilistic models. The paper provides a recommendation on the definition of the storm inter-event time to be used in conjunction with the analytical probabilistic approach.
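The storm-identification step can be sketched as follows. The inter-event time definition (IETD) used below is hypothetical, not the value the paper recommends: wet intervals separated by a dry spell shorter than the IETD are merged into one storm.

```python
# Hedged sketch of storm identification from a rainfall record: consecutive
# wet time steps separated by fewer dry steps than the inter-event time
# definition (IETD) belong to the same storm event.
def identify_storms(rain_mm, ietd_steps):
    """rain_mm: rainfall depth per time step; ietd_steps: minimum number of
    consecutive dry steps that ends a storm. Returns (start, end, depth)."""
    storms, start, dry = [], None, 0
    for i, r in enumerate(rain_mm):
        if r > 0:
            if start is None:
                start = i          # a new storm begins
            dry = 0
            end = i
        elif start is not None:
            dry += 1
            if dry >= ietd_steps:  # dry spell long enough: close the storm
                storms.append((start, end, sum(rain_mm[start:end + 1])))
                start, dry = None, 0
    if start is not None:          # flush a storm still open at series end
        storms.append((start, end, sum(rain_mm[start:end + 1])))
    return storms

# Two bursts 2 dry steps apart merge under IETD=3 but split under IETD=2.
series = [0, 4, 2, 0, 0, 3, 0, 0, 0, 1]
print(identify_storms(series, 3))  # [(1, 5, 9), (9, 9, 1)]
print(identify_storms(series, 2))  # [(1, 2, 6), (5, 5, 3), (9, 9, 1)]
```

The per-storm depths and durations produced this way are what the statistical properties (e.g., exponential fits of storm volume and inter-event time) are estimated from.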

Keywords: hydrology, rainfall storm, storm inter-event time, New Zealand, stormwater management

Procedia PDF Downloads 344
14757 Liquefaction Resistance Using Shear Wave Velocity

Authors: Filali Kamel, Sbartai Badreddine

Abstract:

The cyclic resistance curves developed by Andrus and Stokoe from shear wave velocity case-history databases are frequently used in accordance with the assumptions of the Seed and Idriss simplified procedure. These cyclic resistance curves were deduced using a database built on the cyclic stress ratio (CSR) proposed by Seed and Idriss. Their approach is founded on the hypothesis that the dynamic cyclic shear stress (τd) is always less than that given by the simplified procedure (τr), as deduced by Seed and Idriss through their simplifying assumptions (rd = τd/τr < 1). In 2017, Filali and Sbartai demonstrated that rd can often exceed 1, and they proposed a correction for the CSR in cases where rd > 1. The correction of the CSR implies that the cyclic resistance ratio (CRR) must also be corrected, because the CRR is defined by the boundary curve separating the liquefied and non-liquefied cases, plotted using the original Seed and Idriss CSR, on which the CRR values equal the CSR. For this purpose, in this study we propose, for the range of peak ground accelerations ≤ 0.30g that corresponds to rd > 1, a modified boundary curve in accordance with the corrected version of the simplified method. This provides the safest case, generalizes the method's use to any earthquake, and makes the simplified method more conservative.
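The CSR of the Seed-Idriss simplified procedure can be written out explicitly. The numbers below are illustrative (a made-up soil profile and a hypothetical correction factor), not values from the paper.

```python
def csr_seed_idriss(a_max_g, sigma_v, sigma_v_eff, rd):
    """Cyclic stress ratio from the Seed-Idriss simplified procedure:
    CSR = 0.65 * (a_max / g) * (sigma_v / sigma_v') * rd,
    where a_max/g is peak ground acceleration, sigma_v and sigma_v' are the
    total and effective vertical stresses, and rd is the stress reduction
    coefficient. The paper's point is that when the true dynamic-to-
    simplified stress ratio exceeds 1, the CSR (and hence the CRR boundary
    curve) needs a correction."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

# Illustrative numbers, not from the paper: a shallow saturated sand layer.
base = csr_seed_idriss(a_max_g=0.25, sigma_v=180.0, sigma_v_eff=100.0, rd=0.95)
corrected = base * 1.10   # hypothetical correction factor for the rd > 1 case
print(round(base, 3), round(corrected, 3))  # 0.278 0.306
```

Liquefaction is then predicted where this demand exceeds the CRR read from the (corrected) Andrus-Stokoe boundary curve at the site's shear wave velocity.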

Keywords: liquefaction, soil, earthquake, simplified method, cyclic stress ratio, cyclic resistance ratio

Procedia PDF Downloads 20
14756 Parametric Design as an Approach to Respond to Complexity

Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi

Abstract:

A city is a texture intertwined from the relationships of different components within a whole that is united as one, so designing and planning this complex whole is not an easy matter. Considering that a city is a complex system with countless components and communications, providing flexible layouts that can respond to the unpredictable character of the city, which results from its complexity, is inevitable. The parametric design approach, as a new approach, can produce flexible and transformable layouts at any stage of design. This study aims to introduce parametric design as a modern approach for responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief characterization of them. Flexible design and layout flexibility are further matters that should be considered in responding to and simulating complex urban systems, and they are discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach, as well as a tool and an appropriate way to respond to features such as limited predictability, reciprocating nature, complex communications, sensitivity to initial conditions and hierarchy, the paper introduces parametric design.

Keywords: complexity theory, complex system, flexibility, parametric design

Procedia PDF Downloads 365
14755 A Framework for Secure Information Flow Analysis in Web Applications

Authors: Ralph Adaimy, Wassim El-Hajj, Ghassen Ben Brahim, Hazem Hajj, Haidar Safa

Abstract:

Huge amounts of data and personal information are sent to and retrieved from web applications on a daily basis. Every application has its own confidentiality and integrity policies. Violating these policies can have a broad negative impact on the involved company's financial status, while enforcing them is very hard even for developers with a good security background. In this paper, we propose a framework that enforces security-by-construction in web applications. Minimal developer effort is required, in the sense that the developer only needs to annotate database attributes with a security class. The web application code is then converted into an intermediary representation called the Extended Program Dependence Graph (EPDG). Using the EPDG, the provided annotations are propagated to the application code and checked against generic security enforcement rules that were carefully designed to detect insecure information flows as early as they occur. As a result, any violation of the data's confidentiality or integrity policies is reported. As a proof of concept, two PHP web applications, Hotel Reservation and Auction, were used for testing and validation. The proposed system was able to catch all the existing insecure information flows at their source. Moreover, to highlight the simplicity of the suggested approach compared to existing approaches, two professional web developers assessed the annotation tasks needed in the presented case studies and provided very positive feedback on the simplicity of the annotation task.
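The propagation step can be sketched in miniature. This is not the authors' EPDG implementation; it only shows the underlying idea with a two-point security lattice and invented node names: labels on annotated attributes flow along dependence edges to a fixpoint, and a secret label reaching a public sink is a violation.

```python
# Minimal sketch of label propagation over a dependence graph.
LOW, HIGH = 0, 1   # two-point security lattice: public < secret

def check_flows(edges, labels, sinks):
    """edges: (src, dst) dependence edges; labels: initial security class per
    annotated node; sinks: node -> maximum label it may receive. Labels are
    propagated to a fixpoint (join = max), then sinks are checked."""
    labels = dict(labels)
    changed = True
    while changed:
        changed = False
        for src, dst in edges:
            joined = max(labels.get(dst, LOW), labels.get(src, LOW))
            if joined != labels.get(dst, LOW):
                labels[dst] = joined     # dst now carries the stronger label
                changed = True
    return [n for n, cap in sinks.items() if labels.get(n, LOW) > cap]

# Hypothetical flow: a secret credit_card attribute passes through a string
# concatenation into HTML output, which is only allowed to carry public data.
edges = [("credit_card", "msg"), ("msg", "html_output"), ("user_name", "msg")]
labels = {"credit_card": HIGH, "user_name": LOW}
print(check_flows(edges, labels, sinks={"html_output": LOW}))  # ['html_output']
```

A full system would also model implicit flows through control dependences, which is precisely what a program dependence graph (as opposed to plain dataflow) captures.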

Keywords: web applications security, secure information flow, program dependence graph, database annotation

Procedia PDF Downloads 471
14754 TARF: Web Toolkit for Annotating RNA-Related Genomic Features

Authors: Jialin Ma, Jia Meng

Abstract:

Genomic features, i.e., genome-based coordinates, are commonly used for the representation of biological features such as genes, RNA transcripts and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches to these tasks involve manipulating a gene database, converting genome-based coordinates to transcript-based coordinates, and using visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed a dedicated web app, TARF, a web toolkit for annotating RNA-related genomic features. The TARF web tool is intended to provide a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and either specified a built-in transcript database or uploaded a customized gene database in GTF format, the tool fulfills its three main functions. First, it adds annotations on gene and RNA transcript components. For each feature provided by the user, its overlap with RNA transcript components is identified, and the information is combined into one table that is available for copying and download. Summary statistics about ambiguous assignments are also computed. Second, the tool provides a convenient method of visualizing the features at the single gene/transcript level.
For the selected gene, the tool shows the features together with the gene model in a genome-based view, and also maps the features to transcript-based coordinates to show their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package. The distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features, related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.
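The genome-to-transcript coordinate conversion at the heart of this annotation can be sketched as follows. The exon structure and CDS boundaries below are invented for illustration; a real tool reads them from the GTF gene database and handles strand, as TARF does.

```python
# Hedged sketch of the coordinate conversion: map a genomic position to an
# offset along the spliced transcript, then classify that offset against the
# transcript's 5'UTR/CDS/3'UTR boundaries (plus strand only, for brevity).
def genome_to_transcript(pos, exons):
    """exons: ordered half-open [start, end) genomic intervals.
    Returns the transcript offset, or None if pos falls in an intron."""
    offset = 0
    for start, end in exons:
        if start <= pos < end:
            return offset + (pos - start)
        offset += end - start          # splice out the intron
    return None

def component(tx_pos, cds_start, cds_end):
    if tx_pos < cds_start:
        return "5'UTR"
    return "CDS" if tx_pos < cds_end else "3'UTR"

exons = [(100, 200), (300, 450)]       # two exons, 250 nt spliced transcript
tx = genome_to_transcript(330, exons)  # genomic 330 -> transcript offset 130
print(tx, component(tx, cds_start=50, cds_end=220))  # 130 CDS
```

Repeating this for every BED feature, then normalizing offsets against the UTR/CDS landmarks, yields the transcriptome-wide distribution that the Guitar package plots.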

Keywords: RNA-related genomic features, annotation, visualization, web server

Procedia PDF Downloads 207
14753 Description and Evaluation of the Epidemiological Surveillance System for Meningitis in the Province of Taza Between 2016 and 2020

Authors: Bennasser Samira

Abstract:

Meningitis, especially meningococcal meningitis, is a serious public health problem. A system of vigilance and surveillance is in place to allow effective action to be taken on actual or potential health problems caused by all forms of meningitis. Objectives: 1. Describe the epidemiological surveillance system for meningitis in the province of Taza. 2. Evaluate the quality and responsiveness of the epidemiological surveillance system for meningitis in the province of Taza. 3. Propose measures to improve this system at the provincial level. Methods: This was a descriptive study with a purely quantitative approach, evaluating the quality and responsiveness of the system over the 5 years between January 2016 and December 2020. For this, we used the investigation files of meningitis cases and the provincial meningitis database. We calculated quality indicators of the surveillance system already defined by the National Program for the Prevention and Control of Meningitis. Results: Notification is passive, the completeness of the data is quite good (94%), and the timeliness does not exceed 71%. The quality of the data is acceptable (91% agreement). The systematic and rapid performance of lumbar punctures increases the diagnostic capabilities of the system. Local response actions were carried out in 100% of cases. Conclusion: The improvement of this surveillance system depends on strengthening staff skills in diagnosis, reviewing surveillance tools, and encouraging judicious use of the data.

Keywords: evaluation, meningitis, surveillance system, Taza, Morocco

Procedia PDF Downloads 160
14752 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks can help in making the decision as to whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma, the so-called iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs to a neural network that is capable of deciding whether the eye is cancerous or not, based on the experience it acquires over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, "Mile Research", while the abnormal ones were obtained from another database, "eyecancer". The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
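The Prewitt edge-detection step of this pipeline can be sketched as follows. This is only the edge operator on a synthetic image; the segmentation of iris/sclera circles, PCA feature extraction and the neural classifier described above are omitted.

```python
import numpy as np

# The Prewitt operator convolves a grayscale image with horizontal and
# vertical 3x3 kernels and combines the responses into a gradient magnitude.
PREWITT_X = np.array([[-1, 0, 1],
                      [-1, 0, 1],
                      [-1, 0, 1]])
PREWITT_Y = PREWITT_X.T

def prewitt(img):
    """Naive (unoptimized) valid-mode convolution returning the gradient
    magnitude of a 2-D grayscale image."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx = (patch * PREWITT_X).sum()   # horizontal gradient
            gy = (patch * PREWITT_Y).sum()   # vertical gradient
            out[i, j] = np.hypot(gx, gy)
    return out

# A vertical step edge produces a strong response only along its boundary.
img = np.zeros((6, 6))
img[:, 3:] = 255.0
edges = prewitt(img)
print(edges.max() > 0, edges[:, 0].max() == 0)  # True True
```

In practice one would use a vectorized convolution (e.g., via FFT or a library routine) rather than the explicit loops, but the kernel and magnitude computation are the same.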

Keywords: iris cancer, intraocular melanoma, cancerous, Prewitt edge detection algorithm, sclera

Procedia PDF Downloads 503