Search results for: quantitative techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9785

8735 Digitizing Masterpieces in Italian Museums: Techniques, Challenges and Consequences from Giotto to Caravaggio

Authors: Ginevra Addis

Abstract:

The possibility of reproducing physical artifacts in a digital format is one of the opportunities offered by the technological advancements in information and communication most frequently promoted by museums. Indeed, the study and conservation of our cultural heritage have seen significant advancement due to three-dimensional acquisition and modeling technology. A variety of laser scanning systems has been developed, based either on optical triangulation or on time-of-flight measurement, capable of producing digital 3D images of complex structures with high resolution and accuracy. It is necessary, however, to explore the challenges and opportunities that this practice brings within museums. The purpose of this paper is to understand what change is introduced by digital techniques in those museums that are hosting digital masterpieces. The methodology used will investigate three distinguished Italian exhibitions, related to the territory of Milan, analyzing the following issues of museum practice: 1) how digitizing art masterpieces increases the number of visitors; 2) what need calls for the digitization of artworks; 3) which techniques are most used; 4) what the setting is; 5) the consequences of not publishing hard copies of catalogues; 6) how these practices are envisioned in the future. Findings will show, first, that interconnection plays an important role in rebuilding a collection spread all over the world; secondly, that digital artwork duplication and extension of reality entail new forms of accessibility; thirdly, that collection and preservation through digitization of images have both a social and an educational mission; and fourthly, that convergence of the properties of different media (such as web and radio) is key to encouraging people to get actively involved in digital exhibitions. The present analysis will suggest further research that should create museum models and interaction spaces that act as catalysts for innovation.
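
The optical-triangulation principle behind many of these scanners can be illustrated with a minimal depth calculation (a sketch; the baseline, focal length, and disparity values below are hypothetical, not from any museum system):

```python
def triangulation_depth(baseline_m, focal_px, disparity_px):
    """Depth from optical triangulation: z = f * b / d.

    baseline_m: distance between laser emitter and camera (illustrative)
    focal_px: camera focal length expressed in pixels
    disparity_px: observed shift of the laser spot on the sensor
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A closer surface produces a larger disparity, hence a smaller depth.
near = triangulation_depth(0.2, 1000.0, 50.0)   # ~4 m
far = triangulation_depth(0.2, 1000.0, 10.0)    # ~20 m
```

Time-of-flight scanners instead derive depth from the round-trip travel time of the laser pulse, which scales better to large distances at some cost in close-range accuracy.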

Keywords: digital masterpieces, education, interconnection, Italian museums, preservation

Procedia PDF Downloads 172
8734 Correlates of Multiplicity of Risk Behavior among Injecting Drug Users in Three High HIV Prevalence States of India

Authors: Santosh Sharma

Abstract:

Background: Drug abuse, needle sharing, and risky sexual behaviour often compound to increase the risk of HIV transmission. Injecting drug users are at the dual risk of needle sharing and risky sexual behaviour, becoming more vulnerable to STI and HIV. Thus, studying the interface of injecting drug use and risky sexual behaviour is important to curb the pace of the HIV epidemic among IDUs. The aim of this study is to determine the factors associated with HIV among injecting drug users in three states of India. Materials and methods: This paper analyzes covariates of multiplicity of risk behavior among injecting drug users. Findings are based on data from the Integrated Behavioral and Biological Assessment (IBBA), round 2, 2010. The IBBA collects information on IDUs from six districts. IDUs were selected on the criteria of being 18 years or older and having injected addictive substances/drugs for non-medical purposes at least once in the past six months. A total of 1,979 IDUs were interviewed in round 2 of the IBBA. The study employs quantitative techniques using standard statistical tools to achieve the above objectives. All results presented in this paper are unweighted univariate measures. Results: Among IDUs, the average duration of injecting drugs is 5.2 years. The mean duration between first drug use and first injecting drug use among younger IDUs (18-24 years) is 2.6 years. Needle cleaning practices are common, with over two-fifths reporting cleaning every time. Needle sharing is quite prevalent, especially among younger IDUs. Further, IDUs practicing needle sharing exhibit pervasive multi-partner behavior. Condom use with commercial partners is almost 81%, whereas with intimate partners it is 39%. The coexistence of needle sharing and unprotected sex increases STI prevalence (6.8%), which is further pronounced among the divorced/separated/widowed (9.4%). Conclusion: Working towards risk reduction for IDUs must deal with the multiplicity of risk. Interventions should address covariates of risk, targeting youth and risky sexual behavior.

Keywords: IDUs, HIV, STI, behaviour

Procedia PDF Downloads 276
8733 Identification of Author and Reviewer from Single and Double Blind Paper

Authors: Jatinderkumar R. Saini, Nikita R. Sonthalia, Khushbu A. Dodiya

Abstract:

Research leads to the development of science and technology and hence to the betterment of humankind. Journals and conferences provide a platform to receive a large number of research papers for publication and presentation before the expert and scientific community. In order to assure the quality of such papers, they are also sent to reviewers for their comments. In order to maintain good ethical standards, the research papers are sent to reviewers in such a way that they do not know each other's identity. This technique is called the double-blind review process. It is called the single-blind review process if the identity of only one party (generally the authors) is disclosed to the other. This paper presents the techniques by which the identity of the author as well as the reviewer could be made out even through a double-blind review process. It is proposed that the characteristics and techniques presented here will help journals and conferences in preventing intentional or unintentional disclosure of identity-revealing information by either party to the other.
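
As a rough illustration of the kind of stylometric cue such de-anonymization techniques can exploit, the sketch below compares function-word frequency profiles, a classic authorship signal that survives the removal of names and affiliations (the texts, word list, and threshold are invented for illustration, not the paper's method):

```python
from collections import Counter
import math

# A small set of function words; real stylometry uses far larger lists.
FUNCTION_WORDS = ["the", "of", "and", "to", "in", "we", "is", "that"]

def style_vector(text):
    """Relative frequencies of function words: a simple stylistic fingerprint."""
    words = text.lower().split()
    counts = Counter(words)
    total = len(words) or 1
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine(u, v):
    """Cosine similarity between two frequency vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u)) or 1.0
    nv = math.sqrt(sum(b * b for b in v)) or 1.0
    return dot / (nu * nv)

anonymous = "we propose that the method is applied to the data in the study"
candidate_a = "we argue that the approach is applied to the corpus in the end"
candidate_b = "results good very nice wow much improvement"

sim_a = cosine(style_vector(anonymous), style_vector(candidate_a))  # high
sim_b = cosine(style_vector(anonymous), style_vector(candidate_b))  # low
```

A reviewer or author profile with a consistently high similarity to the anonymous text is a candidate identity leak, which is exactly what submission-checking tools would flag.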

Keywords: author, conference, double blind paper, journal, reviewer, single blind paper

Procedia PDF Downloads 346
8732 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue

Authors: U. V. Suryawanshi, S. S. Chowhan, U. V. Kulkarni

Abstract:

The accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques that are used on brain MRI. A large variety of algorithms for segmentation of brain MRI has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), Kernel-based Fuzzy c-means clustering (KFCM), Spatial Fuzzy c-means (SFCM), and Improved Fuzzy c-means (IFCM). The review covers imaging modalities, MRI, and methods for noise reduction and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise; the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and changing norms in IFCM for better results.
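
A minimal sketch of the standard FCM iteration that the KFCM, SFCM, and IFCM variants build upon, shown here on 1-D intensities rather than full MR volumes (the data, fuzzifier m, and iteration count are illustrative, not the paper's settings):

```python
import random

def fcm_1d(xs, k=2, m=2.0, iters=50, seed=0):
    """Plain fuzzy c-means on 1-D intensities.

    Alternates the two classic updates:
      membership: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
      centers:    c_k  = sum_i u_ik^m * x_i / sum_i u_ik^m
    """
    rng = random.Random(seed)
    centers = rng.sample(xs, k)
    for _ in range(iters):
        u = []
        for x in xs:
            d = [abs(x - c) or 1e-12 for c in centers]  # avoid divide-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** (2 / (m - 1))
                                for j in range(k))
                      for i in range(k)])
        centers = [sum(u[n][i] ** m * xs[n] for n in range(len(xs)))
                   / sum(u[n][i] ** m for n in range(len(xs)))
                   for i in range(k)]
    return sorted(centers)

# Two well-separated intensity groups: centers converge near their means.
centers = fcm_1d([0.0, 0.1, 0.15, 0.85, 0.9, 1.0])
```

KFCM replaces the Euclidean distance with a kernel-induced distance, and SFCM/IFCM additionally weight memberships by spatial neighborhoods, which is what buys the robustness to salt-and-pepper noise reported above.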

Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM

Procedia PDF Downloads 324
8731 Design and Construction Validation of Pile Performance through High Strain Pile Dynamic Tests for both Contiguous Flight Auger and Drilled Displacement Piles

Authors: S. Pirrello

Abstract:

Sydney’s booming real estate market has pushed property developers to invest in historically “no-go” areas, which were previously too expensive to develop. These areas are usually near rivers, where the sites are underlain by deep alluvial and estuarine sediments. In these ground conditions, conventional bored pile techniques are often not competitive. Contiguous Flight Auger (CFA) and Drilled Displacement (DD) pile techniques are, on the other hand, well suited to these ground conditions. This paper deals with the design and construction challenges encountered with these piling techniques for a series of high-rise towers in Sydney’s West. The advantages of DD over CFA piles, such as reduced overall spoil with substantial cost savings and achievable rock sockets in medium-strength bedrock, are discussed. Design performances were assessed with PIGLET. Pile performances are validated in two stages: during construction, with the interpretation of real-time data from the piling rigs’ on-board computers, and after construction, with analyses of results from high strain pile dynamic testing (PDA). Results are then presented and discussed. High strain testing data are presented as Case Pile Wave Analysis Program (CAPWAP) analyses.

Keywords: contiguous flight auger (CFA), DEFPIG, case pile wave analysis program (CAPWAP), drilled displacement piles (DD), pile dynamic testing (PDA), PIGLET, PLAXIS, Repute, pile performance

Procedia PDF Downloads 276
8730 A Fully-Automated Disturbance Analysis Vision for the Smart Grid Based on Smart Switch Data

Authors: Bernardo Cedano, Ahmed H. Eltom, Bob Hay, Jim Glass, Raga Ahmed

Abstract:

The deployment of smart grid devices such as smart meters and smart switches (SS) supported by a reliable and fast communications system makes automated distribution possible, and thus, provides great benefits to electric power consumers and providers alike. However, more research is needed before the full utility of smart switch data is realized. This paper presents new automated switching techniques using SS within the electric power grid. A concise background of the SS is provided, and operational examples are shown. Organization and presentation of data obtained from SS are shown in the context of the future goal of total automation of the distribution network. The description of application techniques, the examples of success with SS, and the vision outlined in this paper serve to motivate future research pertinent to disturbance analysis automation.

Keywords: disturbance automation, electric power grid, smart grid, smart switches

Procedia PDF Downloads 303
8729 Challenge Appraisal Job, Hindrance Appraisal Job, and Negative Work-Life Interaction with the Mediating Role of Distress: A Survey on Sabah Public Secondary School Teachers

Authors: Pan Lee Ching, Chua Bee Seok

Abstract:

The experience of negative work-life interaction is often confronted with work-related stress, including workload. Whether a job is appraised as a challenge or a hindrance depends on the type of workload stimulating the stress response. Nevertheless, the effects of challenge and hindrance jobs on distress and negative work-life interaction are scarcely explored. Thus, the research objective was to examine the relationships among challenge appraisal job (qualitative workload), hindrance appraisal job (quantitative workload), and negative work-life interaction, with the mediating role of distress. A survey with a random sampling method was performed on currently serving public secondary school teachers in Sabah. The collected data comprised 447 respondents who completed three questionnaires, namely the Challenge-Hindrance Appraisal Scale, the Stress Professional Positive and Negative Questionnaire, and the Survey Work-Home Interaction Nijmegen (SWING). Partial Least Squares-Structural Equation Modeling (PLS-SEM) was used to analyse the mediation effects. Results showed that distress fully mediates the relationship between challenge appraisal job (qualitative workload) and negative work-life interaction; the indirect effect was significant and negative. Distress partially mediates the relationship between hindrance appraisal job (quantitative workload) and negative work-life interaction; the indirect effect was significant and positive. The study implies that a challenge appraisal job can be a positive resource for teachers, facilitating work and life, whereas a hindrance appraisal job can disrupt this facilitation. Hence, strengthening challenge appraisal jobs and controlling hindrance appraisal jobs could curb distress at work and underpin life interaction among the teachers.
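
The mediation logic being tested can be sketched with plain regression-based path estimates (a deliberate simplification of PLS-SEM; the scores below are invented to show a full-mediation pattern, not the survey data):

```python
def mediation(x, m, y):
    """Regression-based mediation: a = slope of M on X; b and the direct
    effect c' come from the two-predictor regression Y ~ X + M.
    Indirect effect = a * b."""
    n = len(x)
    cx = [v - sum(x) / n for v in x]
    cm = [v - sum(m) / n for v in m]
    cy = [v - sum(y) / n for v in y]
    sxx = sum(p * p for p in cx)
    smm = sum(p * p for p in cm)
    sxm = sum(p * q for p, q in zip(cx, cm))
    sxy = sum(p * q for p, q in zip(cx, cy))
    smy = sum(p * q for p, q in zip(cm, cy))
    a = sxm / sxx                               # path a: X -> M
    det = sxx * smm - sxm * sxm                 # normal-equation determinant
    direct = (sxy * smm - smy * sxm) / det      # c': X -> Y controlling M
    b = (smy * sxx - sxy * sxm) / det           # b : M -> Y controlling X
    return a * b, direct

# Hypothetical scores: workload (X), distress (M), negative interaction (Y).
indirect, direct = mediation([0, 1, 2, 3], [0, 2, 5, 6], [0, 6, 15, 18])
# direct ~ 0 here: consistent with full mediation through distress
```

Full mediation corresponds to a direct effect near zero once the mediator is controlled for; partial mediation leaves a significant direct effect alongside the indirect one.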

Keywords: challenge-hindrance job, distress, work-life, workload

Procedia PDF Downloads 187
8728 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids

Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone

Abstract:

Load forecasting plays a key role in making today's and the future's smart energy grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, which enables several features such as higher sustainability, better quality of service, and affordable electricity tariffs. While load forecasting is comparatively easy and effective at larger geographic scales, in Smart Micro Grids the lower available grid flexibility makes accurate prediction more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of Gain is introduced in this paper, which innovatively serves as an indicator of short-term prediction capability over consistent time spans. Two models, for 24-hour-ahead and 1-hour-ahead forecasting, are built to comprehensively compare the three techniques.
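
Of the three techniques, linear regression is the simplest to sketch; below is a minimal lag-1, 1-hour-ahead forecaster on hypothetical hourly readings (illustrative values, not GreenCom data):

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit y = a + b*x, the simplest of the three
    forecasting models compared in the paper."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# 1-hour-ahead model: predict the next load from the current load (lag-1).
history = [3.0, 3.2, 3.4, 3.6, 3.8, 4.0]   # hypothetical hourly kW readings
a, b = fit_linear(history[:-1], history[1:])
forecast = a + b * history[-1]             # next-hour prediction
abs_error = abs(forecast - 4.2)            # vs. a hypothetical actual of 4.2
```

Neural and radial basis function models replace the single linear map with learned nonlinear ones, at the cost of the much longer training times the evaluation measures.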

Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain

Procedia PDF Downloads 459
8727 Modern Spectrum Sensing Techniques for Cognitive Radio Networks: Practical Implementation and Performance Evaluation

Authors: Antoni Ivanov, Nikolay Dandanov, Nicole Christoff, Vladimir Poulkov

Abstract:

Spectrum underutilization has made cognitive radio a promising technology for both current and future telecommunications. This is due to the ability to exploit the unused spectrum in the bands dedicated to other wireless communication systems and, thus, increase their occupancy. The essential function which allows the cognitive radio device to perceive the occupancy of the spectrum is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filtering (MF), and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished through the PlutoSDR hardware platform and the GNU Radio software package in very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic, implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time, and required probability of false alarm).
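
A minimal sketch of the simplest of the four techniques, energy detection, on synthetic samples (the signal model and threshold are illustrative, not the PlutoSDR/GNU Radio implementation; in practice the threshold is calibrated to the required probability of false alarm):

```python
import random, math

def energy_detect(samples, threshold):
    """Energy detection: declare the band occupied when the average sample
    energy exceeds a threshold set by the target false-alarm probability."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold

rng = random.Random(1)
# Idle channel: noise only (average energy ~ noise variance).
noise = [rng.gauss(0, 0.1) for _ in range(1000)]
# Occupied channel: a tone plus the same noise.
signal = [math.sin(0.3 * n) + rng.gauss(0, 0.1) for n in range(1000)]

busy_idle = energy_detect(noise, threshold=0.05)
busy_occupied = energy_detect(signal, threshold=0.05)
```

ED needs no knowledge of the signal, which is why it degrades fastest at very low SNR; CSFD, MF, and EBD trade extra prior knowledge or computation for better low-SNR performance.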

Keywords: cognitive radio, dynamic spectrum access, GNU Radio, spectrum sensing

Procedia PDF Downloads 237
8726 The Risk of In-work Poverty and Family Coping Strategies

Authors: A. Banovcinova, M. Zakova

Abstract:

Labor market activity and paid employment should be a key factor in protecting individuals and families from falling into poverty and providing them with sufficient resources to meet the needs of their members. However, due to various processes in the labor market, as well as the influence of individual factors and often insufficient social capital, there is a relatively large group of households for whom paid employment does not eliminate poverty and who find themselves in a state of so-called in-work poverty. The aim of the research was to find out what strategies families use in managing poverty and meeting their needs, and which of these strategies prevail in the Slovak population. A quantitative research strategy was chosen. The method of data collection was a structured interview focused on the use of individual coping strategies and on selected demographic indicators. The research sample consisted of members of families in which at least one member has a paid job. The condition for inclusion in the research was that the family's income did not exceed 60% of the national median equivalized disposable income. The analysis of the results showed five basic areas to which coping strategies are related: work, financial security, needs, social contacts, and perception of the current situation. The prevailing strategies were those aimed at increasing and streamlining labor market activity and at the planned and effective management of the family budget. Strategies that were rejected were mainly related to debt creation. The results make it possible to identify the preferred ways of managing poverty in individual areas of life, as well as the factors that influence this behavior. This information is important for working with families living in in-work poverty and can help professionals develop positive coping strategies for families.

Keywords: coping strategies, family, in-work poverty, quantitative research

Procedia PDF Downloads 114
8725 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

With the abnormal growth in the use of computers over networks, and given the broad agreement among computer security experts that the goal of building a fully secure system is never effectively achieved, intrusion detection systems (IDS) have been designed. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of Swarm Intelligence; here, the algorithm was enhanced to obtain a minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a back-propagation neural network (BP NN). Finally, the results of the two methods were compared based on the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories (normal and abnormal). Practical experiments resulted in an intrusion detection rate of 99.183818% for the enhanced PSO and 69.446416% for the BP neural network.
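
A textbook global-best PSO on a toy objective illustrates the velocity and position updates that the enhanced variant modifies (a sketch only, not the authors' enhanced clustering algorithm; the objective, bounds, and coefficients are illustrative):

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=100, seed=0):
    """Global-best PSO: each particle's velocity is pulled toward its own
    best position (pbest) and the swarm's best (gbest):
        v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
    """
    rng = random.Random(seed)
    w, c1, c2 = 0.5, 1.5, 1.5
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]
    gbest = min(xs, key=f)
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
            if f(xs[i]) < f(gbest):
                gbest = xs[i]
    return gbest

best = pso_minimize(lambda x: (x - 3.0) ** 2, -10.0, 10.0)  # minimum at x = 3
```

In the clustering setting of the paper, the "position" is a set of cluster centers and the fitness is the classification error over the NSL-KDD training records.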

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 376
8724 Development of National Scale Hydropower Resource Assessment Scheme Using SWAT and Geospatial Techniques

Authors: Rowane May A. Fesalbon, Greyland C. Agno, Jodel L. Cuasay, Dindo A. Malonzo, Ma. Rosario Concepcion O. Ang

Abstract:

The Department of Energy of the Republic of the Philippines estimates that the country’s energy reserves for 2015 are dwindling, as observed in the rotating power outages in several localities. To help address the energy crisis, a national hydropower resource assessment scheme is developed. Hydropower is a resource derived from flowing water and a difference in elevation. It is a renewable energy resource that is deemed abundant in the Philippines, an archipelagic country that is rich in bodies of water and water resources. The objective of this study is to develop a methodology for a national hydropower resource assessment using hydrologic modeling and geospatial techniques, in order to generate resource maps for future reference and use by the government and other stakeholders. The methodology developed for this purpose is focused on two models: the implementation of the Soil and Water Assessment Tool (SWAT) for the river discharge, and the use of geospatial techniques to analyze the topography, obtain the head, and generate the theoretical hydropower potential sites. The methodology is tightly coupled with Geographic Information Systems to maximize the use of geodatabases and the spatial significance of the determined sites. The hydrologic model used in this workflow is SWAT integrated into the GIS software ArcGIS. The head is determined by a developed algorithm that utilizes a Synthetic Aperture Radar (SAR)-derived digital elevation model (DEM) with a resolution of 10 meters. The initial results of the developed workflow indicate theoretical hydropower potential in the river reaches ranging from pico scale (less than 5 kW) to mini scale (1-3 MW).
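
The theoretical potential at a candidate site combines the SWAT-modeled discharge with the DEM-derived head through the standard relation P = ρ·g·Q·H·η; a sketch with hypothetical discharge, head, and efficiency values (not results from the study):

```python
def hydropower_kw(flow_m3s, head_m, efficiency=0.85):
    """Theoretical hydropower potential in kW: P = rho * g * Q * H * eta.

    flow_m3s: river discharge Q in cubic meters per second
    head_m:   elevation difference H in meters
    """
    rho, g = 1000.0, 9.81   # water density (kg/m^3), gravity (m/s^2)
    return rho * g * flow_m3s * head_m * efficiency / 1000.0

pico = hydropower_kw(0.02, 10.0)   # small stream: pico scale (< 5 kW)
mini = hydropower_kw(6.0, 30.0)    # larger reach: mini scale (1-3 MW)
```

Sweeping this calculation over every river reach, with Q from SWAT and H from the SAR-derived DEM, yields the classified potential map described above.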

Keywords: ArcSWAT, renewable energy, hydrologic model, hydropower, GIS

Procedia PDF Downloads 306
8723 The Journey of a Malicious HTTP Request

Authors: M. Mansouri, P. Jaklitsch, E. Teiniker

Abstract:

SQL injection on web applications is a very popular kind of attack. There are mechanisms such as intrusion detection systems in order to detect this attack. These strategies often rely on techniques implemented at high layers of the application but do not consider the low level of system calls. The problem of only considering the high level perspective is that an attacker can circumvent the detection tools using certain techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or Web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment in order to demonstrate the suitability of system call analysis for detecting SQL injection. We are able to detect the attack. Therefore we conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
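
A simplified sketch of the idea: payloads captured at the system call boundary are first normalized (URL-decoded), which defeats the encoding evasion mentioned above, and then matched against SQL injection patterns (the trace lines and regex are illustrative, not the paper's detector):

```python
import re

# Hypothetical excerpt of an strace-style capture of a web server's
# read() system calls; the second request hides "' OR '1'='1" via %xx encoding.
trace = [
    'read(4, "GET /item?id=42 HTTP/1.1", 8192)',
    'read(4, "GET /item?id=1%27%20OR%20%271%27=%271 HTTP/1.1", 8192)',
]

def decode(line):
    """URL-decode %xx escapes so encoded attacks are normalized first."""
    return re.sub(r"%([0-9A-Fa-f]{2})",
                  lambda m: chr(int(m.group(1), 16)), line)

# A toy tautology pattern such as ' OR '1'=
SQLI = re.compile(r"('|\")\s*(OR|AND)\s*('|\")?1('|\")?\s*=", re.IGNORECASE)

alerts = [line for line in trace if SQLI.search(decode(line))]
```

Because every request ultimately crosses the kernel boundary as a system call argument, this vantage point sees the decoded attack regardless of which application-layer tricks were used upstream.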

Keywords: Linux system calls, web attack detection, interception, SQL

Procedia PDF Downloads 351
8722 Investigating the Contribution of Road Construction on Soil Erosion, a Case Study of Engcobo Local Municipality, Chris Hani District, South Africa

Authors: Yamkela Zitwana

Abstract:

Soil erosion along roads and road riparian areas has become the norm in the Eastern Cape. Soil erosion refers to the detachment and transportation of soil from one area (onsite) to another (offsite). This displacement or removal of soil can be caused by water, wind, and sometimes gravity. This study will focus on accelerated soil erosion, which is the result of human interference with the environment. Engcobo Local Municipality falls within the Eastern Cape Province, on the eastern side of the Chris Hani District Municipality. The focus road is the R61, running from the outskirts of Engcobo town past the Nyanga SSS towards Umtata, although the study will also cover a few kilometers away from Engcobo. This research aims to examine the contribution of road construction to soil erosion. Steps to achieve this will involve revisiting the phases of road construction through unstructured interviews; identifying the types of soil erosion evident in the area using a checklist; checking the materials, utensils, and equipment used for road construction; and assessing the contribution of road construction through stratified random sampling, checking soil color and texture. This research will use a pragmatic approach that combines related methods and considers the flaws of each method so as to ensure validity, precision, and accuracy. Both qualitative and quantitative methods will be used. Statistical methods and GIS analysis will be used to analyze the collected data.

Keywords: soil erosion, road riparian, accelerated soil erosion, road construction, sampling, universal soil loss model, GIS analysis, focus groups, qualitative, quantitative method, research, checklist questionnaires, unstructured interviews, pragmatic approach

Procedia PDF Downloads 384
8721 Concept Drifts Detection and Localisation in Process Mining

Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa

Abstract:

Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, state-of-the-art techniques available in process mining treat the operational process as a static (stationary) entity. This is often not the case due to the possibility of a phenomenon called concept drift. During the period of execution, the process can experience concept drift and can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at different paces. The work presented in this paper discusses the main aspects to consider when addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process, using features extracted by processing the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the context of the process mining research discipline.
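
One simple way to detect a sudden control-flow drift, in the spirit of the feature-based approach though not the paper's exact algorithm, is to compare the direct-follows distributions of two adjacent windows of traces (the event log below is invented):

```python
from collections import Counter

def transition_profile(traces):
    """Relative frequencies of direct-follows pairs, a basic
    control-flow feature extracted from the traces."""
    counts = Counter()
    for trace in traces:
        counts.update(zip(trace, trace[1:]))
    total = sum(counts.values()) or 1
    return {pair: c / total for pair, c in counts.items()}

def drift_score(window_a, window_b):
    """L1 distance between the direct-follows distributions of two
    windows; a large value localizes a sudden drift at the boundary."""
    pa, pb = transition_profile(window_a), transition_profile(window_b)
    keys = set(pa) | set(pb)
    return sum(abs(pa.get(k, 0.0) - pb.get(k, 0.0)) for k in keys)

before = [["register", "check", "approve"]] * 10
after = [["register", "approve", "check"]] * 10   # two steps swapped

score = drift_score(before, after)       # large: sudden control-flow drift
baseline = drift_score(before, before)   # zero: no drift
```

Sliding this comparison along the log and flagging boundaries where the score spikes both detects the drift and localizes the point of change.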

Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining

Procedia PDF Downloads 341
8720 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem

Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury

Abstract:

Constraint Satisfaction Problems (CSPs) are observed in various applications, e.g., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows a different approach and way to solve a problem network. In our exhaustive study, it has been possible to visualize the processes of essential CSP algorithms on a very concrete constraint satisfaction example, the N-Queens problem, in order to gain a deep understanding of how a particular constraint satisfaction problem will be dealt with by our studied and implemented techniques. Besides, benchmark results (time vs. value of N in N-Queens) have been generated from our implemented approaches, which help show how each algorithm scales as N grows in the N-Queens puzzle. Thus, informed decisions can be made when instantiating a real-life problem within the CSP framework.
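
A minimal backtracking (BT) solver for N-Queens, the baseline that forward checking, MAC, and the variable/value ordering heuristics improve upon (a sketch, not the benchmarked implementation):

```python
def n_queens(n):
    """Backtracking (BT): assign one queen per row, checking the binary
    constraints (no shared column, no shared diagonal) on each assignment.
    Returns the number of complete solutions."""
    def place(row, cols):
        if row == n:
            return 1
        count = 0
        for col in range(n):
            # constraint check against every previously placed queen
            if all(c != col and abs(c - col) != row - r
                   for r, c in enumerate(cols)):
                count += place(row + 1, cols + [col])
        return count
    return place(0, [])

solutions_6 = n_queens(6)   # 4 solutions
solutions_8 = n_queens(8)   # 92 solutions
```

Forward checking would instead prune the remaining rows' domains after each assignment, and MRV/LCV would reorder which row and column to try next; the benchmark curves compare exactly these refinements as N grows.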

Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem

Procedia PDF Downloads 357
8719 Computer-Aided Diagnosis System Based on Multiple Quantitative Magnetic Resonance Imaging Features in the Classification of Brain Tumor

Authors: Chih Jou Hsiao, Chung Ming Lo, Li Chun Hsieh

Abstract:

Brain tumor is not a cancer with a high incidence rate, but its high mortality rate and poor prognosis still make it a major concern. On clinical examination, the grading of brain tumors depends on pathological features. However, histopathological analysis has weak points that can cause misgrading. For example, interpretations can vary in the absence of a well-known definition. Furthermore, the heterogeneity of malignant tumors makes it a challenge to extract meaningful tissue under surgical biopsy. With the development of magnetic resonance imaging (MRI), tumor grading can be accomplished by a noninvasive procedure. To further improve diagnostic accuracy, this study proposed a computer-aided diagnosis (CAD) system based on MRI features to provide suggestions for tumor grading. Gliomas are the most common type of malignant brain tumor (about 70%). This study collected 34 glioblastomas (GBMs) and 73 lower-grade gliomas (LGGs) from The Cancer Imaging Archive. After defining the regions of interest in the MRI images, multiple quantitative morphological features, such as region perimeter, region area, compactness, the mean and standard deviation of the normalized radial length, and moment features, were extracted from the tumors for classification. As a result, two of the five morphological features and three of the four image moment features achieved p-values of <0.001, and the remaining moment feature had a p-value of <0.05. Using the combination of all features, the CAD system achieved an accuracy of 83.18% in classifying the gliomas into LGG and GBM, with a sensitivity of 70.59% and a specificity of 89.04%. The proposed system can act as a second reader on clinical examinations for radiologists.
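
One of the listed morphological features, compactness, is commonly defined as perimeter² / (4π · area); the sketch below uses that common definition (the paper's exact formulation may differ):

```python
import math

def compactness(perimeter, area):
    """Compactness = perimeter^2 / (4 * pi * area).

    Equals 1.0 for a perfect circle and grows for irregular shapes,
    which is why it helps flag tumors with spiculated margins."""
    return perimeter ** 2 / (4 * math.pi * area)

r = 5.0
circle = compactness(2 * math.pi * r, math.pi * r ** 2)   # exactly 1 (circle)
square = compactness(4 * 4.0, 4.0 ** 2)                   # > 1 (square)
```

Together with the radial-length statistics and moment features, such scalar shape descriptors form the feature vector the classifier uses to separate LGG from GBM.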

Keywords: brain tumor, computer-aided diagnosis, gliomas, magnetic resonance imaging

Procedia PDF Downloads 251
8718 The Ethics of Corporate Social Responsibility Statements in Undercutting Sustainability: A Communication Perspective

Authors: Steven Woods

Abstract:

The use of Corporate Social Responsibility (CSR) statements has become ubiquitous in society. Appealing to consumers as a well-behaved social entity has become a strategy not just to ensure brand loyalty but also to further larger-scale projects of corporate interest. Specifically, the use of CSR to position corporations as good planetary citizens involves not just self-promotion but also a way of transferring responsibility from systems to individuals. By using techniques labeled as “greenwashing” and emphasizing ethical consumption choices as the solution, corporations present themselves as good members of the community pursuing sustainability. Ultimately, the primary function of CSR statements is to maintain the economic status quo of ongoing growth and consumption while presenting an environmentally progressive image to the public, as well as reassuring it that corporate behavior is superior to government intervention. By analyzing the communication techniques utilized, through content analysis of specific examples along with an analysis of the frames of meaning constructed in the CSR statements, the practices of corporate responsibility and sustainability will be addressed from an ethical perspective.

Keywords: corporate social responsibility, ethics, greenwashing, sustainability

Procedia PDF Downloads 67
8717 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan

Authors: Feras Hanandeh, Majdi Shannag

Abstract:

This research studies the different factors that could affect the accumulative average of students at the Faculty of Information Technology in Hashemite University. The paper examines student information, background, and academic records, and how this information affects whether a student attains high grades. The student information used in the study is extracted from the students’ academic records. Data mining tools and techniques are used to decide which attribute(s) affect the students’ accumulative average. The results show that the most important factor affecting the students’ accumulative average is the student acceptance type. We also built a decision tree model and rules to determine how a student can get high grades in their courses. The overall accuracy of the model is 44%, which is an acceptable rate.
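
A decision-tree learner picks its splitting attribute by information gain; the sketch below shows that computation on invented student records (the attribute names and values are hypothetical, not the Hashemite data or the study's tool):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(S) of a class-label list, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(labels).values())

def information_gain(columns, attr, labels):
    """Gain(S, A) = H(S) - sum_v (|S_v| / |S|) * H(S_v): the criterion a
    decision-tree learner uses to choose the splitting attribute."""
    base = entropy(labels)
    split = {}
    for value, label in zip(columns[attr], labels):
        split.setdefault(value, []).append(label)
    n = len(labels)
    return base - sum(len(s) / n * entropy(s) for s in split.values())

# Hypothetical records: acceptance type separates the classes, gender does not.
data = {"acceptance": ["regular", "regular", "parallel", "parallel"],
        "gender":     ["m", "f", "m", "f"]}
grades = ["high", "high", "low", "low"]

gain_acceptance = information_gain(data, "acceptance", grades)  # 1 bit
gain_gender = information_gain(data, "gender", grades)          # 0 bits
```

An attribute like acceptance type that cleanly partitions the class labels gets the maximal gain and therefore ends up at the root of the induced tree, mirroring the finding above.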

Keywords: data mining, classification, extracting rules, decision tree

Procedia PDF Downloads 409
8716 Comparing Business Excellence Models Using Quantitative Methods: A First Step

Authors: Mohammed Alanazi, Dimitrios Tsagdis

Abstract:

Established Business Excellence Models (BEMs), like the Malcolm Baldrige National Quality Award (MBNQA) model and the European Foundation for Quality Management (EFQM) model, have been adopted by firms all over the world. They exist alongside more recent country-specific BEMs, e.g. the Australian, Canadian, Chinese, New Zealand, Singaporean, and Taiwanese quality awards, which, although not as widespread as the MBNQA and EFQM, nonetheless have strong national followings. Regardless of any differences in their following or prestige, the emergence and development of all BEMs have been shaped both by their local context (e.g. underlying socio-economic dynamics) and by global best practices. Besides such similarities, which render them into objects (i.e. models) of the same class (i.e. BEMs), BEMs exhibit non-trivial differences in their criteria, relations, and emphasis. Given the evolution of BEMs (e.g. the MBNQA has undergone seven revisions since its inception in 1987, and the EFQM five since 1993), it is unsurprising that comparative studies of their validity are few and far between. This poses challenges for practitioners and policy makers alike, as it is not always clear which BEM is to be preferred or better fits a particular context, especially contexts that differ substantially from the original context of BEM development. This paper aims to fill this gap by presenting a research design and measurement model for comparing BEMs using quantitative methods (e.g. structural equation modeling). Three BEMs are focused upon for illustration purposes: the MBNQA, the EFQM, and the King Abdul Aziz Quality Award (KAQA) model. They have been selected so as to reflect the two established and widely spread traditions as well as a more recent, context-specific arrival promising a better fit.
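
On the measurement-model side of such a comparison, the internal consistency of each construct's survey items is commonly checked before structural equations are fitted; a standard first step is Cronbach's alpha. A minimal sketch in Python, using invented Likert responses for a hypothetical "leadership" construct:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for k items, each a list of respondent scores.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    (population variances are used consistently throughout).
    """
    k = len(item_scores)
    item_var = sum(pvariance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# Hypothetical 5-point Likert responses: three items, six respondents
leadership_items = [
    [4, 5, 3, 4, 5, 4],   # item 1
    [4, 4, 3, 5, 5, 4],   # item 2
    [3, 5, 3, 4, 4, 4],   # item 3
]
alpha = cronbach_alpha(leadership_items)
print(round(alpha, 2))
```

Values above roughly 0.7 are conventionally read as acceptable consistency; only after such checks pass would the structural paths between BEM criteria be compared across models.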

Keywords: Baldrige, business excellence, European Foundation for Quality Management, Structural Equation Model, total quality management

Procedia PDF Downloads 233
8715 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges

Authors: T. Gayen

Abstract:

Although several software reliability models exist today, no versatile model can be used for the reliability assessment of all software. Complex software has a large number of states (unlike hardware), so it becomes practically difficult to test the software completely. Irrespective of the amount of testing one does, it is sometimes extremely difficult to assure that the final software product is fault-free. Black-box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Mission-critical applications need to be highly reliable, yet it is not always possible to ensure the development of a highly reliable system. Hence, in order to achieve fault-free operation of software, one develops mechanisms to handle faults remaining in the system even after development. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be suitable for every system. Hence, this discussion focuses on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of various software systems.

Keywords: black box, fault tolerance, failure, software reliability

Procedia PDF Downloads 422
8714 Secret Sharing in Visual Cryptography Using NVSS and Data Hiding Techniques

Authors: Misha Alexander, S. B. Waykar

Abstract:

Visual cryptography is a special unbreakable encryption technique that transforms a secret image into random, noise-like shares. These shares are transmitted over the network, and their noisy texture attracts the attention of attackers. To address this issue, a Natural Visual Secret Sharing (NVSS) scheme was introduced that uses natural shares, either in digital or printed form, to generate the noisy secret share. This scheme greatly reduces the transmission risk but causes distortion in the retrieved secret image through variation in the settings and properties of the digital devices used to capture the natural image during the encryption/decryption phase. This paper proposes a new NVSS scheme that extracts the secret key from randomly selected, unaltered multiple natural images. To further improve the security of the shares, data hiding techniques such as steganography and alpha-channel watermarking are proposed.
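
The basic idea of noise-like shares can be sketched with a (2, 2) XOR secret-sharing scheme; this is a generic textbook construction used here for illustration, not the NVSS scheme itself, and the "image" is a handful of invented pixel values:

```python
import random

def make_shares(secret_pixels, seed=None):
    """Split grayscale pixel values (0-255) into two noise-like shares."""
    rng = random.Random(seed)
    share1 = [rng.randrange(256) for _ in secret_pixels]      # pure noise
    share2 = [p ^ s for p, s in zip(secret_pixels, share1)]   # secret XOR noise
    return share1, share2

def recover(share1, share2):
    """XOR the shares pixel-wise to reconstruct the secret exactly."""
    return [a ^ b for a, b in zip(share1, share2)]

secret = [0, 17, 128, 255, 42, 200]   # a tiny hypothetical "image"
s1, s2 = make_shares(secret, seed=1)
print(recover(s1, s2) == secret)  # True: reconstruction is lossless
```

Each share on its own is statistically independent of the secret, so it reveals nothing; but its uniformly random appearance is exactly the conspicuous "noisy texture" that motivates hiding shares inside natural images.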

Keywords: decryption, encryption, natural visual secret sharing, natural images, noisy share, pixel swapping

Procedia PDF Downloads 397
8713 Achieving Success in NPD Projects

Authors: Ankush Agrawal, Nadia Bhuiyan

Abstract:

The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which new products emerge is important. While many studies have been conducted on critical success factors for NPD, these studies tend to be fragmented and to focus on one or a few phases of the NPD process. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing those metrics at each stage of the NPD process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms whose new products succeeded on the market. The paper summarizes NPD success factors, suggests metrics to measure these factors, and proposes tools and techniques to make use of the metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects.

Keywords: new product development, performance, critical success factors, framework

Procedia PDF Downloads 395
8712 Special Features of Phacoemulsification Technique for Dense Cataracts

Authors: Shilkin A.G., Goncharov D.V., Rotanov D.A., Voitecha M.A., Kulyagina Y.I., Mochalova U.E.

Abstract:

Context: Phacoemulsification is a surgical technique used to remove cataracts, but it carries a higher number of complications when dense cataracts are present. The risk factors include a thin posterior capsule, dense nucleus fragments, and prolonged exposure to high-power ultrasound. To minimize these complications, various methods are used. Research aim: The aim of this study is to develop and implement optimal methods of ultrasound phacoemulsification for dense cataracts in order to minimize postoperative complications. Methodology: The study involved 36 eyes of dogs with dense cataracts over a period of 5 years. The surgeries were performed using a LEICA 844 surgical microscope and an Oertli Faros phacoemulsifier. The surgical techniques included the optimal technique for breaking the nucleus, bimanual surgery, and the use of Akahoshi prechoppers. Findings: The complications observed during surgery included rupture of the posterior capsule and the need for anterior vitrectomy. Complications in the postoperative period included corneal edema and uveitis. Theoretical importance: This study contributes to the field by providing insights into the special features of phacoemulsification for dense cataracts. It highlights the importance of using specific techniques and settings to minimize complications. Data collection and analysis procedures: The data for the study were collected from surgeries performed on dogs with dense cataracts. The complications were documented and analyzed. Question addressed: The study addressed the question of how to minimize complications during phacoemulsification surgery for dense cataracts. Conclusion: By following the optimal techniques and settings and by using prechoppers, surgery for dense cataracts can be made safer and faster, minimizing the risks and complications.

Keywords: dense cataracts, phacoemulsification, phacoemulsification of cataracts in elderly dogs, complications of phacoemulsification

Procedia PDF Downloads 56
8711 Cryptic Diversity: Identifying Two Morphologically Similar Species of Invasive Apple Snails in Peninsular Malaysia

Authors: Suganiya Rama Rao, Yoon-Yen Yow, Thor-Seng Liew, Shyamala Ratnayeke

Abstract:

Invasive snails in the genus Pomacea have spread across Southeast Asia, including Peninsular Malaysia. Apart from significant economic costs to wetland crops, very little is known about the snails’ effects on native species and on wetland function through their alteration of macrophyte communities. This study was conducted to establish diagnostic characteristics of Pomacea species in the Malaysian environment using genetic and morphological criteria. Snails were collected from eight localities in the northern and central regions of Peninsular Malaysia. The mitochondrial COI gene of 52 adult snails was amplified and sequenced. Maximum likelihood analysis was used to analyse species identity and assess phylogenetic relationships among snails from different geographic locations. Shells of the two species were compared using geometric morphometric and covariance analyses. Shell height accounted for most of the observed variation between P. canaliculata and P. maculata, with the latter possessing a smaller mean ratio of shell height to aperture height (p < 0.0001) and of shell height to shell width (p < 0.0001). Genomic and phylogenetic analysis demonstrated the presence of two monophyletic taxa, P. canaliculata and P. maculata, in samples from Peninsular Malaysia. P. maculata co-occurred with P. canaliculata in 5 localities, but samples from 3 localities contained only P. canaliculata. This study is the first to confirm the presence of two of the most invasive species of Pomacea in Peninsular Malaysia using a genomic approach. P. canaliculata appears to be the more widespread species. Despite statistical differences, both quantitative and qualitative morphological characteristics demonstrate much interspecific overlap and intraspecific variability; thus, morphology alone cannot reliably verify species identity. Molecular techniques for distinguishing between these two highly invasive Pomacea species are needed to understand their specific ecological niches and to develop effective protocols for their management.
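
The kind of ratio comparison described above can be sketched with a Welch's t statistic in Python; the shell-height to aperture-height ratios below are invented for illustration and are not the study's measurements:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two samples with unequal variances."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)   # sample variances
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Hypothetical shell height : aperture height ratios per specimen
canaliculata = [1.42, 1.38, 1.45, 1.40, 1.44, 1.39]
maculata     = [1.25, 1.28, 1.22, 1.30, 1.24, 1.27]

t = welch_t(canaliculata, maculata)
print(t > 2)  # a large positive t: canaliculata ratios are higher
```

In practice a p-value would then be read from the t distribution with Welch-Satterthwaite degrees of freedom, e.g. via scipy.stats.ttest_ind(..., equal_var=False); the abstract's point stands regardless: a significant difference in means does not preclude the overlap between individuals that makes morphology alone unreliable.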

Keywords: Pomacea canaliculata, Pomacea maculata, invasive species, phylogenetic analysis, geometric morphometric analysis

Procedia PDF Downloads 254
8710 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is extremely difficult because stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy compared with traditional statistical methodologies. The proposed research examines the need for soft computing techniques in stock market prediction, the soft computing approaches that are important to the field, past work in the area and its prominent features, and the significant problems the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely volatile, and it is unquestionably difficult to predict it correctly from a given set of characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
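
Of the soft computing tools the survey covers, fuzzy logic is the simplest to sketch. The fragment below maps a daily percentage price change to graded category memberships using triangular membership functions; the category names and breakpoints are arbitrary illustrations, not recommendations from the literature:

```python
def triangular(x, left, peak, right):
    """Triangular fuzzy membership: 0 outside [left, right], 1 at peak."""
    if x <= left or x >= right:
        return 0.0
    if x <= peak:
        return (x - left) / (peak - left)
    return (right - x) / (right - peak)

def fuzzify_change(pct_change):
    """Map a daily % price change to fuzzy category memberships."""
    return {
        "falling": triangular(pct_change, -5.0, -2.0, 0.0),
        "stable":  triangular(pct_change, -1.0,  0.0, 1.0),
        "rising":  triangular(pct_change,  0.0,  2.0, 5.0),
    }

# A +0.5% day is partly "stable" and partly "rising", with graded degrees
# rather than a single hard label.
print(fuzzify_change(0.5))
```

Fuzzy rule bases built on such memberships (and often hybridized with neural networks into neuro-fuzzy systems) are one of the soft computing families the abstract refers to.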

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 85
8709 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition

Authors: Jacqueline Żammit

Abstract:

Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.

Keywords: visual culture, right-brain strategies, second language acquisition, Maltese as a second language, visual aids, language-based activities

Procedia PDF Downloads 54
8708 Mending Broken Fences Policing: Developing the Intelligence-Led/Community-Based Policing Model(IP-CP) and Quality/Quantity/Crime(QQC) Model

Authors: Anil Anand

Abstract:

Despite enormous strides made during the past decade, particularly with the adoption and expansion of community policing, there remains much that police leaders can do to improve police-public relations. The urgency is particularly evident in cities across the United States and Europe where an increasing number of police interactions over the past few years have ignited large, sometimes even national, protests against police policy and strategy, highlighting a gap between what police leaders feel they have achieved in terms of public satisfaction, support, and legitimacy and the perception of bias among many marginalized communities. The decision on which policing strategy is chosen over another, how many resources are allocated, and how strenuously the policy is applied resides primarily with the police and the units and subunits tasked with its enforcement. The scope and opportunity that police officers have to shape social attitudes and social policy are important elements that cannot be overstated. How do police leaders, for instance, decide when to apply one strategy, say community-based policing, over another, like intelligence-led policing? How do police leaders measure performance and success? Should these measures be based on quantitative preferences over qualitative, or should the preference be based on some other criteria? And how do police leaders define, allow, and control discretionary decision-making? Mending Broken Fences Policing provides police and security services leaders with a model based on social cohesion that incorporates intelligence-led and community policing (IP-CP), supplemented by a quality/quantity/crime (QQC) framework, to provide a four-step process for the articulable application of police intervention, performance measurement, and the application of discretion.

Keywords: social cohesion, quantitative performance measurement, qualitative performance measurement, sustainable leadership

Procedia PDF Downloads 289
8707 Carbohydrate-Based Recommendations as a Basis for Dietary Guidelines

Authors: A. E. Buyken, D. J. Mela, P. Dussort, I. T. Johnson, I. A. Macdonald, A. Piekarz, J. D. Stowell, F. Brouns

Abstract:

Recently, a number of renewed dietary guidelines have been published by various health authorities. The aim of the present work was 1) to review the processes (systematic approach/review, inclusion of public consultation) and methodological approaches used to identify and select the underpinning evidence base for the established recommendations for total carbohydrate (CHO), fiber, and sugar consumption, and 2) to examine how differences in the methods and processes applied may have influenced the final recommendations. A search of WHO, US, Canadian, Australian, and European sources identified 13 authoritative dietary guidelines with the desired detailed information. Each of these guidelines was evaluated for its scientific basis (types and grading of the evidence) and the processes by which the guidelines were developed. Based on the data retrieved, the following conclusions can be drawn: 1) Generally, a relatively high total CHO and fiber intake and a limited intake of sugars (added or free) are recommended. 2) Even where recommendations are quite similar, the specific justifications for quantitative/qualitative recommendations differ across authorities. 3) Differences appear to be due to inconsistencies in the underlying definitions of CHO exposure and in the concurrent appraisal of CHO-providing foods and nutrients, as well as the choice and number of health outcomes selected for the evidence appraisal. 4) Differences in the selected articles, time frames, or data aggregation methods appeared to be of rather minor influence. From this assessment, the main recommendations are for: 1) more explicit quantitative justifications for numerical guidelines and communication of uncertainty; and 2) greater international harmonization, particularly with regard to the underlying definitions of exposures and the range of relevant nutrition-related outcomes.

Keywords: carbohydrates, dietary fibres, dietary guidelines, recommendations, sugars

Procedia PDF Downloads 252
8706 ViraPart: A Text Refinement Framework for Automatic Speech Recognition and Natural Language Processing Tasks in Persian

Authors: Narges Farokhshad, Milad Molazadeh, Saman Jamalabbasi, Hamed Babaei Giglou, Saeed Bibak

Abstract:

Persian is an inflectional subject-object-verb language, which makes it comparatively ambiguous to process. However, techniques such as Zero-Width Non-Joiner (ZWNJ) recognition, punctuation restoration, and Persian Ezafe construction lead to a more understandable and precise text. In most work on Persian, these techniques are addressed individually. Nevertheless, we believe that for text refinement in Persian, all of these tasks are necessary. In this work, we propose the ViraPart framework, which uses an embedded ParsBERT at its core for text clarification. First, we used the BERT variant for Persian, followed by a classifier layer for the classification procedures. Next, we combined the models' outputs to produce clear text. In the end, the proposed models for ZWNJ recognition, punctuation restoration, and Persian Ezafe construction achieve averaged macro F1 scores of 96.90%, 92.13%, and 98.50%, respectively. Experimental results show that our proposed approach is very effective for text refinement in the Persian language.
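
The averaged macro F1 metric reported above weights every class equally regardless of class frequency: compute F1 per class, then take the unweighted mean. A minimal sketch in Python (the token labels are hypothetical, not ViraPart outputs):

```python
def f1_macro(y_true, y_pred):
    """Macro-averaged F1: unweighted mean of per-class F1 scores."""
    classes = sorted(set(y_true) | set(y_pred))
    scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        scores.append(f1)
    return sum(scores) / len(scores)

# Hypothetical per-token labels for a ZWNJ task: 1 = insert ZWNJ, 0 = none
y_true = [1, 0, 0, 1, 1, 0, 0, 1]
y_pred = [1, 0, 0, 1, 0, 0, 1, 1]
print(round(f1_macro(y_true, y_pred), 3))  # 0.75
```

Because minority classes count as much as majority ones, macro F1 is a stricter summary than accuracy for imbalanced token-labeling tasks like ZWNJ insertion; scikit-learn's f1_score(..., average="macro") computes the same quantity.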

Keywords: Persian Ezafe, punctuation, ZWNJ, NLP, ParsBERT, transformers

Procedia PDF Downloads 206