Search results for: proportional representation

885 Analysing Social Media Coverage of Political Speeches in Relation to Discourse and Context

Authors: Yaser Mohammed Altameemi

Abstract:

This research examines how social media represented the Saudi Government decrees concerning the developmental projects of the Saudi Vision 2030. The paper analyses a television interview, broadcast on 28 April 2021, in which Crown Prince Mohammed Bin Salman discusses the progress of Vision 2030 and how the government responded to the COVID-19 pandemic. It then analyses the tweets on Twitter that cover the interview in order to investigate how concepts and meanings developed regarding Saudi people's orientations towards these projects. The data include all related tweets posted on the day of the interview and during the following seven days. The collocation analysis suggests that the notion of nationalism is explicitly expressed by users on Twitter, and the collocation network clearly highlights nationalism. The main finding of this paper is that further analysis of the concordance lines is needed.

Keywords: social media, Twitter, political interview, Prince Mohammed Bin Salman, Saudi Vision 2030

Procedia PDF Downloads 192
884 Polymeric Microspheres for Bone Tissue Engineering

Authors: Yamina Boukari, Nashiru Billa, Andrew Morris, Stephen Doughty, Kevin Shakesheff

Abstract:

Poly (lactic-co-glycolic) acid (PLGA) is a synthetic polymer that can be used in bone tissue engineering with the aim of creating a scaffold to support the growth of cells. The formation of microspheres from this polymer is an attractive strategy that would allow for the development of an injectable system, hence avoiding invasive surgical procedures. The aim of this study was to develop a microsphere delivery system for use as an injectable scaffold in bone tissue engineering and to evaluate the effect of various formulation parameters on its properties. Porous and lysozyme-containing PLGA microspheres were prepared using the double emulsion solvent evaporation method from PLGA of various molecular weights (MW). Scaffolds were formed by sintering to contain 1-3 mg of lysozyme per gram of scaffold. The mechanical and physical properties of the scaffolds were assessed along with the release of lysozyme, which was used as a model protein. The MW of PLGA was found to have an influence on microsphere size during fabrication, with increased MW leading to an increased microsphere diameter. The mechanical strength of the formed scaffolds was inversely proportional to PLGA MW across all loadings for the low, intermediate and high MW grades. Lysozyme release from both microspheres and formed scaffolds showed an initial burst release phase, with both microspheres and scaffolds fabricated using high MW PLGA showing the lowest protein release. Following the initial burst phase, the profiles for each MW followed a similar slow release over 30 days. Overall, the results of this study demonstrate that lysozyme can be successfully incorporated into porous PLGA scaffolds and released over 30 days in vitro, and that varying the MW of the PLGA can be used as a method of altering the physical properties of the resulting scaffolds.

Keywords: bone, microspheres, PLGA, tissue engineering

Procedia PDF Downloads 425
883 Threshold Concepts in TESOL: A Thematic Analysis of Disciplinary Guiding Principles

Authors: Neil Morgan

Abstract:

The notion of Threshold Concepts has offered a fertile new perspective on the transformative effects of mastery of particular concepts on student understanding of subject matter and their developing identities as inductees into disciplinary discourse communities. Only by successfully traversing key knowledge thresholds, it is claimed, can neophytes gain access to the more sophisticated understandings of subject matter possessed by mature members of a discipline. This paper uses thematic analysis of disciplinary guiding principles to identify nine candidate Threshold Concepts that appear to underpin effective TESOL practice. The relationship between these candidate TESOL Threshold Concepts, TESOL principles, and TESOL instructional techniques appears to be amenable to a schematic representation based on superordinate categories of TESOL practitioner concern and, as such, offers an alternative to the view of Threshold Concepts as a privileged subset of disciplinary core concepts. The paper concludes by exploring the potential of a Threshold Concepts framework to productively inform TESOL initial teacher education (ITE) and in-service education and training (INSET).

Keywords: TESOL, threshold concepts, TESOL principles, TESOL ITE/INSET, community of practice

Procedia PDF Downloads 143
882 Hierarchical Tree Long Short-Term Memory for Sentence Representations

Authors: Xiuying Wang, Changliang Li, Bo Xu

Abstract:

A fixed-length feature vector is required by many machine learning algorithms in the NLP field. Word embeddings have been very successful at learning lexical information. However, they cannot capture the compositional meaning of sentences, which limits their ability to support a deeper understanding of language. In this paper, we introduce a novel hierarchical tree long short-term memory (HTLSTM) model that learns vector representations for sentences of arbitrary syntactic type and length. We propose to split a sentence into three hierarchies: short phrase, long phrase and full sentence level. The HTLSTM model gives our algorithm the potential to fully consider the hierarchical information and long-term dependencies of language. We design experiments on both English and Chinese corpora to evaluate our model on a sentiment analysis task, and the results show that our model significantly outperforms several existing state-of-the-art approaches.
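
The abstract describes the hierarchy only at a high level. As background, the following is a minimal sketch of the standard child-sum Tree-LSTM node update (Tai et al., 2015), the kind of recursive composition on which such tree-structured sentence models are typically built; it is not the authors' exact HTLSTM, and all names and dimensions are illustrative.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    class ChildSumTreeLSTMCell:
        """One node update of a child-sum Tree-LSTM: the states of already
        composed sub-phrases are merged into the parent phrase/sentence state."""
        def __init__(self, x_dim, h_dim, seed=0):
            rng = np.random.default_rng(seed)
            self.W = {g: rng.normal(0, 0.1, (h_dim, x_dim)) for g in "ifou"}
            self.U = {g: rng.normal(0, 0.1, (h_dim, h_dim)) for g in "ifou"}
            self.b = {g: np.zeros(h_dim) for g in "ifou"}

        def __call__(self, x, children):
            # children: list of (h_k, c_k) pairs from already composed sub-phrases
            h_sum = sum((h for h, _ in children), np.zeros_like(self.b["i"]))
            i = sigmoid(self.W["i"] @ x + self.U["i"] @ h_sum + self.b["i"])
            o = sigmoid(self.W["o"] @ x + self.U["o"] @ h_sum + self.b["o"])
            u = np.tanh(self.W["u"] @ x + self.U["u"] @ h_sum + self.b["u"])
            c = i * u
            for h_k, c_k in children:            # one forget gate per child
                f_k = sigmoid(self.W["f"] @ x + self.U["f"] @ h_k + self.b["f"])
                c = c + f_k * c_k
            h = o * np.tanh(c)
            return h, c

    # Usage sketch: compose two word states into a short-phrase state.
    # cell = ChildSumTreeLSTMCell(x_dim=50, h_dim=100)
    # h_phrase, c_phrase = cell(x_head, [(h_w1, c_w1), (h_w2, c_w2)])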

Keywords: deep learning, hierarchical tree long short-term memory, sentence representation, sentiment analysis

Procedia PDF Downloads 349
881 Digital Musical Organology: The Audio Games: The Question of “A-Musicological” Interfaces

Authors: Hervé Zénouda

Abstract:

This article seeks to shed light on an emerging creative field: "audio games," at the crossroads between video games and computer music. Many applications that offer entertaining audio-visual experiences with the objective of musical creation are available today for different platforms (game consoles, computers, cell phones). The originality of this field lies in applying the gameplay of video games to music composition. Thus, composing music using interfaces, and cognitive logics, that we qualify as "a-musicological" seems to us particularly interesting from the perspective of digital musical organology. This field raises questions about the representation of sound and musical structures and develops new instrumental gestures and strategies of musical composition. In this article we try to define the characteristics of this field by highlighting some historical milestones (abstract cinema, game theory in music, action and graphic scores) as well as the novelties brought by digital technologies.

Keywords: audio-games, video games, computer generated music, gameplay, interactivity, synesthesia, sound interfaces, relationships image/sound, audiovisual music

Procedia PDF Downloads 113
880 Cooperative Cross Layer Topology for Concurrent Transmission Scheduling Scheme in Broadband Wireless Networks

Authors: Gunasekaran Raja, Ramkumar Jayaraman

Abstract:

In this paper, we consider a CCL-N (Cooperative Cross Layer Network) topology based on a cross-layer (both centralized and distributed) environment to form network communities. Various performance metrics related to IEEE 802.16 networks are discussed in order to design the CCL-N topology. In the CCL-N topology, nodes are classified as master nodes (Master Base Station [MBS]) and serving nodes (Relay Station [RS]), and node communities are organized based on standard networking terminology. Based on the CCL-N topology, simulation analyses for both transparent and non-transparent relays are tabulated and throughput efficiency is calculated. The weighted load balancing problem plays a challenging role in IEEE 802.16 networks. The CoTS (Concurrent Transmission Scheduling) scheme is formulated in terms of three aspects of the transmission mechanism: identical communities, different communities and identical node communities. The CoTS scheme helps in identifying the weighted load balancing problem. The analytical results show that the modularity value is inversely proportional to the error value; the modularity value plays a key role in solving the CoTS problem based on hop count. The transmission mechanism for identical node communities has no impact, since the modularity value is the same for all network groups. This paper therefore discusses the three community aspects, based on the modularity value, that help solve the weighted load balancing and CoTS problems.

Keywords: cross layer network topology, concurrent scheduling, modularity value, network communities and weighted load balancing

Procedia PDF Downloads 266
879 The Impact of Gamification on Self-Assessment for English Language Learners in Saudi Arabia

Authors: Wala A. Bagunaid, Maram Meccawy, Arwa Allinjawi, Zilal Meccawy

Abstract:

Continuous self-assessment becomes crucial in self-paced online learning environments, where students often depend on themselves to assess their progress, an essential requirement for any successful learning process. Today’s education institutions face major problems around student motivation and engagement; thus, personalized e-learning systems aim to help and guide the students. Gamification provides an opportunity to support students in self-assessment and social comparison with other students by attempting to harness the motivational power of games and apply it to the learning environment. Furthermore, Open Social Student Modeling (OSSM), considered one of the latest user modeling technologies, is believed to improve students’ self-assessment and to allow them to compare themselves socially with other students. This research integrates the OSSM approach and gamification concepts in order to provide self-assessment for English language learners at King Abdulaziz University (KAU). This is achieved through an interactive visual representation of their learning progress.

Keywords: e-learning system, gamification, motivation, social comparison, visualization

Procedia PDF Downloads 154
878 FISCEAPP: FIsh Skin Color Evaluation APPlication

Authors: J. Urban, Á. S. Botella, L. E. Robaina, A. Bárta, P. Souček, P. Císař, Š. Papáček, L. M. Domínguez

Abstract:

Skin coloration in fish is of great physiological, behavioral and ecological importance and can be considered an index of animal welfare in aquaculture as well as an important quality factor in retail value. Currently, in order to compare color in animals fed on different diets, biochemical analysis and colorimetry of caught, mildly anesthetized or dead fish are very accurate and meaningful measurements. A noninvasive method using digital images of the fish body was therefore developed as a standalone application. This application deals with the computational burden and memory consumption of large input files, optimizing piecewise processing and analysis against the memory/computation-time ratio. For the comparison of color distributions across experiments and different color spaces (RGB, CIE L*a*b*), a comparable semi-equidistant binning of the multi-channel representation is introduced; it is derived from the known quantization levels and the Freedman-Diaconis rule. Color calibration and the camera responsivity function were a necessary part of the measurement process.
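
As an illustration of the binning rule named above, the following is a minimal sketch, not the FISCEAPP code itself, of computing a Freedman-Diaconis bin count for one color channel and capping it at the sensor's quantization levels; the function name and the 8-bit default are assumptions.

    import numpy as np

    def freedman_diaconis_bins(channel, levels=256):
        """Bin count for one color channel via the Freedman-Diaconis rule.

        channel: 1-D array of pixel values of a single channel (e.g. R, G or B).
        levels:  number of quantization levels of the sensor (8-bit assumed)."""
        x = np.asarray(channel, dtype=float).ravel()
        iqr = np.percentile(x, 75) - np.percentile(x, 25)
        width = 2.0 * iqr * x.size ** (-1.0 / 3.0)        # FD bin width
        if width <= 0:                                     # degenerate channel
            return 1
        n_bins = int(np.ceil((x.max() - x.min()) / width))
        return max(1, min(n_bins, levels))                 # never finer than quantization

    # Usage: hist, edges = np.histogram(channel, bins=freedman_diaconis_bins(channel))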

Keywords: color distribution, fish skin color, piecewise transformation, object to background segmentation

Procedia PDF Downloads 263
877 Distributed Perceptually Important Point Identification for Time Series Data Mining

Authors: Tak-Chung Fu, Ying-Kit Hung, Fu-Lai Chung

Abstract:

In the field of time series data mining, the Perceptually Important Point (PIP) identification process was first introduced in 2001. The process was originally developed for financial time series pattern matching and was later found suitable for time series dimensionality reduction and representation. Its strength lies in preserving the overall shape of the time series by identifying its salient points. With the rise of Big Data, time series data contributes a major proportion, especially data generated by sensors in the Internet of Things (IoT) environment. Given the nature of PIP identification and its successful applications, it is worth further exploring the opportunity to apply PIP to 'Big' time series data. However, the performance of PIP identification is usually considered the limitation when dealing with such data. In this paper, two distributed versions of PIP identification based on the Specialized Binary (SB) Tree are proposed. The proposed approaches remove the bottleneck of running the PIP identification process on a standalone computer, and the distributed versions obtain an improvement in terms of speed.
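
For readers unfamiliar with the sequential process, the following is a minimal single-machine sketch of greedy PIP selection using the vertical-distance criterion; the Specialized Binary Tree indexing and the distributed variants proposed in the paper are not shown, and all names are illustrative.

    import numpy as np

    def identify_pips(y, n_pips):
        """Greedy PIP identification: repeatedly add the point that deviates most
        (vertical distance) from the chord joining its two neighbouring PIPs.
        Assumes len(y) >= 2."""
        n = len(y)
        x = np.arange(n, dtype=float)
        pips = [0, n - 1]                        # first and last points are always PIPs
        while len(pips) < min(n_pips, n):
            ordered = sorted(pips)
            best_idx, best_dist = None, -1.0
            for left, right in zip(ordered[:-1], ordered[1:]):
                slope = (y[right] - y[left]) / (x[right] - x[left])
                for i in range(left + 1, right):
                    d = abs(y[i] - (y[left] + slope * (x[i] - x[left])))
                    if d > best_dist:
                        best_idx, best_dist = i, d
            if best_idx is None:                 # no interior points left
                break
            pips.append(best_idx)
        return sorted(pips)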

Keywords: distributed computing, performance analysis, Perceptually Important Point identification, time series data mining

Procedia PDF Downloads 435
876 Governance Question and the Participatory Policy Making: Making the Process Functional in Nigeria

Authors: Albert T. Akume, P. D. Dahida

Abstract:

This paper examines the effect of successive epochs of government on policy making in Nigeria. The character of governance and public policy making in those epochs was exclusive, non-participatory and self-centric. As a consequence, the interests of the citizenry were neither represented nor protected, and policy did not seek to meet the needs of all groups fairly. The introduction of post-1999 democratic government demands that this hitherto skewed pattern of policy making cease to be a feature of governance; hence the need for citizen participation in the policy making process. The question then is which mode is most appropriate to engender public participation so as to make the policy making process functional. Given the prevailing social, economic and political dilemmas, the use of the direct mode of citizen participation to affect policy outcomes is doubtful, if not unattainable. It is because of this predicament that this paper, using a documentary research design, argues for the indirect mode of citizen participation in the policy making process, so as to affect public policy outcomes appropriately and with less cost, acrimony and delay.

Keywords: governance, public policy, participation, representation, civil society

Procedia PDF Downloads 376
875 Computational Methods in Official Statistics with an Example on Calculating and Predicting Diabetes Mellitus [DM] Prevalence in Different Age Groups within Australia in Future Years, in Light of the Aging Population

Authors: D. Hilton

Abstract:

An analysis of the Australian Diabetes Screening Study estimated undiagnosed diabetes mellitus [DM] prevalence in a high-risk general practice based cohort. DM prevalence varied from 9.4% to 18.1% depending upon the diagnostic criteria utilised, with age being a highly significant risk factor. Utilising the gold standard oral glucose tolerance test, the prevalence of DM was 22-23% in those aged >= 70 years and <15% in those aged 40-59 years. Opportunistic screening in Australian general practice can potentially identify many persons with undiagnosed type 2 DM. An Australian Bureau of Statistics document published three years ago reported the highest rate of DM in men aged 65-74 years [19%], whereas the rate for women was highest in those over 75 years [13%]. If one considers that the Australian Bureau of Statistics report in 2007 found that 13% of the population was over 65 years of age, that this will increase to 23-25% by 2056, and that a further increase to 25-28% is projected by 2101, this information clearly has to be factored into the equation when age-related diabetes prevalence predictions are calculated. This 10-15% proportional increase of elderly persons within the population demographics has dramatic implications for the estimated number of elderly persons with DM in these age groupings. Computational methodology based on the age-related demographic changes reported in these official statistical documents is presented, showing estimates for 2056 and 2101 for different age groups. This has relevance for future diabetes prevalence rates and shows that, along with many countries worldwide, Australia is facing an increasing pandemic. In contrast, Japan is expected to see a decrease in the number of persons with diabetes over the next twenty years.
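
The kind of calculation described, combining age-specific prevalence with a projected age structure, can be sketched as below; all prevalence rates and population shares are illustrative placeholders, not the ABS or screening-study figures.

    # Illustrative sketch only: expected DM case numbers from age-specific
    # prevalence and a projected age structure. All numbers are assumptions.
    prevalence_by_age = {"40-59": 0.10, "60-69": 0.18, "70+": 0.22}

    def expected_cases(age_shares, population):
        """age_shares: fraction of the total population in each age band."""
        return {band: population * share * prevalence_by_age[band]
                for band, share in age_shares.items()}

    shares_2007 = {"40-59": 0.27, "60-69": 0.08, "70+": 0.09}   # hypothetical
    shares_2056 = {"40-59": 0.24, "60-69": 0.10, "70+": 0.17}   # hypothetical

    for year, shares in (("2007", shares_2007), ("2056", shares_2056)):
        cases = expected_cases(shares, population=25_000_000)
        print(year, {k: round(v) for k, v in cases.items()},
              "total:", round(sum(cases.values())))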

Keywords: epidemiological methods, aging, prevalence, diabetes mellitus

Procedia PDF Downloads 375
874 Provenance in Scholarly Publications: Introducing the provCite Ontology

Authors: Maria Joseph Israel, Ahmed Amer

Abstract:

Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., the authors' explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, our proposed provenance scheme, metaScribe, extends standard provenance conceptual models to form the provCite ontology. This provides a conceptual framework which can accurately capture and describe more of the functional and rhetorical properties of a citation than can be achieved with any current models.

Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation

Procedia PDF Downloads 118
873 Dynamic Ad-hoc Topologies for Mobile Robot Navigation Based on Non-Uniform Grid Maps

Authors: Peter Sauer, Thomas Hinze, Petra Hofstedt

Abstract:

Avoiding obstacles in the surrounding environment and navigating to a given target are among the most important tasks for mobile robots, and different data structures are suitable for these tasks. To avoid near obstacles, occupancy grid maps are an ideal representation of the surroundings. For less fine-grained tasks, such as navigating from one room to another in an apartment, pure grid maps are inappropriate: grid maps are very detailed, and calculating paths between rooms based on them would take too long. Instead, graph-based data structures, so-called topologies, turn out to be a proper choice for such tasks. In this paper we present two methods to dynamically create topologies from grid maps. Both methods are based on non-uniform grid maps. The topologies are generated on-the-fly and can easily be modified to represent changes in the environment. This allows a hybrid approach to controlling mobile robots, where, depending on the situation and the current task, either the grid map or the generated topology may be used.
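
As a simplified illustration of deriving a topology from a grid map, the sketch below collapses a uniform occupancy grid into a graph of fully free blocks; the paper's methods work on non-uniform grids, so this is only a stand-in, and all names are assumed.

    import itertools
    import numpy as np

    def grid_to_topology(occupancy, block=4):
        """Collapse a fine occupancy grid (1 = obstacle, 0 = free) into a coarse
        graph: one node per fully free block, edges between 4-adjacent blocks.
        Edges are stored in both directions (undirected graph)."""
        h, w = occupancy.shape
        nodes = set()
        for r, c in itertools.product(range(0, h, block), range(0, w, block)):
            if not occupancy[r:r + block, c:c + block].any():
                nodes.add((r // block, c // block))
        edges = {(a, b) for a in nodes for b in nodes
                 if abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1}
        return nodes, edges

    # Usage: nodes, edges = grid_to_topology(np.zeros((64, 64), dtype=int))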

Keywords: robot navigation, occupancy grids, topological maps, dynamic map creation

Procedia PDF Downloads 563
872 DocPro: A Framework for Processing Semantic and Layout Information in Business Documents

Authors: Ming-Jen Huang, Chun-Fang Huang, Chiching Wei

Abstract:

With recent advances in deep neural networks, we observe new applications of NLP (natural language processing) and CV (computer vision) powered by these networks for processing business documents. However, creating a real-world document processing system requires integrating several NLP and CV tasks, rather than treating them separately. There is a need for a unified approach to processing documents containing textual and graphical elements with rich formats, diverse layout arrangements, and distinct semantics. In this paper, a framework that fulfills this unified approach is presented. The framework includes a representation model definition for holding the information generated by various tasks and specifications defining the coordination between these tasks. The framework is a blueprint for building a system that can process documents with rich formats, styles, and multiple types of elements. The flexible and lightweight design of the framework can help build systems for diverse business scenarios, such as contract monitoring and reviewing.

Keywords: document processing, framework, formal definition, machine learning

Procedia PDF Downloads 219
871 Intelligent Process Data Mining for Monitoring for Fault-Free Operation of Industrial Processes

Authors: Hyun-Woo Cho

Abstract:

Real-time fault monitoring and diagnosis of large-scale production processes is helpful and necessary in order to operate industrial processes safely and efficiently while producing good final product quality. Unusual and abnormal events may have a serious impact on the process, such as malfunctions or breakdowns. This work utilizes process measurement data obtained on-line for the safe and fault-free operation of industrial processes. To this end, the proposed intelligent process data monitoring framework was evaluated on a simulated process. The monitoring scheme extracts the fault pattern in a reduced space for reliable data representation. Moreover, this work compares linear and nonlinear techniques for the monitoring purpose; the nonlinear technique produced more reliable monitoring results and outperformed the linear methods. The adoption of the qualitative monitoring model helps to reduce the sensitivity of the fault pattern to noise.

Keywords: process data, data mining, process operation, real-time monitoring

Procedia PDF Downloads 641
870 Identification of Shocks from Unconventional Monetary Policy Measures

Authors: Margarita Grushanina

Abstract:

After several prominent central banks, including the European Central Bank (ECB), the Federal Reserve System (Fed), the Bank of Japan and the Bank of England, employed unconventional monetary policies in the aftermath of the financial crisis of 2008-2009, the problem of identifying the effects of such policies became of great interest. One of the main difficulties in identifying shocks from unconventional monetary policy measures in structural VAR analysis is that they are often anticipated, which leads to a non-fundamental MA representation of the VAR model. Moreover, unconventional monetary policy actions may indirectly transmit to markets information about the future stance of the interest rate, which raises a question about the plausibility of the assumption of orthogonality between shocks from unconventional and conventional policy measures. This paper offers a method of identification that takes into account the abovementioned issues. The author uses factor-augmented VARs to increase the information set, and identification through heteroskedasticity of the error terms and rank restrictions on the errors' second-moment matrix to deal with cross-correlation of the structural shocks.

Keywords: factor-augmented VARs, identification through heteroskedasticity, monetary policy, structural VARs

Procedia PDF Downloads 348
869 Mechanistic Modelling to De-risk Process Scale-up

Authors: Edwin Cartledge, Jack Clark, Mazaher Molaei-Chalchooghi

Abstract:

The mixing in the crystallization step of active pharmaceutical ingredient manufacture was studied via advanced modeling tools to enable a successful scale-up. A virtual representation of the vessel was created, and computational fluid dynamics was used to simulate multiphase flow and, thus, the mixing environment within this vessel. The study identified a significant dead zone in the vessel underneath the impeller and found that increasing the impeller speed and power did not improve the mixing. A series of sensitivity analyses found that, to improve mixing, the vessel had to be redesigned, and that optimal mixing could be obtained by adding two extra cylindrical baffles. The same two baffles from the simulated environment were then constructed and added to the process vessel. By identifying these potential issues before starting manufacture and modifying the vessel to ensure good mixing, this study mitigated a failed crystallization and potential batch disposal, which could have resulted in a significant loss of high-value material.

Keywords: active pharmaceutical ingredient, baffles, computational fluid dynamics, mixing, modelling

Procedia PDF Downloads 98
868 Quasi–Periodicity of Tonic Intervals in Octave and Innovation of Themes in Music Compositions

Authors: R. C. Tyagi

Abstract:

Quasi-periodicity of frequency intervals observed in the Shruti-based Absolute Scale of Music has been used to graphically identify the anchor notes 'Vadi' and 'Samvadi', which are nodal points for expansion, elaboration and iteration of the emotional theme represented by the characteristic tonic arrangement in Raga compositions. This analysis leads to defining the tonic parameters in the octave, including the key-note frequency, the tonic intervals' anchor notes, and the onset and range of quasi-periodicities as exponents of 2. Such uniformity of representation of characteristic data would facilitate computational analysis and synthesis of music compositions and also help develop noise suppression techniques. Criteria for the tuning of strings for compatibility with the placement of frets on finger boards are discussed. Natural rhythmic cycles in music compositions are analytically shown to lie between 3 and 126 beats.

Keywords: absolute scale, anchor notes, computational analysis, frets, innovation, noise suppression, Quasi-periodicity, rhythmic cycle, tonic interval, Shruti

Procedia PDF Downloads 305
867 Left Ventricular Adaptations of Elite Volleyball Players Based on the Playing Position

Authors: Shihab Aldin Al Riyami, Khosrow Ebrahim, Sajad Ahmadizad

Abstract:

Hemodynamic changes and ventricular loading during exercise lead to left ventricular (LV) hypertrophy. In athletes, volume load induces enlargement of the LV internal diameter and a proportional increase of wall thickness, while pressure load induces thickening of the ventricular wall. These adaptations are not identical in all athletes and are related to the type of sport. Volleyball players have different types of activity and roles based on their playing position; therefore, their physiological adaptations and requirements differ. The aim of the current study was to investigate LV adaptations in elite volleyball players based on their playing position. Sixty male elite volleyball players (age 30.55±3.64 years) from Brazil, Serbia, Poland, Iran, Colombia, Cameroon, Japan, Egypt, Qatar, and Tunisia were investigated, covering all five volleyball playing positions. All participants had at least 3 years of experience of participation at a professional level and in international tournaments. LV characteristics were evaluated and measured using echocardiography. Statistical analyses revealed significant differences (P<0.05) among the five groups of players for LV internal dimension (LVID), posterior wall thickness (PWT), and interventricular septum (IVS). Post-hoc analysis showed that opposite players had significantly higher values of LVID, PWT, and IVS when compared with the other players, including outside hitters, middle blockers, setters, and liberos (p<0.05). Additionally, in libero players, PWT was significantly lower when compared with the other players (p<0.05). Based on these findings, it is concluded that LV adaptations in volleyball players are related to their playing position and that opposite players show the greatest LV adaptations compared to the other positions.

Keywords: athletes, cardiac adaptations, echocardiography, heart, sport

Procedia PDF Downloads 275
866 Comparative Study of Different Enhancement Techniques for Computed Tomography Images

Authors: C. G. Jinimole, A. Harsha

Abstract:

One of the key problems in the analysis of Computed Tomography (CT) images is their poor contrast. Image enhancement can be used to improve the visual clarity and quality of the images or to provide a better transformation representation for further processing. Contrast enhancement is one of the accepted methods of image enhancement in various applications in the medical field and helps to visualize and extract details of brain infarctions, tumors, and cancers from the CT image. This paper presents a comparative study of five contrast enhancement techniques suitable for CT images: Power Law Transformation, Logarithmic Transformation, Histogram Equalization, Contrast Stretching, and Laplacian Transformation. All these techniques are compared with each other to find out which enhancement provides the best contrast for CT images. For the comparison, the parameters Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE) are used. The Logarithmic Transformation provided the clearest and best-quality image of all the techniques studied and achieved the highest PSNR value. The comparison concludes with the better approach for future research, especially for mapping abnormalities in CT images resulting from brain injuries.
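
A minimal sketch of one of the five techniques (the logarithmic transformation), together with the MSE and PSNR figures of merit used for the comparison, is given below; function names and the 8-bit intensity range are assumptions.

    import numpy as np

    def log_transform(img, max_val=255.0):
        """Logarithmic contrast enhancement s = c * log(1 + r) for an 8-bit image."""
        c = max_val / np.log(1.0 + max_val)
        return c * np.log1p(img.astype(float))

    def mse(reference, test):
        return float(np.mean((reference.astype(float) - test.astype(float)) ** 2))

    def psnr(reference, test, max_val=255.0):
        m = mse(reference, test)
        return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)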

Keywords: computed tomography, enhancement techniques, increasing contrast, PSNR and MSE

Procedia PDF Downloads 315
865 Assessing and Identifying Factors Affecting Customers Satisfaction of Commercial Bank of Ethiopia: The Case of West Shoa Zone (Bako, Gedo, Ambo, Ginchi and Holeta), Ethiopia

Authors: Habte Tadesse Likassa, Bacha Edosa

Abstract:

Customer satisfaction is essential for the existence of banks and for being more productive and successful in any organization and business area. The main goal of the study is to assess and identify factors that influence customer satisfaction in the West Shoa Zone branches of the Commercial Bank of Ethiopia (Holeta, Ginchi, Ambo, Gedo and Bako). A stratified random sampling procedure was used, and 520 customers were drawn from the target population by simple random sampling (lottery method). The sample size for each bank branch was allocated using the probability-proportional-to-size technique. Both descriptive and inferential statistical methods were used. A binary logistic regression model was fitted to assess the significance of factors affecting customer satisfaction, and the SPSS statistical package was used for data analysis. The results reveal that the overall level of customer satisfaction in the study area is low (38.85%) compared with those who were not satisfied (61.15%). The results also showed that almost all factors included in the study were significantly associated with customer satisfaction. Based on a comparison of branches using odds ratios, customers of the Ambo and Bako branches were less satisfied than customers of the Holeta branch, while customers in Ginchi and Gedo were more satisfied than those in Holeta. Since the level of customer satisfaction in the study area was low, it is recommended that the concerned bodies work more cooperatively to maximize the satisfaction of their customers.
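
The study's model was fitted in SPSS; as a rough illustration of the same kind of binary logistic regression and of reading branch effects as odds ratios, here is a sketch using statsmodels on synthetic data, with all column names and values invented for the example.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 520                                            # sample size reported in the study
    df = pd.DataFrame({
        "branch": rng.choice(["Holeta", "Ginchi", "Ambo", "Gedo", "Bako"], n),
        "waiting_time": rng.normal(15, 5, n),          # minutes, illustrative
        "satisfied": rng.integers(0, 2, n),            # 1 = satisfied, 0 = not satisfied
    })

    X = pd.get_dummies(df[["branch", "waiting_time"]], drop_first=True).astype(float)
    X = sm.add_constant(X)
    y = df["satisfied"].astype(float)

    fit = sm.Logit(y, X).fit(disp=False)
    print(fit.summary())
    print("odds ratios:\n", np.exp(fit.params))        # branch effects vs. reference branch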

Keywords: customers, satisfaction, binary logistic, complain handling process, waiting time

Procedia PDF Downloads 466
864 Cost-Benefit Analysis for the Optimization of Noise Abatement Treatments at the Workplace

Authors: Paolo Lenzuni

Abstract:

Cost-effectiveness of noise abatement treatments at the workplace has not yet received adequate consideration. Furthermore, most of the published work is focused on productivity, despite the poor correlation of this quantity with noise levels. There is currently no tool to estimate the social benefit associated with a specific noise abatement treatment, and no comparison among different options is accordingly possible. In this paper, we present an algorithm which has been developed to predict the cost-effectiveness of any planned noise control treatment in a workplace. This algorithm is based on the estimates of hearing threshold shifts included in ISO 1999 and on the compensations that workers are entitled to once their work-related hearing impairments have been certified. The benefits of a noise abatement treatment are estimated by means of the lower compensation costs paid to impaired workers. Although such benefits have no real meaning in strictly monetary terms, they allow a reliable comparison between different treatments, since actual social costs can be assumed to be proportional to compensation costs. The existing European legislation on occupational exposure to noise mandates that the noise exposure level be reduced below the upper action limit (85 dBA). There is accordingly little or no motivation for employers to sustain the extra costs required to lower the noise exposure below the lower action limit (80 dBA). In order to make this goal more appealing to employers, the algorithm proposed in this work also includes an ad-hoc element that promotes actions which bring the noise exposure down below 80 dBA. The algorithm has a twofold potential: 1) it can be used as a quality index to promote cost-effective practices; 2) it can be added to the existing criteria used by workers' compensation authorities to evaluate the cost-effectiveness of technical actions, and support dedicated employers.

Keywords: cost-effectiveness, noise, occupational exposure, treatment

Procedia PDF Downloads 324
863 Transfer Learning for Protein Structure Classification at Low Resolution

Authors: Alexander Hudson, Shaogang Gong

Abstract:

Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3A) resolution, using a deep convolutional neural network trained on high-resolution (≤3A) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high resolution, low-resolution and NMR-determined structures inhabit a common feature space, and thus provide a theoretical foundation for boosting with single-image super-resolution.
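
As an illustration of the 2D matrix input representation mentioned above, the sketch below builds a pairwise C-alpha distance map from atomic coordinates; it shows only the input encoding, not the authors' network or transfer-learning protocol, and the function name is an assumption.

    import numpy as np

    def distance_map(ca_coords):
        """Pairwise C-alpha distance matrix, the 2D 'image' a CNN can classify.

        ca_coords: (N, 3) array of C-alpha coordinates for one chain."""
        diff = ca_coords[:, None, :] - ca_coords[None, :, :]
        return np.sqrt((diff ** 2).sum(axis=-1))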

Keywords: transfer learning, protein distance maps, protein structure classification, neural networks

Procedia PDF Downloads 138
862 Clarifier Dialogue Interface to Resolve Linguistic Ambiguities in an E-Learning Environment

Authors: Dalila Souilem, Salma Boumiza, Abdelkarim Abdelkader

Abstract:

The Clarifier Dialogue Interface (CDI) is part of an online teaching system based on human-machine communication in a learning situation. This interface is used during the learning activity, specifically in the evaluation step, to clarify ambiguities in the learner's response. The CDI can generate patterns allowing access to an information system, using the selectors associated with lexical units. To instantiate these patterns, the user request (especially the learner's response) must be analyzed and interpreted to deduce the canonical form, the semantic form and the subject of the sentence. For the efficiency of this interface at the interpretation level, a set of substitution operators is applied in order to extend the possibilities of manipulating natural language. A second approach presented in this paper focuses on object languages with new prospects, such as combining natural language with techniques for handling information systems in the area of online education. All operators, the CDI and the other interfaces associated with the domain expertise and teaching strategies are then unified using the FRAME representation form.

Keywords: dialogue, e-learning, FRAME, information system, natural language

Procedia PDF Downloads 381
861 Improvement of the 3D Finite Element Analysis of High Voltage Power Transformer Defects in Time Domain

Authors: M. Rashid Hussain, Shady S. Refaat

Abstract:

The high voltage power transformer is the most essential part of electrical power utilities. Reliability of the transformers is of utmost concern, and any failure of a transformer can lead to catastrophic losses for the electric power utility. The causes of transformer failure include insulation failure by partial discharge, core and tank failure, cooling unit failure, current transformer failure, etc. For the study of power transformer defects, finite element analysis (FEA) can provide valuable information on the severity of defects. FEA provides a more accurate representation of complex geometries because it considers thermal, electrical, and environmental influences on the insulation models, yielding the basic characteristics of the insulation system during normal and partial discharge conditions. The purpose of this paper is the time-domain analysis of a 3D defect model of a high voltage power transformer using FEA, in order to study the electric field distribution at different points on the defects.

Keywords: power transformer, finite element analysis, dielectric response, partial discharge, insulation

Procedia PDF Downloads 158
860 Crystallinity, Antimicrobial Activity and Dyeing Properties of Chitosan-G-Poly(N-Acryloyl Morpholine) Copolymer

Authors: Fakhreia A. Al Sagheer, Enas I. Ibrahim, Khaled D. Khalil

Abstract:

N-Acryloyl morpholine (NAM) was grafted onto chitosan under homogeneous conditions with 1% acetic acid as the solvent, and potassium persulfate and sodium sulfite as the redox initiator. The effects of various reaction parameters, such as time, temperature, and monomer and initiator concentrations, on the percentage of grafting (G%) and the grafting efficiency (E%) were determined. The graft copolymer showed remarkably improved crystallinity compared to unmodified chitosan, based on the FESEM, XRD, and DSC results. Chitosan-g-poly(N-acryloyl morpholine) (Cs-PNAM), the copolymer obtained by this procedure, was characterized by FTIR, FESEM, TGA, and XRD analysis. As expected, the evaluation of antibacterial and antifungal activities shows that the grafted chitosan copolymers exhibit stronger inhibitory effects against both types of microbes than does chitosan. Moreover, the size of the inhibition zone created by the graft copolymer was observed to be proportional to its G%, corresponding to its morpholine content. The graft copolymer showed marked growth inhibition of the candidiasis-causing species C. albicans and C. kefyr; we conclude that it may be highly effective in the prevention and treatment of candidiasis. In addition, the extent and pH dependence of the uptake of different types of dyes (acidic: EBT and MV; basic: MB) by grafted chitosan in pH 6.5 aqueous solutions was determined. The results show that the grafted copolymer exhibited a greater affinity for the acid dyes than for the basic ones, especially at relatively low temperature. Thus, the modified chitosan can be used in wastewater treatment as an efficient, economical absorbent, especially for anionic dyes in industrial processing effluents.

Keywords: chitosan, N-Acryloyl morpholine, homogeneous grafting, antimicrobial activity, dye uptake

Procedia PDF Downloads 372
859 Mechanical Properties and Microstructural Analyses of Epoxy Resins Reinforced with Satin Tissue

Authors: Băilă Diana Irinel, Păcurar Răzvan, Păcurar Ancuța

Abstract:

Although the volume of fibre reinforced polymer composites (FRPs) used in aircraft applications is a relatively small percentage of total use, these materials often find their most sophisticated applications in this industry. In aerospace, the performance criteria placed upon materials can be far greater than in other areas: key aspects are light weight, high strength, high stiffness, and good fatigue resistance. Composites were first used by the military before the technology was applied to commercial planes. Nowadays, composites are widely used, and this has been the result of a gradual direct substitution of metal components followed by the development of integrated composite designs as confidence in FRPs has increased. Aircraft use a range of components made from composites, including the fin and tailplane. In recent years, composite materials have been increasingly used in automotive applications due to improved material properties. In the aerospace and automotive sectors, fuel consumption is proportional to the weight of the vehicle body. A minimum of 20% of the cost can be saved if polymer composites are used in place of metal structures, and the operating and maintenance costs are also very low. Glass fiber-epoxy composites are widely used in making aircraft and automobile body parts and are not limited to these fields; they are also used in shipbuilding, structural applications in civil engineering, pipes for the transport of liquids, and electrical insulators in reactors. This article establishes the high performance of a glass-epoxy composite material used in the automotive and aeronautic domains by means of tensile and flexural tests and SEM analyses.

Keywords: glass-epoxy composite, traction and flexion tests, SEM analysis, acoustic emission (AE) signals

Procedia PDF Downloads 103
858 Comparative Study of Calcium Content on in vitro Biological and Antibacterial Properties of Silicon-Based Bioglass

Authors: Morteza Elsa, Amirhossein Moghanian

Abstract:

The major aim of this study was to evaluate the effect of CaO content on in vitro hydroxyapatite formation, MC3T3 cell cytotoxicity and proliferation, as well as the antibacterial efficiency, of a sol-gel derived SiO2–CaO–P2O5 ternary system. For this purpose, two grades of bioactive glass (BG), BG-58s (mol%: 60%SiO2–36%CaO–4%P2O5) and BG-68s (mol%: 70%SiO2–26%CaO–4%P2O5), were first synthesized by the sol-gel method. Second, the effect of the CaO content in their composition on in vitro bioactivity was investigated by soaking the BG-58s and BG-68s powders in simulated body fluid (SBF) for periods of up to 14 days, followed by characterization using inductively coupled plasma atomic emission spectrometry (ICP-AES), Fourier transform infrared spectroscopy (FTIR), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Additionally, live/dead staining, 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT), and alkaline phosphatase (ALP) activity assays were conducted as qualitative and quantitative assessments of the viability, proliferation and differentiation of MC3T3 cells in the presence of the 58s and 68s BGs. Results showed that BG-58s, with the higher CaO content, showed higher in vitro bioactivity than BG-68s. Moreover, the dissolution rate was inversely proportional to the oxygen density of the BG. The live/dead assay revealed that both 58s and 68s increased the mean number of live cells, in good accordance with the MTT assay. Furthermore, BG-58s showed more potent antibacterial activity against methicillin-resistant Staphylococcus aureus (MRSA) bacteria. Taken together, BG-58s, with enhanced MC3T3 cell proliferation and ALP activity, acceptable bioactivity and a significantly high antibacterial effect against MRSA, is suggested as a suitable candidate for further functionalization for the delivery of therapeutic ions and growth factors in bone tissue engineering.

Keywords: antibacterial, bioactive glass, hydroxyapatite, proliferation, sol-gel processes

Procedia PDF Downloads 148
857 Medical Knowledge Management since the Integration of Heterogeneous Data until the Knowledge Exploitation in a Decision-Making System

Authors: Nadjat Zerf Boudjettou, Fahima Nader, Rachid Chalal

Abstract:

Knowledge management means acquiring and representing knowledge relevant to a domain, a task or a specific organization in order to facilitate access, reuse and evolution. This usually means building, maintaining and evolving an explicit representation of knowledge. The next step is to provide access to that knowledge, that is to say, to disseminate it in order to enable effective use. Knowledge management in the medical field aims to improve the performance of the medical organization by allowing individuals in the care facility (doctors, nurses, paramedics, etc.) to capture, share and apply collective knowledge in order to make optimal decisions in real time. In this paper, we propose a knowledge management approach based on the integration of heterogeneous data in the medical field by creating a data warehouse, on the extraction of knowledge from medical data by choosing a data mining technique, and finally on the exploitation of that knowledge in a case-based reasoning system.

Keywords: data warehouse, data mining, knowledge discovery in database, KDD, medical knowledge management, Bayesian networks

Procedia PDF Downloads 396
856 Simulation of 1D Dielectric Barrier Discharge in Argon Mixtures

Authors: Lucas Wilman Crispim, Patrícia Hallack, Maikel Ballester

Abstract:

This work aims at modeling electric discharges in gas mixtures. The mathematical model mimics the ignition process in a commercial spark plug when a high voltage is applied to the plug terminals. A longitudinal one-dimensional Cartesian domain is chosen for the simulation region. Energy and mass transfer are considered in a macroscopic fluid representation, while energy transfer in molecular collisions and chemical reactions are treated at the microscopic level. The macroscopic model is represented by a set of uncoupled partial differential equations. Microscopic effects are studied within a discrete model of electronic and molecular collisions in the framework of ZDPlasKin, a plasma modeling numerical tool. The BOLSIG+ solver is employed to solve the electronic Boltzmann equation. An operator splitting technique is used to separate the microscopic and macroscopic models. The simulated gas is a mixture of neutral, excited and ionized atomic argon. The spatial and temporal evolution of these species and of the temperature are presented and discussed.
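
A minimal sketch of the operator-splitting idea (Lie splitting) is shown below: within each time step a local chemistry update and a one-dimensional transport update are applied one after the other. The two step functions are toy placeholders, not the actual ZDPlasKin/BOLSIG+ coupling used in this work.

    import numpy as np

    def chemistry_step(ne, Tg, dt, k_ion=1e3, k_rec=1e2):
        """Toy 0-D kinetics: net electron production (ionization minus recombination).
        In the paper this role is played by the ZDPlasKin/BOLSIG+ microscopic model."""
        return ne + (k_ion - k_rec * ne) * ne * dt, Tg

    def transport_step(ne, Tg, dt, dx, D=1e-3):
        """Toy 1-D diffusion of electron density (explicit finite differences,
        periodic boundaries via np.roll)."""
        lap = (np.roll(ne, 1) - 2.0 * ne + np.roll(ne, -1)) / dx ** 2
        return ne + D * dt * lap, Tg

    def integrate(ne, Tg, dt, dx, n_steps):
        # Lie splitting: microscopic and macroscopic operators applied in sequence.
        for _ in range(n_steps):
            ne, Tg = chemistry_step(ne, Tg, dt)
            ne, Tg = transport_step(ne, Tg, dt, dx)
        return ne, Tg

    # Usage: ne, Tg = integrate(np.full(200, 1.0), 300.0, dt=1e-6, dx=1e-4, n_steps=100)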

Keywords: CFD, electronic discharge, ignition, spark plug

Procedia PDF Downloads 162