Search results for: Empirical techniques.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3179

179 Novel Adaptive Channel Equalization Algorithms by Statistical Sampling

Authors: János Levendovszky, András Oláh

Abstract:

In this paper, novel statistical sampling based equalization techniques and CNN based detection are proposed to increase the spectral efficiency of multiuser communication systems over fading channels. Multiuser communication combined with selective fading can result in interference which severely deteriorates the quality of service in wireless data transmission (e.g. CDMA in mobile communication). The paper introduces new equalization methods to combat interference by minimizing the Bit Error Rate (BER) as a function of the equalizer coefficients. This provides higher performance than traditional Minimum Mean Square Error equalization. Since the calculation of the BER as a function of the equalizer coefficients is of exponential complexity, statistical sampling methods are proposed to approximate the gradient, which yields fast equalization and performance superior to the traditional algorithms. Efficient estimation of the gradient is achieved by using stratified sampling and the Li-Silvester bounds. A simple mechanism is derived to identify the dominant samples in real time, for the sake of efficient estimation. The equalizer weights are adapted recursively by minimizing the estimated BER. The near-optimal performance of the new algorithms is also demonstrated by extensive simulations. The paper also develops a Cellular Neural Network (CNN) based approach to detection. In this case, fast quadratic optimization is carried out by the CNN, whereas the task of the equalizer is to ensure the required template structure (sparseness) for the CNN. The performance of the method has also been analyzed by simulations.
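
To make the gradient-sampling idea concrete, the following minimal sketch adapts the taps of a linear equalizer by descending a Monte Carlo estimate of the BER. The toy FIR channel, noise level, SPSA-style perturbation scheme and all constants are illustrative assumptions; the paper's own estimator uses stratified sampling with the Li-Silvester bounds rather than the plain sampling shown here.

```python
import numpy as np

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, 0.2])        # hypothetical FIR channel with ISI
sigma = 0.3                          # channel noise standard deviation
n_taps = 9
delay = n_taps // 2 + int(np.argmax(np.abs(h)))   # crude overall decision delay

def ber(w, n=20000):
    """Monte Carlo estimate of the BER of a linear equalizer w (BPSK symbols)."""
    s = rng.choice([-1.0, 1.0], size=n)
    r = np.convolve(s, h) + sigma * rng.standard_normal(n + len(h) - 1)
    y = np.convolve(r, w)[delay:delay + n]
    return np.mean(np.sign(y) != s)

# SPSA-style update: the BER gradient is approximated from two sampled BER
# evaluations along a random perturbation direction (plain Monte Carlo here,
# standing in for the paper's stratified-sampling estimator)
w = np.zeros(n_taps); w[n_taps // 2] = 1.0        # centre-spike initialisation
c, lr = 0.05, 0.5
for _ in range(200):
    d = rng.choice([-1.0, 1.0], size=n_taps)
    g = (ber(w + c * d) - ber(w - c * d)) / (2 * c) * d
    w -= lr * g

print("final Monte Carlo BER estimate:", ber(w, n=100000))
```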

Keywords: Cellular Neural Network, channel equalization, communication over fading channels, multiuser communication, spectral efficiency, statistical sampling.

178 Security Analysis of Password Hardened Multimodal Biometric Fuzzy Vault

Authors: V. S. Meenakshi, G. Padmavathi

Abstract:

Biometric techniques are gaining importance for personal authentication and identification as compared to the traditional authentication methods. Biometric templates are vulnerable to a variety of attacks due to their inherent nature. When a person's biometric is compromised, his identity is lost. In contrast to a password, a biometric is not revocable. Therefore, providing security to the stored biometric template is crucial. Crypto biometric systems are authentication systems which blend the ideas of cryptography and biometrics. The fuzzy vault is a proven crypto biometric construct which is used to secure biometric templates. However, the fuzzy vault suffers from certain limitations such as non-revocability and cross matching. The security of the fuzzy vault is also affected by the non-uniform nature of the biometric data. A fuzzy vault hardened with a password overcomes these limitations. The password provides an additional layer of security and enhances user privacy. The retina has certain advantages over other biometric traits. Retinal scans are used in high-end security applications such as access control to areas or rooms in military installations, power plants, and other high-risk security areas. This work applies the idea of the fuzzy vault to the retinal biometric template. Multimodal biometric systems perform well compared to single modal biometric systems. The proposed multimodal biometric fuzzy vault combines feature points from the retina and the fingerprint. The combined vault is hardened with a user password to achieve a high level of security. The security of the combined vault is measured using min-entropy. The proposed password hardened multibiometric fuzzy vault is robust towards stored biometric template attacks.
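
As a pointer to how such a security figure is computed, the sketch below evaluates the min-entropy of a fuzzy vault against brute-force polynomial decoding. The point counts and polynomial degree are hypothetical, and the closed-form expression is the commonly used combinatorial estimate, not necessarily the exact measure applied in this work.

```python
from math import comb, log2

def vault_min_entropy(genuine: int, total: int, degree: int) -> float:
    """Min-entropy (bits) of a fuzzy vault against brute-force decoding.

    An attacker must pick degree+1 points that all lie on the secret
    polynomial; only subsets drawn from the genuine points succeed, so
    H_min = -log2( C(genuine, degree+1) / C(total, degree+1) ).
    """
    k = degree + 1
    return -log2(comb(genuine, k) / comb(total, k))

# hypothetical numbers: 24 genuine minutiae + 200 chaff points, degree-8 secret
print(f"{vault_min_entropy(genuine=24, total=224, degree=8):.1f} bits")
```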

Keywords: Biometric Template Security, Crypto Biometric Systems, Hardening Fuzzy Vault, Min-Entropy.

177 The Shifting Urban Role of Buildings’ Facades: A Diachronic Analysis of El Korba

Authors: Virginia Bassily, Sherif Goubran

Abstract:

In heritage conservation and revival, much of the focus is placed on the techniques and methods to preserve, restore, and revive heritage structures and locations. However, more attention needs to be drawn to how deterioration happens and to its effect on an area's character and socio-economic status. To this end, this research examines the decline and its effects in the El Korba area in Heliopolis, Cairo, Egypt. El Korba was designed with a unique architectural character to stimulate social and economic life. However, the area has been on a path of physical deterioration that is corroding the social life on its streets. This research uses a diachronic analysis of Ibrahim El-Lakkani Boulevard in El Korba, based on a previously developed framework that connects buildings' architectural features to the degree of social interaction in the street, to document the changes that building deterioration could have caused. Architectural features of the street level in both the original state (1906) and the current state (2021) are broken down and categorized under the framework's six parameters to understand their decline or improvement over time. We find that the parameters that have decreased over the years and caused the deterioration are complexity and architectural character, permeability, territoriality and personalization, and physical comfort. Based on these findings, revival projects can focus on physical parameters that create synergistic benefits by preserving and renewing heritage locations and revitalizing their socio-economic potential.

Keywords: Architectural character, heritage building conservation, enclosure, ground-floor use, El Korba, visual and physical permeability, personalization, physical comfort, social life, territoriality.

176 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, or acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object or target induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as predicted by the Central Limit Theorem.
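
The limiting behaviour noted at the end of the abstract can be reproduced with a few lines of simulation: a resolution cell is modelled as a coherent sum of a Poisson number of unit scatterers with random phases, and the intensity contrast (std/mean) approaches 1, the value for an exponential distribution, as the mean scatterer count grows. The random-phasor model below is a generic sketch, not the paper's Gram-Charlier construction.

```python
import numpy as np

rng = np.random.default_rng(1)

def speckle_intensity(mean_scatterers, n_cells=20000):
    """Intensity samples of a resolution cell: squared magnitude of a coherent
    sum of a Poisson number of unit-amplitude scatterers with random phases."""
    counts = rng.poisson(mean_scatterers, size=n_cells)
    out = np.empty(n_cells)
    for i, n in enumerate(counts):
        phases = rng.uniform(0.0, 2.0 * np.pi, size=n)
        out[i] = np.abs(np.exp(1j * phases).sum()) ** 2
    return out

# an exponential intensity pdf (fully developed speckle) has std/mean == 1;
# watch the contrast ratio tend to 1 as the mean scatterer count per cell grows
for lam in (1, 3, 10, 50):
    I = speckle_intensity(lam)
    print(f"mean scatterers {lam:3d}: std/mean = {I.std() / I.mean():.3f}")
```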

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound

175 Effect of Laser Power and Powder Flow Rate on Properties of Laser Metal Deposited Ti6Al4V

Authors: Mukul Shukla, Rasheedat M. Mahamood, Esther T. Akinlabi, Sisa Pityana

Abstract:

Laser Metal Deposition (LMD) is an additive manufacturing process whose capabilities include producing a new part directly from a three-dimensional Computer Aided Design (3D CAD) model, building a new part on an existing old component, and repairing existing high-value components that would have been discarded in the past. Despite all these capabilities and its advantages over other additive manufacturing techniques, the underlying physics of the LMD process is yet to be fully understood, probably because of the high interaction between the processing parameters; studying many parameters at the same time makes the process even more complex to understand. In this study, the effects of laser power and powder flow rate on the physical properties (deposition height and deposition width), metallurgical property (microstructure) and mechanical property (microhardness) of laser-deposited Ti6Al4V, the most widely used aerospace alloy, are studied. Also, because Ti6Al4V is very expensive and LMD is capable of reducing the buy-to-fly ratio of aerospace parts, the material utilization efficiency is also studied. Four sets of experiments were performed, and repeated to establish repeatability, using laser powers of 1.8 kW and 3.0 kW and powder flow rates of 2.88 g/min and 5.67 g/min, keeping the gas flow rate and scanning speed constant at 2 l/min and 0.005 m/s, respectively. The deposition height and width are found to increase with increasing laser power and increasing powder flow rate. Material utilization is favoured by higher laser power, while a higher powder flow rate reduces material utilization. The results are presented and fully discussed.

Keywords: Laser Metal Deposition, Material Efficiency, Microstructure, Ti6Al4V.

174 Information Retrieval: Improving Question Answering Systems by Query Reformulation and Answer Validation

Authors: Mohammad Reza Kangavari, Samira Ghandchi, Manak Golpour

Abstract:

Question answering (QA) aims at retrieving precise information from a large collection of documents. Most question answering systems are composed of three main modules: question processing, document processing and answer processing. The question processing module plays an important role in QA systems in reformulating questions. Answer processing, moreover, is an emerging topic in QA systems, where systems are often required to rank and validate candidate answers. These techniques, aiming at finding short and precise answers, are often based on semantic relations and co-occurring keywords. This paper discusses a new model for question answering which improves the two main modules, question processing and answer processing, both of which affect the evaluation of the system's operation. Two components form the basis of question processing. The first is question classification, which specifies the types of the question and the answer. The second is reformulation, which converts the user's question into a question understandable by the QA system in a specific domain. The objective of an answer validation task is to judge the correctness of an answer returned by a QA system according to the text snippet given to support it. For validating answers, we apply candidate answer filtering and candidate answer ranking, followed by a final validation step based on user voting. The paper also describes the new architecture of the question and answer processing modules, together with the modeling, implementation and evaluation of the system. The system differs from most question answering systems in its answer validation model, which makes it more suitable for finding exact answers. Evaluation of the model over a total of 50 asked questions shows a 92% improvement in the decisions of the system.
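
The validation stage described here (filter, rank, then user voting) can be outlined as a tiny pipeline. The scoring dictionary and vote data below are hypothetical placeholders for the snippet-support model and the user-voting records, which the abstract does not specify.

```python
from collections import Counter

def validate(candidates, snippet_scores, user_votes, min_score=0.5):
    """Toy answer-validation pipeline: filter -> rank -> user-vote tie-break.

    candidates     : answer strings returned by the QA engine
    snippet_scores : dict answer -> support score from its text snippet
                     (e.g. keyword co-occurrence; the scoring model is assumed)
    user_votes     : iterable of answers voted for by users
    """
    # 1. candidate answer filtering: drop weakly supported answers
    kept = [a for a in candidates if snippet_scores.get(a, 0.0) >= min_score]
    # 2. candidate answer ranking by snippet support
    kept.sort(key=lambda a: snippet_scores[a], reverse=True)
    # 3. final validation by user voting among the top-ranked answers
    votes = Counter(v for v in user_votes if v in kept[:3])
    return votes.most_common(1)[0][0] if votes else (kept[0] if kept else None)

print(validate(["1969", "1972", "Apollo"],
               {"1969": 0.9, "1972": 0.6, "Apollo": 0.2},
               ["1969", "1969", "1972"]))
```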

Keywords: Answer processing, answer validation, classification, question answering, query reformulation.

173 Screen of MicroRNA Targets in Zebrafish Using Heterogeneous Data Sources: A Case Study for Dre-miR-10 and Dre-miR-196

Authors: Yanju Zhang, Joost M. Woltering, Fons J. Verbeek

Abstract:

It has been established that microRNAs (miRNAs) play an important role in gene expression by post-transcriptional regulation of messenger RNAs (mRNAs). However, the precise relationships between microRNAs and their target genes, in terms of numbers, types and biological relevance, remain largely unclear. Dissecting the miRNA-target relationships will yield more insight into miRNA target identification and validation and therefore promote the understanding of miRNA function. In miRBase, miRanda is the key algorithm used for target prediction in Zebrafish. This algorithm is high-throughput but produces many false positives (noise). Since validating a large set of targets through laboratory experiments is very time-consuming, computational methods for miRNA target validation should be developed. In this paper, we present an integrative method to investigate several aspects of the relationships between miRNAs and their targets, with the final purpose of extracting high-confidence targets from the pool of miRanda-predicted targets. This is achieved by using techniques ranging from statistical tests to clustering and association rules. Our research focuses on Zebrafish. It was found that validated targets do not necessarily associate with the highest sequence matching. Besides, for some miRNA families, the frequency of their predicted targets is significantly higher in the genomic region near their own physical location. Finally, in a case study of dre-miR-10 and dre-miR-196, it was found that the predicted target genes hoxd13a, hoxd11a, hoxd10a and hoxc4a of dre-miR-10, and hoxa9a, hoxc8a and hoxa13a of dre-miR-196, have similar characteristics to validated target genes and therefore represent high-confidence target candidates.

Keywords: MicroRNA targets validation, microRNA-target relationships, dre-miR-10, dre-miR-196.

172 Laplace Transformation on Ordered Linear Space of Generalized Functions

Authors: K. V. Geetha, N. R. Mangalambal

Abstract:

Aim. We have introduced the notion of order to multinormed spaces and countable union spaces and their duals. The topology of bounded convergence is assigned to the dual spaces. The aim of this paper is to develop the theory of the ordered topological linear spaces L'a,b and L'(w, z), the dual spaces of the ordered multinormed spaces La,b and the ordered countable union spaces L(w, z), with the topology of bounded convergence assigned to the dual spaces. We apply the Laplace transformation to the ordered linear space of Laplace transformable generalized functions. We ultimately aim at finding solutions to nonhomogeneous nth order linear differential equations with constant coefficients in terms of generalized functions, and at comparing the different solutions that evolve out of different initial conditions. Method. The above aim is achieved by: • defining the spaces La,b and L(w, z); • assigning an order relation on these spaces by identifying a positive cone on them and studying the properties of the cone; • defining an order relation on the dual spaces L'a,b and L'(w, z) of La,b and L(w, z), and assigning a topology to these dual spaces which makes the order dual and the topological dual coincide; • defining the adjoint of a continuous map on these spaces and studying its behaviour when the topology of bounded convergence is assigned to the dual spaces; • applying the two-sided Laplace transformation on the ordered linear space of generalized functions W and studying those properties of the transformation which are used in solving differential equations. Result. The above techniques are applied to solve nonhomogeneous nth order linear differential equations with constant coefficients in terms of generalized functions and to compare different solutions of the differential equation.
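
As background for the Result step, the key property used is that the two-sided Laplace transformation turns constant-coefficient differentiation into multiplication by powers of s, so the differential equation becomes algebraic. The following is the standard statement (for the two-sided transform on generalized functions, no initial-condition terms arise):

```latex
\mathcal{L}\{f\}(s) = F(s) = \int_{-\infty}^{\infty} e^{-st} f(t)\,dt,
\qquad
\mathcal{L}\{f^{(k)}\}(s) = s^{k} F(s);
% hence the constant-coefficient equation becomes algebraic:
\sum_{k=0}^{n} a_k\, y^{(k)}(t) = g(t)
\;\Longrightarrow\;
\Bigl(\sum_{k=0}^{n} a_k s^{k}\Bigr) Y(s) = G(s),
\qquad
Y(s) = \frac{G(s)}{\sum_{k=0}^{n} a_k s^{k}}.
```

Inverting Y(s) within an admissible region of convergence then yields a generalized-function solution; different choices correspond, loosely, to the different solutions compared in the paper.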

Keywords: Laplace transformable generalized function, positive cone, topology of bounded convergence

171 Direction to Manage OTOP Entrepreneurship Based on Local Wisdom

Authors: Witthaya Mekhum

Abstract:

OTOP entrepreneurship, which used to create a substantial source of income for local Thai communities, is now in a state of exigency that requires assistance from the public sector, owing to the over-entrepreneurship of duplicative ideas, the inability to adjust costs and prices, a lack of innovation, and inadequate quality control. Moreover, there is a recurring problem of middlemen who constantly corner the OTOP market. Local OTOP producers become easy prey, since they do not know how to add more value, how to create and maintain their own brand name, or how to create proper packaging and labeling. The suggested solutions for local OTOP producers are to adopt modern management techniques, to find the know-how to add more value to products, and to unravel other marketing problems. The objectives of this research are to study the prevalent OTOP product management and to discover directions for managing OTOP products that enhance the effectiveness of OTOP entrepreneurship in Nonthaburi Province, Thailand. There were 113 participants in this study. The research tools can be divided into two parts: the first part is a questionnaire used to gather responses on the prevalent OTOP entrepreneurship management; the second part is a focus group conducted to encapsulate ideas and local wisdom. Data analysis is performed using frequency, percentage, mean, and standard deviation, as well as the synthesis of several small group discussions. The findings reveal that 1) Business resources: the quality of the product is most important and the marketing of the product is least important. 2) Business management: leadership is most important and raw material planning is least important. 3) Business readiness: communication is most important and packaging is least important. 4) Support from the public sector: certification from the government is most important and the source of raw materials is least important.

Keywords: Management, OTOP Entrepreneurship, Local Wisdom

170 A Corporate Social Responsibility Project to Improve the Democratization of Scientific Education in Brazil

Authors: Denise Levy

Abstract:

Nuclear technology is part of our everyday life and its beneficial applications help to improve the quality of our lives. Nevertheless, in Brazil, the media and social networks most often tend to associate radiation with nuclear weapons and major accidents, and there is still great misunderstanding about the peaceful applications of nuclear science. The Educational Portal Radioatividades (Radioactivities) is a corporate social responsibility initiative that takes advantage of the growing impact of the Internet to offer high quality scientific information for teachers and students throughout Brazil. This web-based initiative focuses on the positive applications of nuclear technology, presenting the several contributions of ionizing radiation in different contexts, such as nuclear medicine, agricultural techniques, food safety and electric power generation, showing nuclear technology to be part of modern life and essential to improving our quality of life. This educational project aims to contribute to the democratization of scientific education and to social inclusion, bringing society closer to scientific knowledge, promoting critical thinking and inspiring further reflection. The website offers a wide variety of ludic activities such as curiosities, interactive exercises and short courses. Moreover, teachers are offered free web-based material with full instructions to be developed in class. Since 2013, the project has been developed and improved according to a comprehensive study of the realistic scenario of ICT infrastructure in Brazilian schools, and in full compliance with the best national and international e-learning recommendations.

Keywords: Information and communication technologies, nuclear technology, science communication, society and education.

169 The Latency-Amplitude Binomial of Waves Resulting from the Application of Evoked Potentials for the Diagnosis of Dyscalculia

Authors: Maria Isabel Garcia-Planas, Maria Victoria Garcia-Camba

Abstract:

Recent advances in cognitive neuroscience have allowed a step forward in understanding the processes involved in learning, from the point of view of both the acquisition of new information and the modification of existing mental content. The evoked potentials technique reveals how basic brain processes interact to achieve adequate and flexible behaviours. The objective of this work, using evoked potentials, is to study whether it is possible to distinguish whether a patient suffers from a specific type of learning disorder, in order to decide on the possible therapies to follow. The methodology used in this work is to analyze the dynamics of different brain areas during a cognitive activity, in order to find the relationships between the areas analyzed and so better understand the functioning of neural networks. The latest advances in neuroscience have also revealed the existence of distinct brain activity in the learning process that can be highlighted through the use of non-invasive, innocuous, low-cost and easily accessible techniques such as, among others, evoked potentials, which can help detect possible neurodevelopmental difficulties early, for their subsequent assessment and therapy. From the study of the amplitudes and latencies of the evoked potentials, it is possible to detect brain alterations in the learning process, specifically in dyscalculia, and so to define specific corrective measures for the application of personalized psycho-pedagogical plans that allow an optimal integral development of the affected people.
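
Since the analysis rests on the latency-amplitude pair of each wave, a minimal sketch of how the two are read off an averaged waveform may help; the synthetic P300-like wave, search window and sampling rate below are illustrative assumptions.

```python
import numpy as np

def peak_latency_amplitude(erp, fs, window=(0.2, 0.5), positive=True):
    """Latency (s) and amplitude of the largest deflection of an averaged
    evoked-potential waveform inside a post-stimulus search window.

    erp    : 1-D array, trial-averaged potential (e.g. in microvolts)
    fs     : sampling rate in Hz
    window : (start, stop) search interval in seconds post-stimulus
    """
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    seg = erp[i0:i1]
    idx = np.argmax(seg) if positive else np.argmin(seg)
    return (i0 + idx) / fs, seg[idx]

# synthetic example: a P300-like positive wave near 320 ms plus noise
fs = 500
t = np.arange(0, 0.8, 1 / fs)
erp = (8.0 * np.exp(-((t - 0.32) / 0.05) ** 2)
       + np.random.default_rng(2).normal(0, 0.5, t.size))
lat, amp = peak_latency_amplitude(erp, fs)
print(f"latency = {lat * 1e3:.0f} ms, amplitude = {amp:.1f} uV")
```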

Keywords: dyscalculia, neurodevelopment, evoked potentials, learning disabilities, neural networks

168 Influence of Strengthening with Perforated Steel Plates on the Behavior of Infill Walls and RC Frame

Authors: Eray Ozbek, Ilker Kalkan, S. Oguzhan Akbas, Sabahattin Aykac

Abstract:

The contribution of the infill walls to the overall earthquake response of a structure is limited, and this contribution is generally ignored in analyses. Strengthening of infill walls through different techniques has been and is being studied extensively in the literature, to increase this limited contribution and to raise the ductility and energy absorption capacity of the infill walls, creating non-structural components where earthquake-induced energy can be absorbed without damaging the bearing components of the structural frame. The present paper summarizes an extensive research project dedicated to investigating the effects of strengthening the brick infill walls of a reinforced concrete (RC) frame on its lateral earthquake response. Perforated steel plates were used in strengthening for several reasons, including the ductility and high deformation capacity of these plates; the fire-resistant, recyclable and non-carcinogenic nature of mild steel; and the ease of installing and removing the plates with the help of anchor bolts only. Furthermore, epoxy, which increases the cost and labor of the strengthening process, is not needed in this technique. The individual behavior of the strengthened walls under monotonic diagonal and lateral reversed cyclic loading was investigated within the scope of the study. Upon achieving promising results, RC frames with strengthened infill walls were tested and are being tested to examine the influence of this strengthening technique on the overall behavior of RC frames. Tests on the wall and frame specimens indicated that the perforated steel plates contribute to a major extent to the lateral strength, rigidity, ductility and energy absorption capacity of the wall and the infilled frame.

Keywords: Infill wall, Strengthening, External plate, Earthquake Behavior.

167 Feature Point Reduction for Video Stabilization

Authors: Theerawat Songyot, Tham Manjing, Bunyarit Uyyanonvara, Chanjira Sinthanayothin

Abstract:

Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and should be performed at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use, so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling. Corner detection is required only when the feature points are insufficiently accurate for future modeling. Then, optical flows are computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on the simplified affine motion model and the least squares method, with outliers belonging to moving objects present. Studentized residuals are used to eliminate such outliers. The model estimation and elimination processes repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, thus significantly reducing the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the computational cost. In addition, the feature points remaining after reduction are sufficient for background object tracking, as demonstrated in the simple video stabilizer based on our proposed algorithm.
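
The estimate-then-eliminate loop described above can be sketched as follows: fit the simplified affine model by least squares, flag points whose (approximately) studentized residuals are large, and refit. The synthetic correspondences and the residual-over-RMS studentization are illustrative simplifications of the paper's procedure.

```python
import numpy as np

def fit_affine_robust(p, q, thresh=3.0, max_iter=10):
    """Fit q ~ A p + b by least squares, repeatedly discarding outliers whose
    (approximately) studentized residuals exceed `thresh`.

    p, q : (N, 2) arrays of matched feature points in consecutive frames.
    Returns the 2x3 affine parameter matrix and the boolean inlier mask.
    """
    keep = np.ones(len(p), dtype=bool)
    for _ in range(max_iter):
        X = np.hstack([p[keep], np.ones((int(keep.sum()), 1))])   # rows [x y 1]
        M, *_ = np.linalg.lstsq(X, q[keep], rcond=None)           # (3, 2) params
        r = np.linalg.norm(q[keep] - X @ M, axis=1)               # residual norms
        t = r / (np.sqrt(np.mean(r ** 2)) + 1e-12)                # crude studentization
        if (t <= thresh).all():
            break
        idx = np.where(keep)[0][t <= thresh]
        keep = np.zeros_like(keep)
        keep[idx] = True
    return M.T, keep

rng = np.random.default_rng(3)
p = rng.uniform(0, 640, (60, 2))                       # frame-t feature points
A_true = np.array([[1.0, 0.01], [-0.01, 1.0]])         # slight global rotation/shear
q = p @ A_true.T + np.array([2.0, -1.5])               # global (camera) motion
q += rng.normal(0, 0.3, q.shape)                       # measurement noise
q[:5] += rng.uniform(20, 40, (5, 2))                   # five points on a moving object
M, inliers = fit_affine_robust(p, q)
print("inliers kept:", int(inliers.sum()), "of", len(p))
```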

Keywords: background object tracking, feature point reduction, low cost tracking, video stabilization.

166 Multiple Targets Classification and Fuzzy Logic Decision Fusion in Wireless Sensor Networks

Authors: Ahmad Aljaafreh

Abstract:

This paper proposes a hierarchical hidden Markov model (HHMM) to model the detection of M vehicles in a wireless sensor network (WSN). The HHMM contains an extra level of hidden Markov model to capture the temporal transitions of each state of the first HMM. By modeling the temporal transitions, only those hypotheses with nonzero transition probabilities need to be tested, so the method efficiently reduces the computational load, which is preferable in WSN applications. This paper integrates several techniques to optimize detection performance. The output of the states of the first HMM is modeled as a Gaussian Mixture Model (GMM), where the number of states and the number of Gaussians are experimentally determined, while the other parameters are estimated using Expectation Maximization (EM). The HHMM is used to model the sequence of local decisions, which are based on multiple hypothesis testing with a maximum likelihood approach. The states in the HHMM represent various combinations of vehicles of different types. Due to the statistical advantages of multisensor data fusion, we propose a heuristic based on fuzzy weighted majority voting to enhance the cooperative classification of moving vehicles within a region that is monitored by a wireless sensor network. A fuzzy inference system weighs each local decision based on the signal-to-noise ratio of the acoustic signal for target detection and the signal-to-noise ratio of the radio signal for sensor communication. The spatial correlation among the observations of neighboring sensor nodes is efficiently utilized, as well as the temporal correlation. Simulation results demonstrate the efficiency of this scheme.
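
The fusion heuristic admits a compact sketch: each node's class vote is weighted by fuzzy memberships of its acoustic and radio SNRs, combined with a fuzzy AND. The piecewise-linear memberships and the example numbers below are assumptions, not the paper's rule base.

```python
import numpy as np

def fused_class(local_decisions, acoustic_snr_db, radio_snr_db, n_classes):
    """Fuzzy-weighted majority vote over local classifier decisions.

    Each node's vote is weighted by a fuzzy membership of its target-side
    SNR (acoustic) and its link-side SNR (radio)."""
    def mu(snr_db, lo=0.0, hi=20.0):                  # "reliable" membership
        return np.clip((np.asarray(snr_db) - lo) / (hi - lo), 0.0, 1.0)

    w = np.minimum(mu(acoustic_snr_db), mu(radio_snr_db))   # fuzzy AND (min)
    scores = np.zeros(n_classes)
    for decision, weight in zip(local_decisions, w):
        scores[decision] += weight                    # weighted majority vote
    return int(np.argmax(scores))

# three nodes vote for vehicle classes {0, 1, 2}
print(fused_class([1, 1, 2], [15.0, 3.0, 18.0], [12.0, 10.0, 14.0], 3))
```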

Keywords: Classification, decision fusion, fuzzy logic, hidden Markov model

165 Stereo Motion Tracking

Authors: Yudhajit Datta, Jonathan Bandi, Ankit Sethia, Hamsi Iyer

Abstract:

Motion tracking and stereo vision are complicated, albeit well-understood problems in computer vision. Existing software packages that combine the two approaches to perform stereo motion tracking typically employ complicated and computationally expensive procedures. The purpose of this study is to create a simple and effective solution capable of combining the two approaches. The study explores a strategy for combining the two techniques of two-dimensional motion tracking using a Kalman filter, and object depth detection using stereo vision. In conventional approaches, objects in the scene of interest are observed using a single camera. For stereo motion tracking, however, the scene of interest is observed using video feeds from two calibrated cameras. Using two simultaneous measurements from the two cameras, the depth of the object from the plane containing the cameras is calculated. The approach attempts to capture the entire three-dimensional spatial information of each object in the scene and to represent it through a software estimator object. In discrete intervals, the estimator tracks object motion in the plane parallel to the plane containing the cameras, and updates the perpendicular distance of the object from that plane as depth. The ability to efficiently track the motion of objects in three-dimensional space using a simplified approach could prove to be an indispensable tool in a variety of surveillance scenarios. The approach may find application in settings ranging from high-security surveillance scenes, such as the premises of bank vaults, prisons or other detention facilities, to low-cost applications in supermarkets and car parking lots.
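
A minimal sketch of the two ingredients, assuming a rectified and calibrated rig with hypothetical focal length and baseline: depth follows the standard Z = fB/d relation, and a constant-velocity Kalman filter tracks the image-plane motion (one axis shown).

```python
import numpy as np

f_px, baseline = 700.0, 0.12     # focal length (pixels) and baseline (m): assumed rig

def depth_from_disparity(x_left, x_right):
    """Depth of a point from its horizontal pixel coordinates in two
    calibrated, rectified cameras: Z = f * B / disparity."""
    return f_px * baseline / (x_left - x_right)

dt = 1.0 / 30.0                          # frame interval at 30 fps
F = np.array([[1, dt], [0, 1]])          # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])               # we observe position only
Q = 1e-3 * np.eye(2)                     # process noise (tuning assumption)
R = np.array([[2.0]])                    # pixel measurement noise (assumption)
x, P = np.zeros(2), np.eye(2)

for z in [320.0, 322.1, 324.2, 326.0]:   # measured pixel positions of the feature
    x, P = F @ x, F @ P @ F.T + Q                            # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)             # Kalman gain
    x = x + K @ (np.array([z]) - H @ x)                      # update state
    P = (np.eye(2) - K @ H) @ P                              # update covariance

print("filtered [pos, vel]:", x.round(2),
      " depth (m):", depth_from_disparity(326.0, 291.0))
```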

Keywords: Kalman Filter, Stereo Vision, Motion Tracking, Matlab, Object Tracking, Camera Calibration, Computer Vision System Toolbox.

164 Revisiting Domestication and Foreignisation Methods: Translating the Quran by the Hybrid Approach

Authors: Aladdin Al-Tarawneh

Abstract:

The Quran, as the sacred book of Islam, considered the literal word of God (Allah) in Arabic, has been translated into many languages; however, the foreignising, or literal, approach excessively stains the quality and discredits the final product in the eyes of its receptors. Such an approach fails to capture the intended meaning of the Quran and to communicate it in any language. Therefore, this study proposes a different approach that combines several methods according to a hybrid model. Indeed, this study challenges the binary adherence that is prevalent in Translation Studies (TS) in general and in the translation of the Quran in particular. Drawing on the fact that the meaning of the Quran can be communicated in any language, and that the translation itself is not sacred, this paper approaches the translation of the Quran by blending different methods, such as domestication and foreignisation, in a systematic way, avoiding the binary choice made by many translators. To reach this aim, the paper has a conceptual part that seeks to elucidate and clarify the main methods employed in TS, and to criticise and modify them in order to propose the new hybrid approach (the hybrid model) for translating the Quran; that is, the deductive method. To support and validate the outcome of the previous part, a comparative model is employed in order to highlight the differences between the suggested translation and other widely used ones; that is, the inductive method. By applying this methodology, the paper shows that the foreignising approach falls short of communicating the original meaning of the Quran. In conclusion, the paper suggests that producing a Quran translation has to take into account the adoption of many techniques to express the meaning of the Quran as understood in the original, and to offer this understanding in English in the most native-like manner, so as to serve the intended target readers.

Keywords: Quran translation, hybrid approach, domestication, foreignisation, hybrid model.

163 Space Telemetry Anomaly Detection Based on Statistical PCA Algorithm

Authors: B. Nassar, W. Hussein, M. Mokhtar

Abstract:

The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case in this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All the data acquired from the spacecraft are known as telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e. a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously improved to reduce the time required to respond to changes in a satellite's state of health. A fast view of the current state of the satellite is thus very important for responding to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the above problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, this paper presents a proposed unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to data from an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm can successfully differentiate between these operating conditions. Furthermore, the algorithm provides competent prediction information, as well as adding insight and physical interpretation to the ADCS operation.
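
The monitoring pattern behind such a PCA model can be sketched in a few lines: fit principal axes on normal-mode telemetry, then flag samples whose residual (SPE/Q) statistic exceeds an empirical control limit. The stand-in random telemetry, retained-component count and 99% limit below are assumptions for illustration, not the ADCS data or the paper's exact thresholds.

```python
import numpy as np

def fit_pca_monitor(X_normal, n_comp=3):
    """Fit a PCA monitor on telemetry from normal operation; return the
    scaling, the retained principal axes and a Q-statistic control limit."""
    mu, sd = X_normal.mean(0), X_normal.std(0) + 1e-12
    Z = (X_normal - mu) / sd                        # standardize each parameter
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    V = Vt[:n_comp].T                               # retained loadings (p x n_comp)
    spe = ((Z - Z @ V @ V.T) ** 2).sum(1)           # residual (SPE / Q) statistic
    limit = np.percentile(spe, 99)                  # empirical 99% control limit
    return mu, sd, V, limit

def is_anomalous(x, mu, sd, V, limit):
    z = (x - mu) / sd
    return ((z - z @ V @ V.T) ** 2).sum() > limit

rng = np.random.default_rng(4)
X = rng.normal(size=(2000, 8))                      # stand-in 8-parameter telemetry
mu, sd, V, limit = fit_pca_monitor(X)
fault = X[0].copy(); fault[2] += 8.0                # injected sensor fault
print(is_anomalous(X[1], mu, sd, V, limit),         # normal sample -> False
      is_anomalous(fault, mu, sd, V, limit))        # faulty sample -> True
```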

Keywords: Space telemetry monitoring, multivariate analysis, PCA algorithm, space operations.

162 Importance of Risk Assessment in Managers' Decision-Making Process

Authors: Mária Hudáková, Vladimír Míka, Katarína Hollá

Abstract:

Making decisions is the core of management and the result of conscious activity which takes place in a particular environment under concrete conditions. Managers decide about goals, about procedures, and about the methods of responding to changes and to the problems that have developed. Their decisions affect the effectiveness, quality, economy and overall success of every organisation. In spite of this fact, they do not pay sufficient attention to the individual steps of the decision-making process. They place more emphasis on coping with the individual methods and techniques of making decisions, and forget about analysing the problem or assessing the individual solution variants. In many cases, underestimating the analytical phase can lead to an incorrect assessment of the problem, which can then negatively influence its further solution. Based on our analysis of the theoretical work of individual authors dealing with this area, and on research carried out in Slovakia and abroad, we recognise an insufficient interest among managers in assessing the risks in the decision-making process. The goal of this paper is to assess the risks in managers' decision-making processes relating to the conditions of the environment, to the subject's activity (the manager's personality), to the insufficient assessment of individual variants for solving the problems, and also to situations in which the arisen problem is not solved. The benefit of this paper is its effort to increase managers' need to deal with risks during the decision-making process. It is important for every manager to assess the risks in his or her decision-making process and to strive to take decisions which best reflect the basic conditions, states and development of the environment, and, especially, decisions which contribute to achieving the determined goals of the organisation as effectively as possible.

Keywords: Risk, decision-making, manager, process, analysis, source of risk.

161 The Impact of HIV/AIDS on Micro-enterprise Development in Kenya: A Study of Obunga Slum in Kisumu

Authors: C. A. Oloo, C. Ojwang

Abstract:

The performance of small and medium enterprises has stagnated in the last two decades, mainly due to the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality of the Kenyan workforce in their prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in Obunga slum, Kisumu, in terms of production loss and increasing labor-related costs, and to establish possible strategies to address this impact. The study was necessitated by the observation that most micro-enterprises in the slum face severe economic and social crisis due to the impact of HIV/AIDS; they get depleted and close down within a short time due to the death of their skilled and experienced workforce. The study was carried out between June 2008 and June 2009 in Obunga slum. Data were subjected to computer-aided statistical analysis that included descriptive statistics, chi-squared and ANOVA techniques. Chi-squared analysis of the micro-enterprise owners' opinions on the impact of HIV/AIDS on the depletion of micro-enterprises, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P < 0.01. Analysis of variance on the impact of HIV/AIDS on the performance and productivity of micro-enterprises also indicated a negative effect on the general performance of micro-enterprises at significance levels of P < 0.01. Therefore, to reduce the negative impacts of HIV/AIDS on micro-enterprise development, there is a need to improve the socio-economic environment, to mobilize donors and stakeholders in training and funding, and to review the current strategies for addressing the disease. Further conclusive research should also be conducted on a bigger scale.
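
For readers who want the mechanics of the two tests named above, the sketch below runs a chi-squared test on a contingency table and a one-way ANOVA with SciPy. All numbers are invented purely to illustrate the calls; they are not the study's data.

```python
import numpy as np
from scipy import stats

# hypothetical contingency table: attributed cause of enterprise depletion
# (rows: HIV/AIDS, other diseases) vs. reported severity (high, medium, low)
table = np.array([[48, 22, 9],
                  [15, 30, 34]])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi-squared = {chi2:.1f}, dof = {dof}, p = {p:.4f}")

# hypothetical one-way ANOVA: productivity scores of enterprises grouped by
# reported HIV/AIDS impact (none / moderate / severe)
none     = [7.1, 6.8, 7.4, 6.9, 7.2]
moderate = [5.9, 6.1, 5.6, 6.3, 5.8]
severe   = [4.2, 4.8, 4.5, 4.1, 4.6]
F, p = stats.f_oneway(none, moderate, severe)
print(f"F = {F:.1f}, p = {p:.6f}")
```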

Keywords: Entrepreneurship, HIV-AIDS, Micro-enterprise, Poverty.

160 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method

Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay

Abstract:

This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments prior to the construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done only post-construction, or else is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly toward testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description, to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards, and showed adaptability to a wide variety of situated behaviors and interactions.
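
One way to picture the event-based representation is the toy data structure below, in which an Event binds Actors and a Space to an Activity and fires only when its preconditions (room capacity, actors' internal status) hold. The classes, fields and thresholds are invented for illustration; they are not the authors' schema.

```python
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    fatigue: float = 0.0           # toy internal status influencing responses

@dataclass
class Space:
    name: str
    capacity: int
    occupants: list = field(default_factory=list)

@dataclass
class Event:
    """An activity binding several actors to a space, with preconditions."""
    activity: str
    space: Space
    actors: list

    def can_start(self):
        room = len(self.space.occupants) + len(self.actors) <= self.space.capacity
        rested = all(a.fatigue < 0.8 for a in self.actors)
        return room and rested

    def run(self):
        if not self.can_start():
            return False
        self.space.occupants += self.actors
        for a in self.actors:
            a.fatigue += 0.1       # toy update of internal status
        return True

ward = Space("ward A", capacity=4)
ev = Event("doctor round", ward, [Actor("n1", "nurse"), Actor("d1", "doctor")])
print(ev.run(), [a.name for a in ward.occupants])
```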

Keywords: Agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition.

159 Modeling and FOS Feedback Based Control of SISO Intelligent Structures with Embedded Shear Sensors and Actuators

Authors: T. C. Manjunath, B. Bandyopadhyay

Abstract:

Active vibration control is an important problem in structures. The objective of active vibration control is to reduce the vibrations of a system by automatic modification of the system's structural response. In this paper, the modeling and design of a fast output sampling (FOS) feedback controller for a smart flexible beam system embedded with shear sensors and actuators, for a SISO system, using Timoshenko beam theory, is proposed. FEM theory, Timoshenko beam theory and state space techniques are used to model the aluminum cantilever beam. For the SISO case, the beam is divided into 5 finite elements and the control actuator is placed at finite element position 1, whereas the sensor position is varied from 2 to 5, i.e., from near the fixed end to the free end. Controllers are designed using the FOS method, and the performance of the designed FOS controller is evaluated for vibration control for 4 SISO models of the same plant. The effect of placing the sensor at different locations on the beam is observed, and the performance of the controller is evaluated for vibration control. Some of the limitations of the Euler-Bernoulli theory, such as the neglect of shear and axial displacement, are addressed here, thus giving rise to an accurate beam model. Embedded shear sensors and actuators have been considered in this paper instead of surface-mounted sensors and actuators for vibration suppression because of their many advantages. In controlling the vibration modes, the first three dominant modes of vibration of the system are considered.

Keywords: Smart structure, Timoshenko beam theory, Fast output sampling feedback control, Finite Element Method, State space model, SISO, Vibration control, LMI

158 The Pedagogical Integration of Digital Technologies in Initial Teacher Training

Authors: Vânia Graça, Paula Quadros-Flores, Altina Ramos

Abstract:

The use of digital technologies in teaching and learning processes is now a reality, namely in initial teacher training. This study aims to understand the digital reality of students in initial teacher training, in order to improve training in the educational use of ICT and to promote strategies for integrating digital technology in an educational context. It is part of the IFITIC Project "Innovate with ICT in Initial Teacher Training to Promote Methodological Renewal in Pre-school Education and in the 1st and 2nd Basic Education Cycle", which involves the School of Education, Polytechnic of Porto and the Institute of Education, University of Minho. The Project aims at rethinking educational practice with ICT in the initial training of future teachers, in order to promote methodological innovation in Pre-school Education and in the 1st and 2nd Cycles of Basic Education. A qualitative methodology was used, in which a questionnaire survey was applied to teachers in initial training. For data analysis, the techniques of content analysis were used with the support of NVivo software. The results point to the following aspects: a) future teachers recognize that they have more technical knowledge of ICT than pedagogical knowledge; this result makes sense given the objective of Basic Education, and the gaps can be filled in the Master's course by students who wish to pursue teaching; b) the respondents are aware that the integration of digital resources contributes positively to students' learning and to the life of children and young people, and also promotes preparation for life; c) being a teacher in the digital age requires the development of digital literacy and lifelong learning, and the adoption of new ways of teaching how to learn. Thus, this study aims to contribute to a reflection on the teaching profession in the digital age.

Keywords: Digital technologies, initial teacher training, pedagogical use of ICT, skills.

157 Design of a Hand-Held, Clamp-on, Leakage Current Sensor for High Voltage Direct Current Insulators

Authors: Morné Roman, Robert van Zyl, Nishanth Parus, Nishal Mahatho

Abstract:

Leakage current monitoring for high voltage transmission line insulators is of interest as a performance indicator. Presently, to the best of our knowledge, there is no commercially available, clamp-on type, non-intrusive device for measuring leakage current on energised high voltage direct current (HVDC) transmission line insulators. The South African power utility, Eskom, is investigating the development of such a hand-held sensor for two important applications; first, for continuous real-time condition monitoring of HVDC line insulators and, second, for use by live line workers to determine if it is safe to work on energised insulators. In this paper, a DC leakage current sensor based on magnetic field sensing techniques is developed. The magnetic field sensor used in the prototype can also detect alternating current up to 5 MHz. The DC leakage current prototype detects the magnetic field associated with the current flowing on the surface of the insulator. Preliminary HVDC leakage current measurements are performed on glass insulators. The results show that the prototype can accurately measure leakage current in the specified current range of 1-200 mA. The influence of external fields from the HVDC line itself on the leakage current measurements is mitigated through a differential magnetometer sensing technique. Thus, the developed sensor can perform measurements on in-service HVDC insulators. The research contributes to the body of knowledge by providing a sensor to measure leakage current on energised HVDC insulators non-intrusively. This sensor can also be used by live line workers to inform them whether or not it is safe to perform maintenance on energised insulators.
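
The differential idea, subtracting a second magnetometer's reading so the line's own (nearly uniform) field cancels and only the nearby leakage path's field remains, reduces in the simplest case to the straight-wire Ampere's-law relation. The sketch below shows that inversion; the sensing radius and geometry factor are calibration assumptions, not the instrument's actual design values.

```python
import numpy as np

MU0 = 4e-7 * np.pi            # vacuum permeability (T*m/A)

def leakage_current(b_near_T, b_far_T, r_m=0.01, k_geom=1.0):
    """Estimate DC leakage current from a differential field measurement.

    Subtracting the far sensor's reading cancels the approximately uniform
    external field of the HVDC conductor; the near field of the thin leakage
    path is then inverted with the straight-wire Ampere's-law relation
    B = mu0 * I / (2*pi*r), corrected by a calibration factor k_geom that
    stands in for the real insulator geometry (an assumption).
    """
    b_leak = b_near_T - b_far_T
    return k_geom * 2.0 * np.pi * r_m * b_leak / MU0

# 50 mA at 1 cm gives B = mu0*I/(2*pi*r) = 1 uT; check the inverse mapping:
print(f"{leakage_current(1.6e-6, 0.6e-6) * 1e3:.1f} mA")
```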

Keywords: Direct current, insulator, leakage current, live line, magnetic field, sensor, transmission lines.

156 Attitudes of Gratitude: An Analysis of 30 Cancer Narratives Published by Leading U.S. Cancer Care Centers

Authors: Maria L. McLeod

Abstract:

This study examines the ways in which cancer patient narratives are portrayed and framed on the websites of three leading U.S. cancer care centers – The University of Texas MD Anderson Cancer Center in Houston, Memorial Sloan Kettering Cancer Center in New York, and Seattle Cancer Care Alliance. Thirty patient stories, 10 from each cancer center website blog, were analyzed using qualitative and quantitative textual analysis of unstructured data, documenting common themes and other elements of story structure and content. Patient narratives were coded using grounded theory as the basis for conducting emergent qualitative research. As part of a systematic, inductive approach to collecting and analyzing data, recurrent and unique themes were examined and compared in terms of positive and negative framing, patient agency, and institutional praise. All three of these cancer care centers are teaching hospitals, with university affiliations, that emphasize an evidence-based scientific approach to treatment that utilizes the latest research and cutting-edge techniques and technology. The featured cancer stories suggest positive outcomes based on anecdotal narratives as opposed to the science-based treatment models employed by the cancer centers. An analysis of 30 sample stories found skewed representation of the “cancer experience” that emphasizes positive outcomes while minimizing or excluding more negative realities of cancer diagnosis and treatment. The stories also deemphasize patient agency, instead focusing on deference and gratitude toward the cancer care centers, which are cast in the role of savior.  

Keywords: Cancer framing, cancer narratives, survivor stories, patient narratives.

155 Torsion Behavior of Steel Fibered High Strength Self Compacting Concrete Beams Reinforced by GFRP Bars

Authors: Khaled S. Ragab, Ahmed S. Eisa

Abstract:

This paper investigates experimentally and analytically the torsion behavior of steel fibered high strength self compacting concrete beams reinforced by GFRP bars. Steel fibered high strength self compacting concrete (SFHSSCC) and GFRP bars have, in recent decades, become very important materials in the structural engineering field. The use of GFRP bars to replace steel bars has emerged as one of the many techniques put forward to enhance the corrosion resistance of reinforced concrete structures. High strength concrete and GFRP bars attract designers and architects, as they allow improving the durability as well as the esthetics of a construction. One of the trends in SFHSSCC structures is to ensure their ductile behavior; an additional goal is to limit the development and propagation of macro-cracks in the body of SFHSSCC elements. SFHSSCC and GFRP bars are tough, improve workability, enhance the corrosion resistance of reinforced concrete structures, and demonstrate high residual strengths after the appearance of the first crack. Experimental studies were carried out to select effective fiber contents. Three volume fractions of hooked-shape steel fibers were used in this study; the hooked steel fibers were evaluated at volume fractions of 0.0%, 0.75% and 1.5%. The beam shape is chosen to create the required forces (i.e. torsion and bending moments acting simultaneously) on the test zone. A total of seven beams were tested, classified into three groups. All beams have a 200 cm length, a cross section of 10×20 cm, and longitudinal bottom reinforcement of 3

Keywords: Self compacting concrete, torsion behavior, steel fiber, steel fiber reinforced high strength self compacting concrete (SFRHSCC), GFRP bars.

154 Analysis of Linked in Series Servers with Blocking, Priority Feedback Service and Threshold Policy

Authors: Walenty Oniszczuk

Abstract:

Buffer thresholds, blocking and adequate service strategies are well-known techniques for traffic congestion control in computer networks. This motivates the study of series queues with blocking, feedback (service under a Head of Line (HoL) priority discipline) and finite capacity buffers with thresholds. In this paper, the external traffic is modelled using a Poisson process and the service times are modelled using an exponential distribution. We consider a three-station network with two finite buffers, for which a set of thresholds (tm1 and tm2) is defined. This computer network behaves as follows. A task which finishes its service at station B is sent back to station A for re-processing with probability o. When the number of tasks in the second buffer exceeds the threshold tm2 and the number of tasks in the first buffer is less than tm1, the fed-back task is served under the HoL priority discipline. In the opposite case, a “no two priority services in succession” procedure (preventing a possible overflow in the first buffer) is applied to fed-back tasks. Using an open Markovian queuing scheme with blocking, priority feedback service and thresholds, a closed-form, cost-effective analytical solution is obtained. The model of servers linked in series is very accurate. It is derived directly from a two-dimensional state graph and a set of steady-state equations, followed by calculations of the main measures of effectiveness. Consequently, efficient expressions of low computational cost are determined. Based on numerical experiments and the collected results, we conclude that the proposed model with blocking, feedback and thresholds can provide accurate performance estimates of networks linked in series.
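
The numerical backbone of such a model, solving the steady-state equations of a continuous-time Markov chain, can be shown on a deliberately tiny example. The three-state birth-death generator below stands in for the paper's much larger two-dimensional state graph; the rates are arbitrary.

```python
import numpy as np

def steady_state(Q):
    """Steady-state distribution of a CTMC from its generator matrix Q
    (rows sum to zero): solve pi @ Q = 0 subject to sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])          # append the normalisation row
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# toy single buffer of capacity 2 (states = number of tasks in the buffer)
lam, mu = 2.0, 3.0                            # arrival and service rates
Q = np.array([[-lam,        lam,  0.0],
              [  mu,  -(lam+mu),  lam],
              [ 0.0,         mu,  -mu]])
pi = steady_state(Q)
print("pi =", pi.round(4), " mean occupancy =", float((pi * np.arange(3)).sum()))
```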

Keywords: Blocking, Congestion control, Feedback, Markov chains, Performance evaluation, Threshold-base networks.

153 Atomic Force Microscopy (AFM) Topographical Surface Characterization of Multilayer-Coated and Uncoated Carbide Inserts

Authors: Samy E. Oraby, Ayman M. Alaskari

Abstract:

In recent years, scanning probe atomic force microscopy (SPM AFM) has gained acceptance over a wide spectrum of research and science applications. Most fields focus on physical, chemical and biological aspects, while less attention is devoted to manufacturing and machining. The purpose of the current study is to assess the possible implementation of SPM AFM features and the NanoScope software in general machining applications, with special attention to the tribological aspects of the cutting tool. The surface morphology of coated and uncoated as-received carbide inserts is examined, analyzed, and characterized through the determination of the appropriate scanning settings, the suitable data-type imaging techniques and the most representative data analysis parameters, using the MultiMode SPM AFM in contact mode. The NanoScope operating software is used to capture three data types of images in real time: “Height”, “Deflection” and “Friction”. Three scan sizes are independently performed: 2, 6, and 12 μm, with a 2.5 μm vertical range (Z). Offline analysis includes the determination of three functional topographical parameters: surface “Roughness”, power spectral density “PSD” and “Section”. The 12 μm scan size, in association with “Height” imaging, is found efficient for capturing the tiny features and tribological aspects of the examined surface. Also, “Friction” analysis is found to produce a comprehensive explanation of the lateral characteristics of the scanned surface. Many surface defects and drawbacks have been precisely detected and analyzed.
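
Two of the offline parameters named here, RMS roughness and the PSD, are standard computations on the height matrix; a sketch follows. The synthetic ripple-plus-noise surface and the per-line PSD normalisation are assumptions, and NanoScope's own conventions may differ.

```python
import numpy as np

def roughness_and_psd(height, dx):
    """RMS roughness (Sq) and 1-D power spectral density of an AFM height
    map, line-averaged along the fast-scan axis.

    height : (rows, cols) array of heights (same unit as the output, e.g. nm)
    dx     : pixel spacing in the scan direction (e.g. um)
    """
    h = height - height.mean()
    sq = np.sqrt(np.mean(h ** 2))                   # RMS roughness
    spec = np.abs(np.fft.rfft(h, axis=1)) ** 2      # per-scan-line spectra
    psd = spec.mean(axis=0) * dx / h.shape[1]       # average and normalise
    freqs = np.fft.rfftfreq(h.shape[1], d=dx)       # spatial frequencies (1/um)
    return sq, freqs, psd

# synthetic 12 um x 12 um, 512 x 512 height image with a 2 um ripple plus noise
rng = np.random.default_rng(5)
x = np.linspace(0, 12, 512)
height = 5.0 * np.sin(2 * np.pi * x / 2.0) + rng.normal(0, 1.0, (512, 512))
sq, freqs, psd = roughness_and_psd(height, dx=12 / 512)
print(f"Sq = {sq:.2f} nm, PSD peak at {freqs[np.argmax(psd[1:]) + 1]:.2f} 1/um")
```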

Keywords: SPM AFM contact mode, carbide inserts, scan size, surface defects, surface roughness, PSD.

152 Evaluation of the Discoloration of Methyl Orange Using Black Sand as Semiconductor through Photocatalytic Oxidation and Reduction

Authors: P. Acosta-Santamaría, A. Ibatá-Soto, A. López-Vásquez

Abstract:

Organic compounds in wastewaters coming from the textile and pharmaceutical industries generate multiple harmful effects on the environment and human health. One of them is methyl orange (MeO), an azo dye considered to be a recalcitrant compound. Heterogeneous photocatalysis emerges as an alternative for treating this type of hazardous compound, through the generation of OH radicals using radiation and a semiconductor oxide. To the authors' knowledge, catalysts such as TiO2 doped with metals show high efficiency in degrading MeO; however, this presents economic limitations on an industrial scale. Black sand can be considered a naturally doped catalyst because its structure commonly contains compounds such as titanium, iron and aluminum oxides, as well as elements such as zircon, cadmium, manganese, etc. This study reports the photocatalytic activity of the mineral black sand used as a semiconductor in the discoloration of MeO by oxidation and reduction photocatalytic techniques. For this, magnetic composites from the mineral were prepared (RM, M1, M2 and NM) and their activity was tested through MeO discoloration, while TiO2 was used as reference. For the fractions, chemical, morphological and structural characterizations were performed using Scanning Electron Microscopy with Energy Dispersive X-Ray (SEM-EDX), X-Ray Diffraction (XRD) and X-Ray Fluorescence (XRF) analysis. The M2 fraction showed the highest MeO discoloration (93%) under oxidation conditions at pH 2, which could be due to the presence of ferric oxides. However, the best result for the reduction process was obtained using the M1 fraction (20%) at pH 2, which contains a higher titanium percentage. In the first process, hydrogen peroxide (H2O2) was used as an electron donor agent. According to the results, black sand mineral can be used as a natural semiconductor in photocatalytic processes. It could be considered a photocatalyst precursor in such processes, due to its low cost and easy access.
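
The discoloration percentages quoted here are conventionally obtained from UV-Vis absorbance readings at the dye's absorption maximum, assuming Beer-Lambert proportionality between absorbance and concentration; a one-function sketch (with invented absorbance readings) follows.

```python
def discoloration_pct(abs_initial: float, abs_final: float) -> float:
    """Percent discoloration of a dye solution from UV-Vis absorbance at its
    absorption maximum, assuming absorbance is proportional to concentration
    (Beer-Lambert): D% = 100 * (A0 - At) / A0."""
    return 100.0 * (abs_initial - abs_final) / abs_initial

# invented readings giving roughly the 93% reported for the M2 oxidation run
print(f"{discoloration_pct(1.25, 0.09):.1f} %")
```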

Keywords: Black sand mineral, methyl orange, oxidation, photocatalysis, reduction.

151 Evaluating the Validity of Computational Fluid Dynamics Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements

Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck

Abstract:

This research presents a validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from a previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The numerical model showed maximum average percent errors of approximately 53% and 37% for winds incident from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the mean geometric bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
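
The three acceptance metrics cited are standard in dispersion-model evaluation and easy to compute from paired observed/predicted concentrations; the sketch below uses the usual Chang-and-Hanna-style definitions with invented sample values (a perfect model gives FB = 0, MG = 1, NMSE = 0).

```python
import numpy as np

def validation_stats(obs, pred):
    """Fractional bias, geometric mean bias and normalised mean square error
    for paired observed (obs) and predicted (pred) concentrations."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    fb = (obs.mean() - pred.mean()) / (0.5 * (obs.mean() + pred.mean()))
    mg = np.exp(np.log(obs).mean() - np.log(pred).mean())   # needs positive data
    nmse = np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())
    return fb, mg, nmse

# hypothetical paired concentrations (observed vs. CFD-predicted)
obs = [1.2, 0.8, 2.5, 3.1, 0.6]
pred = [1.0, 0.9, 2.2, 3.5, 0.5]
fb, mg, nmse = validation_stats(obs, pred)
print(f"FB = {fb:.3f}, MG = {mg:.3f}, NMSE = {nmse:.3f}")
```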

Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow.

150 Optimization and Validation for Determination of VOCs from Lime Fruit Citrus aurantifolia (Christm.) with and without California Red Scale Aonidiella aurantii (Maskell) Infested by Using HS-SPME-GC-FID/MS

Authors: K. Mohammed, M. Agarwal, J. Mewman, Y. Ren

Abstract:

An optimal technique has been developed for extracting the volatile organic compounds that contribute to the aroma of lime fruit (Citrus aurantifolia). The volatile organic compounds of healthy lime fruit and of fruit infested with the California red scale Aonidiella aurantii were characterized using headspace solid phase microextraction (HS-SPME) combined with gas chromatography (GC) coupled with flame ionization detection (FID), and gas chromatography with mass spectrometry (GC-MS), as a very simple, efficient and nondestructive extraction method. A three-phase 50/30 μm PDMS/DVB/CAR fibre was used for the extraction process. The optimal sealing and fibre exposure times for volatiles from whole lime fruit to reach equilibrium in the headspace of the chamber were 16 and 4 hours, respectively. A desorption time of 5 min was selected for the three-phase fibre. Herbivore activity induces indirect plant defenses, such as the emission of herbivore-induced plant volatiles (HIPVs), which can be used by natural enemies for host location. GC-MS analysis showed qualitative differences among the volatiles emitted by infested and healthy lime fruit. The GC-MS analysis allowed the initial identification of 18 compounds, with similarities higher than 85%, in accordance with the NIST mass spectral library. One of these, D-limonene, was increased by A. aurantii infestation, and three, undecane, α-farnesene and 7-epi-α-selinene, were decreased. From an applied point of view, the application of the above-mentioned VOCs may help boost the efficiency of biocontrol programs and natural enemy production techniques.

Keywords: Lime fruit, Citrus aurantifolia, California red scale, Aonidiella aurantii, VOCs, HS-SPME/GC-FID-MS.
