Search results for: computer code
2583 FPGA Implementation of Novel Triangular Systolic Array Based Architecture for Determining the Eigenvalues of Matrix
Authors: Soumitr Sanjay Dubey, Shubhajit Roy Chowdhury, Rahul Shrestha
Abstract:
In this paper, we present a novel approach for calculating the eigenvalues of any matrix, for the first time on a Field Programmable Gate Array (FPGA), using a Triangular Systolic Array (TSA) architecture. Conventionally, an additional computation unit is required to make the architecture compliant with the eigenvalue algorithm, which in turn increases delay and power consumption. Moreover, recently reported works are dedicated only to symmetric matrices or to specific classes of matrices. This work presents an architecture to calculate the eigenvalues of any matrix based on the QR algorithm, which is fully implementable on FPGA. For the implementation of the QR algorithm we have used the TSA architecture, which in turn utilises the CORDIC (CO-ordinate Rotation DIgital Computer) algorithm to calculate the various trigonometric and arithmetic functions involved in the procedure. The proposed architecture gives an error in the range of 10−4. Power consumption of the design is 0.598 W, and it can operate at a frequency of 900 MHz.
Keywords: coordinate rotation digital computer, three angle complex rotation, triangular systolic array, QR algorithm
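As a point of reference, the following minimal Python sketch illustrates the basic (unshifted) QR iteration that underlies the approach: repeated QR factorizations drive the matrix toward a triangular form whose diagonal approximates the eigenvalues. It is only a software illustration under our own assumptions (iteration count, symmetric test matrix), not the authors' FPGA/CORDIC implementation; the unshifted form shown converges only for well-separated real eigenvalues.

import numpy as np

def qr_eigenvalues(A, iterations=500):
    """Approximate the eigenvalues of a square matrix with the basic QR algorithm.

    This is only an illustrative software sketch; the paper maps the same idea
    onto a triangular systolic array with CORDIC rotations in hardware.
    """
    Ak = np.array(A, dtype=float)
    for _ in range(iterations):
        Q, R = np.linalg.qr(Ak)   # orthogonal-triangular factorization
        Ak = R @ Q                # similarity transform preserves eigenvalues
    return np.diag(Ak)            # diagonal converges to the (real) eigenvalues

if __name__ == "__main__":
    A = np.array([[4.0, 1.0, 2.0],
                  [1.0, 3.0, 0.0],
                  [2.0, 0.0, 1.0]])
    print(qr_eigenvalues(A))
    print(np.linalg.eigvals(A))   # reference values for comparison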
Procedia PDF Downloads 415
2582 A Semantical Investigation on Physician Assisted Suicide in Canada between 1993 and 2015
Authors: Gabrielle Pilliat
Abstract:
The Supreme Court of Canada rendered unconstitutional the sections of the Canadian Criminal Code which prohibited physician-assisted suicide in February 2015. However, in 1993, the same Supreme Court of Canada ruled that physician-assisted suicide should remain absolutely prohibited. In the light of these historical facts, we will explore how the Supreme Court of Canada was able to make two different decisions 20 years apart. To understand how Canada could rule so differently between 1993 and 2015 on physician-assisted suicide, we will analyze the content of the discourse of the Supreme Court of Canada decisions of 1993 and 2015. Our preliminary results indicate that A) patient autonomy (or personal choice) has taken over the idea of the preservation of life (or the sacred character of life) in 2015; B) that between 1993 and 2015, the physician is seen differently by the judges: as an abusive murderer in 1993 and as an objective evaluator in 2015; C) that the patient is seen as a victim in 1993 and more like a hero in 2015.
Keywords: physician-assisted suicide, patient autonomy, choice, sacred character of life, dignity
Procedia PDF Downloads 274
2581 Performance Analysis of Multichannel OCDMA-FSO Network under Different Pervasive Conditions
Authors: Saru Arora, Anurag Sharma, Harsukhpreet Singh
Abstract:
To meet the growing need for high data rates and bandwidth, various efforts have been made towards efficient communication systems. Optical Code Division Multiple Access (OCDMA) over a Free Space Optics (FSO) communication system appears to be an effective way of providing transmission at high data rates with a low bit error rate and a low amount of multiple access interference. This paper demonstrates an OCDMA over FSO communication system up to a range of 7000 m at a data rate of 5 Gbps. Initially, the 8-user OCDMA-FSO system is simulated and pseudo-orthogonal codes are used for encoding. A simulative analysis of various performance parameters, such as power and core effective area, that have an effect on the bit error rate (BER) of the system is also carried out. The simulative analysis reveals that the transmission length is limited by the multiple access interference (MAI) effect, which arises when the number of users in the system increases.
Keywords: FSO, PSO, bit error rate (BER), OptiSystem simulation, multiple access interference (MAI), q-factor
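For readers unfamiliar with the Q-factor listed in the keywords, it is commonly converted to an estimated bit error rate through the Gaussian-noise approximation; the short Python sketch below shows that generic relation with illustrative Q values, and is not output from the authors' OptiSystem model.

import math

def ber_from_q(q_factor):
    """Estimate BER from the Q-factor using the Gaussian-noise approximation
    BER = 0.5 * erfc(Q / sqrt(2)), commonly used in optical link budgets."""
    return 0.5 * math.erfc(q_factor / math.sqrt(2.0))

if __name__ == "__main__":
    for q in (3.0, 6.0, 7.0):
        print(f"Q = {q:.1f} -> BER ~ {ber_from_q(q):.2e}")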
Procedia PDF Downloads 366
2580 A Criterion for Evaluating Plastic Loads: Plastic Work-Tangent Criterion
Authors: Ying Zhang
Abstract:
In the ASME Boiler and Pressure Vessel Code, the plastic load is defined by applying the twice elastic slope (TES) criterion of plastic collapse to a characteristic load-deformation curve for the vessel. Several other plastic criteria, such as the tangent intersection (TI) criterion and the plastic work (PW) criterion, have been proposed in the literature, but all exhibit a practical limitation: it is difficult to define the load parameter for vessels subject to several combined loads. An alternative criterion, the plastic work-tangent (PWT) criterion, for evaluating the plastic load in pressure vessel design by analysis is presented in this paper. According to the plastic work-load curve, when the tangent variation is less than a given value in the plastic phase, the corresponding load is the plastic load. Application of the proposed criterion is illustrated by considering the elastic-plastic response of the lower head of a reactor pressure vessel (RPV) and an RPV nozzle intersection. It is proposed that this is because the PWT criterion more fully represents the constraining effect of material strain hardening on the spread of plastic deformation and is more efficient in evaluating the plastic load.
Keywords: plastic load, plastic work, strain hardening, plastic work-tangent criterion
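For context, the twice-elastic-slope construction referred to at the start of the abstract can be sketched in a few lines of Python. The collapse limit line, the synthetic load-deflection curve and the intersection logic below are schematic assumptions on our part; this is not the paper's PWT implementation.

import numpy as np

def plastic_load_tes(deflection, load):
    """Estimate the plastic load with a twice-elastic-slope (TES) type construction.

    A collapse limit line is drawn from the origin with twice the elastic
    compliance (half the elastic slope) of the load-deflection curve; the
    plastic load is taken at its intersection with the curve.  Schematic sketch
    only, not the PWT criterion proposed in the paper.
    """
    d = np.asarray(deflection, dtype=float)
    p = np.asarray(load, dtype=float)
    elastic_slope = p[1] / d[1]              # initial (elastic) stiffness
    tes_line = 0.5 * elastic_slope * d       # line with half the elastic slope
    below = p <= tes_line                    # curve falls onto / below the TES line
    return p[np.argmax(below[1:]) + 1] if below[1:].any() else None

if __name__ == "__main__":
    # synthetic load-deflection curve: linear up to P = 6, then strain hardening
    d = np.linspace(0.0, 2.0, 400)
    p = np.minimum(10.0 * d, 6.0 + 1.5 * (d - 0.6).clip(min=0.0) ** 0.7)
    print("TES plastic load ~", plastic_load_tes(d, p))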
Procedia PDF Downloads 355
2579 Validation of Codes Dragon4 and Donjon4 by Calculating Keff of a Slowpoke-2 Reactor
Authors: Otman Jai, Otman Elhajjaji, Jaouad Tajmouati
Abstract:
Several neutronic calculation codes must be used to solve the transport equation at different levels of discretization, each of which requires a specific model. Such a chain of models, known as a calculation scheme, yields the neutron flux in a reactor from its geometry, its isotopic compositions and a cross-section library. Being small in size, the Slowpoke-2 reactor is difficult to model due to the importance of neutron leakage. In this paper, the simulation model is presented (geometry, cross-section library, assumptions, etc.), and the results obtained with the DRAGON4/DONJON4 codes are compared to calculations performed with the Monte Carlo code MCNP, using a detailed geometrical model of the reactor, and to the experimental data. Criticality calculations have been performed to verify and validate the model. Since the created model properly describes the reactor core, it can be used for calculations of reactor core parameters and for optimization of research reactor applications.
Keywords: transport equation, Dragon4, Donjon4, neutron flux, effective multiplication factor
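For context, criticality results from different codes are often compared through the reactivity difference expressed in pcm; the small Python sketch below shows that conversion with purely illustrative keff values, not the DRAGON4/DONJON4 or MCNP results.

def reactivity_discrepancy_pcm(keff_a, keff_b):
    """Reactivity difference between two effective multiplication factors,
    expressed in pcm (1e-5): delta_rho = (1/k_a - 1/k_b) * 1e5."""
    return (1.0 / keff_a - 1.0 / keff_b) * 1.0e5

if __name__ == "__main__":
    # illustrative values only
    print(f"{reactivity_discrepancy_pcm(1.00250, 1.00410):.1f} pcm")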
Procedia PDF Downloads 471
2578 Effects of the Non-Newtonian Viscosity of Blood on Flow Field in a Constricted Artery with a Porous Plaque
Authors: Maedeh Shojaeizadeh, Amirreza Yeganegi
Abstract:
Nowadays many people lose their lives due to cardiovascular diseases. Inappropriate food habits and lack of exercise expedite the deposition of fatty substances on the inner surface of blood arteries. This abnormal lump disturbs uniform blood flow and reduces oxygen delivery to active organs. This work presents a numerical simulation of non-Newtonian blood flow in a stenosed vessel. The vessel is considered as a two-dimensional channel and the plaque area is modelled as a homogeneous porous medium. To simulate the blood flow behaviour around the stenosis region, we use a C++ code and solve the coupled Cauchy momentum, Darcy, continuity and energy equations. The analysis results show that the power-law index (n) plays an important role in flow separation and in the size of the eddy at the downstream edge of the plaque. It is also observed that with increasing n, the temperature discontinuity and the likelihood of vessel rupture decline.
Keywords: blood flow, computational fluid dynamic, porosity, power law fluid
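To make the role of the power-law index concrete, the Ostwald-de Waele (power-law) viscosity model can be evaluated in a few lines of Python; the consistency index and power-law index below are typical literature values for blood and are assumptions for illustration, not the parameters used in this study.

import numpy as np

def power_law_viscosity(shear_rate, m=0.017, n=0.708):
    """Apparent viscosity of a power-law fluid, mu_app = m * gamma_dot**(n - 1),
    in Pa.s for a shear rate in 1/s; m and n are assumed blood-like values."""
    gamma = np.asarray(shear_rate, dtype=float)
    return m * gamma ** (n - 1.0)

if __name__ == "__main__":
    for rate in (1.0, 10.0, 100.0):                       # shear rate in 1/s
        print(rate, float(power_law_viscosity(rate)))     # apparent viscosity in Pa.s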
Procedia PDF Downloads 461
2577 Simulation of Optimum Sculling Angle for Adaptive Rowing
Authors: Pornthep Rachnavy
Abstract:
The purpose of this paper is twofold. First, we believe that there is a significant relationship between sculling angle and sculling style in adaptive rowing. Second, we introduce a methodology, namely simulation, to identify the effectiveness of adaptive rowing. For our study we simulate the arms-only single scull of adaptive rowing. The way to row the 1000-meter distance fastest was investigated by studying the sculling angle using simulation modeling. A simulation model of a rowing system was developed using the Matlab software package, based on equations of motion that include many variables affecting boat motion, such as oar length, blade velocity and sculling style. The boat speed, power and energy consumption of the system were computed. This simulation model can predict the force acting on the boat. The optimum sculling angle was determined by computer simulation. Inputs to the model are the sculling style of each rower and the sculling angle; the output of the model is the boat velocity at 1000 meters. The present study suggests that an optimum sculling angle exists and depends on the sculling style. The optimum angles for blade entry and release with respect to the perpendicular through the pin are -57.00 and 22.0 degrees for the first style, -57.00 and 22.0 degrees for the second style, -51.57 and 28.65 degrees for the third style, and -45.84 and 34.38 degrees for the fourth style. A theoretical simulation for rowing has been developed and presented. The results suggest that it may be advantageous for rowers to select sculling angles appropriate to their sculling styles. The findings of this paper can be summarized in three points: 1. There is an optimum sculling angle in the arms-only single scull of adaptive rowing. 2. The optimum sculling angles depend on the sculling styles. 3. Computer simulation of rowing can identify opportunities for improving rowing performance by utilizing the kinematic description of rowing. The freedom to explore alternatives in speed, thrust and timing with the computer simulation will provide the coach with a tool for systematic assessment of rowing technique. In addition, the ability to use the computer to examine the very complex movements during rowing will help both the rower and the coach to conceptualize components of movement that may have been previously unclear or even undefined.
Keywords: simulation, sculling, adaptive, rowing
Procedia PDF Downloads 465
2576 Digital Musical Organology: The Audio Games: The Question of “A-Musicological” Interfaces
Authors: Hervé Zénouda
Abstract:
This article seeks to shed light on an emerging creative field: "audio games," at the crossroads between video games and computer music. Indeed, many applications, which propose entertaining audio-visual experiences with the objective of musical creation, are available today on different platforms (game consoles, computers, cell phones). The originality of this field is the use of the gameplay of video games applied to music composition. Thus, composing music using interfaces, but also cognitive logics, that we qualify as "a-musicological" seems to us particularly interesting from the perspective of digital musical organology. This field raises questions about the representation of sound and musical structures and develops new instrumental gestures and strategies of musical composition. We will try in this article to define the characteristics of this field by highlighting some historical milestones (abstract cinema, game theory in music, actions, and graphic scores) as well as the novelties brought by digital technologies.
Keywords: audio-games, video games, computer generated music, gameplay, interactivity, synesthesia, sound interfaces, relationships image/sound, audiovisual music
Procedia PDF Downloads 113
2575 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D'Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a rather wide research activity, in collaboration with the Euro-Mediterranean Center for Climate Change, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of 3 distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped on GPU architectures.
Keywords: data assimilation, GPU architectures, ocean models, parallel algorithm
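For orientation, the functional minimized in 3DVar can be written down and tested in a few lines of Python; the dense-matrix cost function and closed-form analysis below are a generic textbook sketch with made-up covariances and observations, not the DD-DA decomposition or the authors' GPU kernels.

import numpy as np

def threedvar_cost(x, xb, y, H, B, R):
    """3DVar cost J(x) = (x-xb)^T B^-1 (x-xb) + (Hx-y)^T R^-1 (Hx-y);
    the conventional 1/2 factor is omitted, which does not change the minimizer."""
    db = x - xb
    do = H @ x - y
    return float(db @ np.linalg.solve(B, db) + do @ np.linalg.solve(R, do))

def threedvar_analysis(xb, y, H, B, R):
    """Closed-form minimizer in Kalman-gain form:
    xa = xb + B H^T (H B H^T + R)^-1 (y - H xb)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

if __name__ == "__main__":
    n, m = 4, 2
    xb = np.zeros(n)                      # background state
    H = np.eye(m, n)                      # observe the first two components
    y = np.array([1.0, -0.5])             # observations
    B = 0.5 * np.eye(n)                   # background error covariance
    R = 0.1 * np.eye(m)                   # observation error covariance
    xa = threedvar_analysis(xb, y, H, B, R)
    print("analysis:", xa)
    print("J(xb) =", threedvar_cost(xb, xb, y, H, B, R),
          " J(xa) =", threedvar_cost(xa, xb, y, H, B, R))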
Procedia PDF Downloads 413
2574 Robust and Dedicated Hybrid Cloud Approach for Secure Authorized Deduplication
Authors: Aishwarya Shekhar, Himanshu Sharma
Abstract:
Data deduplication is one of the important data compression techniques for eliminating duplicate copies of repeating data, and it has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. In this process, duplicate data is expunged, leaving only one copy, i.e., a single instance of the data, to be stored. However, indexing of all the data is still maintained. Data deduplication is an approach for minimizing the amount of storage space an organization requires to retain its data. In most companies, the storage systems hold identical copies of numerous pieces of data. Deduplication eliminates these additional copies by saving just one copy of the data and replacing the other copies with pointers that lead back to the primary copy. To avoid this duplication of data and to preserve confidentiality in the cloud, we apply the concept of a hybrid cloud. A hybrid cloud is a fusion of at least one public and one private cloud. As a proof of concept, we implement Java code which provides security as well as removes all types of duplicated data from the cloud.
Keywords: confidentiality, deduplication, data compression, hybridity of cloud
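Although the paper's proof of concept is in Java, the single-instance storage idea itself can be sketched briefly; the toy Python store below (a hypothetical class, with an arbitrary block size) keeps one copy per content hash and replaces duplicates with hash "pointers" in an index, and is not the hybrid-cloud scheme of the paper.

import hashlib

class DedupStore:
    """Minimal illustration of single-instance storage: each block is stored
    once, keyed by its content hash, and duplicates are replaced by pointers
    (hash references) kept in an index."""

    def __init__(self):
        self.blocks = {}   # hash -> unique data block
        self.index = {}    # file name -> list of block hashes (the "pointers")

    def put(self, name, data, block_size=4096):
        hashes = []
        for i in range(0, len(data), block_size):
            chunk = data[i:i + block_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.blocks.setdefault(digest, chunk)   # store only the first copy
            hashes.append(digest)
        self.index[name] = hashes

    def get(self, name):
        return b"".join(self.blocks[h] for h in self.index[name])

if __name__ == "__main__":
    store = DedupStore()
    payload = b"the same report " * 1000
    store.put("copy1.txt", payload)
    store.put("copy2.txt", payload)          # duplicate upload
    assert store.get("copy2.txt") == payload
    print("unique blocks stored:", len(store.blocks))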
Procedia PDF Downloads 384
2573 Examining Relationship between Programming Performance, Programming Self Efficacy and Math Success
Authors: Mustafa Ekici, Sacide Güzin Mazman
Abstract:
Programming is one of the abilities in computer science that is generally perceived as difficult by students, and various individual differences have been implicated in programming success. Although several factors that affect programming ability have been identified over the years, there is still not a full understanding of why some students learn to program easily and quickly while others find it complex and difficult. Programming self-efficacy and mathematics success are two of those essential individual differences that are regarded as having an important effect on programming success. This study aimed to identify the relationship between programming performance, programming self-efficacy and mathematics success. The study group consisted of 96 undergraduates from the Department of Econometrics of Uşak University. 38 (39.58%) of the participants are female, while 58 (60.41%) are male. The study was conducted in the Programming I course during the 2014-2015 fall term. Data collection tools comprise programming course final grades, a programming self-efficacy scale and a mathematics achievement test. Data was analyzed through correlation analysis. The results will be reported in the full text of the study.
Keywords: programming performance, self efficacy, mathematic success, computer science
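For context, the correlation analysis mentioned above amounts to computing pairwise correlation coefficients between the three measures; the Python sketch below does this on entirely made-up scores, not the study's data.

import numpy as np

# Hypothetical scores for a handful of students, for illustration only.
programming = np.array([65, 72, 80, 54, 90, 77, 61, 85], dtype=float)
self_efficacy = np.array([3.1, 3.6, 4.2, 2.8, 4.6, 3.9, 3.0, 4.4])
math_success = np.array([58, 70, 75, 50, 88, 72, 60, 81], dtype=float)

for name, scores in (("self-efficacy", self_efficacy), ("math success", math_success)):
    r = np.corrcoef(programming, scores)[0, 1]   # Pearson correlation coefficient
    print(f"programming performance vs {name}: r = {r:.2f}")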
Procedia PDF Downloads 502
2572 Application of a Hybrid Modified Blade Element Momentum Theory/Computational Fluid Dynamics Approach for Wind Turbine Aerodynamic Performance Prediction
Authors: Samah Laalej, Abdelfattah Bouatem
Abstract:
In the field of wind turbine blades, it is complicated to evaluate the aerodynamic performance through experimental measurements, as they require a lot of time and resources. Therefore, in this paper, a hybrid BEM-CFD numerical technique is developed to predict the power and the aerodynamic forces acting on the blades. A computational fluid dynamics (CFD) simulation was conducted to calculate the drag and lift forces with the Ansys software using the k-ω model. Then an enhanced BEM code was created to predict the power output generated by the wind turbine using the aerodynamic properties extracted from the CFD approach. The numerical approach was compared and validated with experimental data. The power curves calculated with this hybrid method were in good agreement with experimental measurements for all velocity ranges.
Keywords: blade element momentum, aerodynamic forces, wind turbine blades, computational fluid dynamics approach
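To illustrate the blade element momentum side of such a hybrid approach, the Python sketch below sums blade-element torque contributions using a tabulated lift/drag polar that stands in for the CFD-extracted coefficients. The polar, blade geometry, operating point and the omission of tip-loss and high-induction corrections are all simplifying assumptions for illustration, not the paper's modified BEM.

import numpy as np

def bem_power(v_wind, omega, radii, chord, twist, polar_alpha, polar_cl, polar_cd,
              n_blades=3, rho=1.225, iters=100):
    """Minimal blade element momentum (BEM) estimate of rotor power (no tip loss)."""
    dr = np.gradient(radii)
    torque = 0.0
    for r, c, beta, d in zip(radii, chord, twist, dr):
        a, at = 0.0, 0.0                               # axial / tangential induction
        for _ in range(iters):
            phi = np.arctan2((1.0 - a) * v_wind, (1.0 + at) * omega * r)
            alpha = np.degrees(phi) - beta
            cl = np.interp(alpha, polar_alpha, polar_cl)
            cd = np.interp(alpha, polar_alpha, polar_cd)
            cn = cl * np.cos(phi) + cd * np.sin(phi)   # normal (thrust) direction
            ct = cl * np.sin(phi) - cd * np.cos(phi)   # tangential (torque) direction
            sigma = n_blades * c / (2.0 * np.pi * r)   # local solidity
            a = 1.0 / (4.0 * np.sin(phi) ** 2 / (sigma * cn) + 1.0)
            at = 1.0 / (4.0 * np.sin(phi) * np.cos(phi) / (sigma * ct) - 1.0)
        w2 = ((1.0 - a) * v_wind) ** 2 + ((1.0 + at) * omega * r) ** 2
        torque += 0.5 * rho * w2 * n_blades * c * ct * r * d
    return torque * omega

if __name__ == "__main__":
    # crude polar and blade layout, assumed purely for illustration
    alpha_tab = np.array([-10.0, 0.0, 5.0, 10.0, 15.0])
    cl_tab = np.array([-0.5, 0.3, 0.8, 1.1, 1.2])
    cd_tab = np.array([0.02, 0.01, 0.012, 0.02, 0.04])
    radii = np.linspace(1.0, 5.0, 20)                  # small 5 m rotor
    chord = np.linspace(0.4, 0.15, 20)
    twist = np.linspace(12.0, 2.0, 20)                 # degrees
    print("power ~", bem_power(8.0, 6.0, radii, chord, twist,
                               alpha_tab, cl_tab, cd_tab), "W")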
Procedia PDF Downloads 67
2571 Investigation of Roll-Off Factor in Pulse Shaping Filter on Maximal Ratio Combining for CDMA 2000 System
Authors: G. S. Walia, H. P. Singh, D. Padma
Abstract:
The integration of a wide variety of communication services is made possible by the invention of 3G technology. Code Division Multiple Access 2000 operates on various RF channel bandwidths with chip rates of 1.2288 or 3.6864 Mcps (1x or 3x systems). It is a 3G system which offers high bandwidth and wireless broadband services, but its efficiency is lowered due to various factors like fading, interference, scattering, absorption, etc. This paper investigates the effect of diversity (MRC) and of the roll-off factor in the Root Raised Cosine (RRC) filter for the BPSK and QPSK modulation schemes. It is possible to transmit data with minimum inter-symbol interference and within a limited bandwidth with a proper pulse shaping technique. Bit error rate (BER) performance is analyzed by applying the diversity technique and varying the roll-off factor for BPSK and QPSK. The roll-off factor reduces the ISI and diversity reduces the fading.
Keywords: CDMA2000, root raised cosine, roll-off factor, ISI, diversity, interference, fading
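For background, the roll-off factor enters through the root raised cosine impulse response; the Python sketch below evaluates the standard RRC response and compares the residual tail (a proxy for ISI) for two assumed roll-off values. The sampling grid and roll-off values are illustrative choices, not the paper's simulation settings.

import numpy as np

def rrc_impulse(t, beta, T=1.0):
    """Root raised cosine impulse response for roll-off beta (0 < beta <= 1) and
    symbol period T, with the usual special cases at t = 0 and t = +/- T/(4*beta)."""
    t = np.asarray(t, dtype=float)
    h = np.zeros_like(t)
    zero = np.isclose(t, 0.0)
    brk = np.isclose(np.abs(t), T / (4.0 * beta))
    h[zero] = (1.0 + beta * (4.0 / np.pi - 1.0)) / np.sqrt(T)
    h[brk] = (beta / np.sqrt(2.0 * T)) * (
        (1.0 + 2.0 / np.pi) * np.sin(np.pi / (4.0 * beta))
        + (1.0 - 2.0 / np.pi) * np.cos(np.pi / (4.0 * beta)))
    reg = ~(zero | brk)
    x = t[reg] / T
    num = np.sin(np.pi * x * (1.0 - beta)) + 4.0 * beta * x * np.cos(np.pi * x * (1.0 + beta))
    den = np.pi * x * (1.0 - (4.0 * beta * x) ** 2)
    h[reg] = num / den / np.sqrt(T)
    return h

if __name__ == "__main__":
    # a larger roll-off decays faster in time (less residual ISI) at the cost
    # of excess bandwidth (1 + beta)
    t = np.arange(-8.0, 8.0 + 0.125, 0.125)
    for beta in (0.22, 0.9):
        h = rrc_impulse(t, beta)
        tail = np.max(np.abs(h[np.abs(t) > 3.0]))
        print(f"beta = {beta}: max tail amplitude beyond 3 symbols = {tail:.4f}")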
Procedia PDF Downloads 408
2570 Intelligent Software Architecture and Automatic Re-Architecting Based on Machine Learning
Authors: Gebremeskel Hagos Gebremedhin, Feng Chong, Heyan Huang
Abstract:
A software system is the combination of an architecture and organized components that accomplish a specific function or set of functions. A good software architecture facilitates application system development, promotes achievement of functional requirements, and supports system reconfiguration. We describe three studies demonstrating the utility of our architecture in the subdomain of mobile office robots and identify software engineering principles embodied in the architecture. The main aim of this paper is to analyze architecture design and automatic re-architecting using machine learning. The intelligent software architecture and automatic re-architecting process reorganizes the software organizational structure into a more suitable one, using the user access dataset to create relationships among the components of the system. A three-step data mining approach was used for effective recovery, transformation and implantation with the use of a clustering algorithm. Therefore, automatic re-architecting without changing the source code makes it possible to address the software complexity problem and to support software reuse.
Keywords: intelligence, software architecture, re-architecting, software reuse, high level design
Procedia PDF Downloads 120
2569 Automated Computer-Vision Analysis Pipeline of Calcium Imaging Neuronal Network Activity Data
Authors: David Oluigbo, Erik Hemberg, Nathan Shwatal, Wenqi Ding, Yin Yuan, Susanna Mierau
Abstract:
Introduction: Calcium imaging is an established technique in neuroscience research for detecting activity in neural networks. Bursts of action potentials in neurons lead to transient increases in intracellular calcium visualized with fluorescent indicators. Manual identification of cell bodies and their contours by experts typically takes 10-20 minutes per calcium imaging recording. Our aim, therefore, was to design an automated pipeline to facilitate and optimize calcium imaging data analysis. Our pipeline aims to accelerate cell body and contour identification and the production of graphical representations reflecting changes in neuronal calcium-based fluorescence. Methods: We created a Python-based pipeline that uses OpenCV (a computer vision Python package) to accurately (1) detect neuron contours, (2) extract the mean fluorescence within the contour, and (3) identify transient changes in the fluorescence due to neuronal activity. The pipeline consisted of 3 Python scripts that could all be easily accessed through a Python Jupyter notebook. In total, we tested this pipeline on ten separate calcium imaging datasets from murine dissociated cortical cultures. We next compared our automated pipeline outputs with the outputs of manually labeled data for neuronal cell location and the corresponding fluorescence time series generated by an expert neuroscientist. Results: Our results show that our automated pipeline efficiently pinpoints neuronal cell body location and neuronal contours and provides a graphical representation of neural network metrics accurately reflecting changes in neuronal calcium-based fluorescence. The pipeline detected the shape, area, and location of most neuronal cell body contours by using binary thresholding and grayscale image conversion to allow computer vision to better distinguish between cells and non-cells. Its results were also comparable to manually analyzed results but with significantly reduced result acquisition times of 2-5 minutes per recording versus 10-20 minutes per recording. Based on these findings, our next step is to precisely measure the specificity and sensitivity of the automated pipeline’s cell body and contour detection to extract more robust neural network metrics and dynamics. Conclusion: Our Python-based pipeline performed automated computer vision-based analysis of calcium image recordings from neuronal cell bodies in neuronal cell cultures. Our new goal is to improve cell body and contour detection to produce more robust, accurate neural network metrics and dynamic graphs.
Keywords: calcium imaging, computer vision, neural activity, neural networks
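To give a feel for the thresholding-and-contour workflow described in the Methods, the OpenCV sketch below detects ROIs on a projection image and extracts dF/F0 traces. The function name, threshold, minimum area, baseline definition and synthetic movie are all assumptions for illustration; this is not the authors' scripts.

import cv2
import numpy as np

def detect_rois_and_traces(frames, threshold=60, min_area=20):
    """Detect cell-body contours on a max projection by binary thresholding,
    then extract the mean fluorescence inside each contour for every frame and
    return dF/F0 traces.  `frames` is a (time, height, width) uint8 array."""
    projection = frames.max(axis=0)                      # max-intensity projection
    _, binary = cv2.threshold(projection, threshold, 255, cv2.THRESH_BINARY)
    # OpenCV 4.x: findContours returns (contours, hierarchy)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    traces = []
    for contour in contours:
        mask = np.zeros(projection.shape, dtype=np.uint8)
        cv2.drawContours(mask, [contour], -1, 255, thickness=-1)   # filled ROI
        trace = np.array([cv2.mean(frame, mask=mask)[0] for frame in frames])
        f0 = np.percentile(trace, 10)                    # baseline fluorescence
        traces.append((trace - f0) / max(f0, 1e-6))      # dF/F0
    return contours, traces

if __name__ == "__main__":
    # synthetic movie: one bright blinking "cell" on a dark background
    movie = np.zeros((50, 64, 64), dtype=np.uint8)
    movie[:, 20:30, 20:30] = 40
    movie[10:20, 20:30, 20:30] = 200                     # transient "event"
    rois, traces = detect_rois_and_traces(movie)
    print("ROIs found:", len(rois), " peak dF/F0:", round(float(traces[0].max()), 2))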
Procedia PDF Downloads 83
2568 Development of Variable Order Block Multistep Method for Solving Ordinary Differential Equations
Authors: Mohamed Suleiman, Zarina Bibi Ibrahim, Nor Ain Azeany, Khairil Iskandar Othman
Abstract:
In this paper, a class of variable order fully implicit multistep Block Backward Differentiation Formulas (VOBBDF) using uniform step size for the numerical solution of stiff ordinary differential equations (ODEs) is developed. The code combines three multistep block methods of order four, five and six. The order selection is based on an approximation of the local errors with a specific tolerance. These methods are constructed to produce two approximate solutions simultaneously at each iteration in order to further increase the efficiency. The proposed VOBBDF is validated through numerical results on some standard problems found in the literature, and comparisons are made with the single order Block Backward Differentiation Formula (BBDF). Numerical results show the advantage of using VOBBDF for solving ODEs.
Keywords: block backward differentiation formulas, uniform step size, ordinary differential equations
Procedia PDF Downloads 447
2567 Numerical Modeling of the Seismic Site Response in the Firenze Metropolitan Area
Authors: Najmeh Ayoqi, Emanuele Marchetti
Abstract:
OpenSWPC was used to model 2D and 3D seismic waveforms produced by various earthquakes in the Firenze metropolitan area. OpenSWPC is an open-source code for the simulation of seismic waves using the finite difference method (FDM) in a Message Passing Interface (MPI) environment. We considered both earthquake sources, with variable magnitude and location, and a pulse source in the modeling domain, which is optimal for simulating local seismic amplification effects. Multiple tests were performed to evaluate the dependence of the frequency content of the output modeled waveforms on the model grid size and time step. Moreover, the effects of the velocity structure and of the absorbing boundary condition on waveform features (amplitude, duration and frequency content) were analysed. Finally, model results are compared with real waveforms and the Horizontal-to-Vertical Spectral Ratio (HVSR), showing that seismic wave modeling can provide important information for seismic assessment in the city.
Keywords: OpenSWPC, earthquake, Firenze, HVSR, seismic wave
Procedia PDF Downloads 22
2566 Seismic Assessment of Old Existing RC Buildings In Madinah with Masonry Infilled Using Ambient Vibration Measurements
Authors: Tarek M. Alguhane, Ayman H. Khalil, Nour M. Fayed, Ayman M. Ismail
Abstract:
Early, pre-code, reinforced concrete structures present undetermined resistance to earthquakes. This situation is particularly unacceptable in the case of essential structures, such as healthcare structures and pilgrims' houses. Among these, an existing old RC building in Madinah is seismically evaluated with and without infill walls, and its dynamic characteristics are compared with values measured in the field using ambient vibration measurements (AVM). After updating the mathematical models for this building with the experimental results, a three-dimensional pushover analysis (nonlinear static analysis) was carried out using the SAP2000 software, incorporating inelastic material properties for concrete, infill and steel. The purpose of this analysis is to evaluate the expected performance of structural systems by estimating strength and deformation demands in design, and comparing these demands to the available capacities at the performance levels of interest. The results are summarized and discussed.
Keywords: seismic assessment, pushover analysis, ambient vibration, modal update
Procedia PDF Downloads 498
2565 Computer Simulation Studies of Spinel LiMn₂O₄ Nanotubes
Authors: D. M. Tshwane, R. R. Maphanga, P. E. Ngoepe
Abstract:
Nanostructured materials are attractive candidates for efficient electrochemical energy storage devices because of their unique physicochemical properties. Nanotubes have drawn continuous attention because of their unique electrical, optical and magnetic properties, in contrast to those of bulk systems. They have potential applications in the fields of optics, electronics and energy storage devices. Introducing nanotube structures as electrode materials represents one of the most attractive strategies that could dramatically enhance battery performance. Spinel LiMn2O4 is the most promising cathode material for Li-ion batteries. In this work, computer simulation methods are used to generate and investigate the properties of spinel LiMn2O4 nanotubes. Molecular dynamics simulation is used to probe the local structure of LiMn2O4 nanotubes and the effect of temperature on these systems. It is found that the diameter, Miller indices and size have a direct influence on nanotube morphology. Furthermore, it is noted that stability depends on the surface and the wrapping of the nanotube. The nanotube structures are described using the radial distribution function and XRD patterns. There is a correlation between the calculated XRD and experimentally reported results.
Keywords: LiMn2O4, li-ion batteries, nanotubes, nanostructures
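For context, the radial distribution function used above as a structural descriptor can be computed with a short, generic routine; the Python sketch below uses a random configuration in a periodic cubic box purely as a sanity check (g(r) ≈ 1), and is not the analysis applied to the LiMn2O4 trajectories.

import numpy as np

def radial_distribution_function(positions, box_length, r_max, n_bins=100):
    """g(r) for particles in a cubic box with periodic boundary conditions."""
    n = len(positions)
    rho = n / box_length ** 3                       # number density
    edges = np.linspace(0.0, r_max, n_bins + 1)
    counts = np.zeros(n_bins)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box_length * np.round(d / box_length)  # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        counts += np.histogram(r, bins=edges)[0]
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
    ideal = rho * shell_vol * n / 2.0               # expected pair counts for an ideal gas
    centers = 0.5 * (edges[1:] + edges[:-1])
    return centers, counts / ideal

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.uniform(0.0, 20.0, size=(500, 3))     # random (ideal-gas-like) configuration
    r, g = radial_distribution_function(pos, 20.0, 8.0)
    print("g(r) near r = 4:", round(float(g[np.argmin(np.abs(r - 4.0))]), 2))  # ~1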
Procedia PDF Downloads 190
2564 Minimizing Mutant Sets by Equivalence and Subsumption
Authors: Samia Alblwi, Amani Ayad
Abstract:
Mutation testing is the art of generating syntactic variations of a base program and checking whether a candidate test suite can identify all the mutants that are not semantically equivalent to the base: this technique is widely used by researchers to select quality test suites. One of the main obstacles to the widespread use of mutation testing is cost: even small programs (a few dozen lines of code) can give rise to a large number of mutants (up to hundreds): this has created an incentive to seek to reduce the number of mutants while preserving their collective effectiveness. Two criteria have been used to reduce the size of mutant sets: equivalence, which aims to partition the set of mutants into equivalence classes modulo semantic equivalence, and selecting one representative per class; subsumption, which aims to define a partial ordering among mutants that ranks mutants by effectiveness and seeks to select maximal elements in this ordering. In this paper we analyze these two policies using analytical and empirical criteria.
Keywords: mutation testing, mutant sets, mutant equivalence, mutant subsumption, mutant set minimization
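For illustration, both policies can be prototyped from a mutant-by-test kill matrix; the Python sketch below uses the common dynamic approximation (identical kill sets stand in for semantic equivalence, subset ordering of kill sets for subsumption) and entirely hypothetical mutants and tests, so it only illustrates the two ideas, not the paper's analysis.

def minimize_mutants(kill_matrix):
    """Toy mutant-set minimization from {mutant: set of tests that kill it}:
    mutants killed by exactly the same tests are grouped and one representative
    is kept; a mutant whose kill set strictly contains another mutant's
    non-empty kill set is easier to kill (subsumed) and is dropped, so only the
    hardest-to-kill (maximal) mutants remain."""
    by_killset = {}
    for mutant, tests in kill_matrix.items():
        by_killset.setdefault(frozenset(tests), []).append(mutant)
    representatives = {ks: muts[0] for ks, muts in by_killset.items() if ks}
    never_killed = [m for ks, muts in by_killset.items() if not ks for m in muts]
    kept = []
    for ks, mutant in representatives.items():
        subsumed = any(other < ks for other in representatives)
        if not subsumed:
            kept.append(mutant)
    return kept, never_killed

if __name__ == "__main__":
    # hypothetical kill matrix for 6 mutants and 4 tests
    kills = {
        "m1": {"t1", "t2"},
        "m2": {"t1", "t2"},          # equivalent to m1 under this test suite
        "m3": {"t1"},                # harder to kill: subsumes m1/m2 and m4
        "m4": {"t1", "t2", "t3"},
        "m5": {"t4"},
        "m6": set(),                 # killed by no test (possibly equivalent to the program)
    }
    minimized, live = minimize_mutants(kills)
    print("minimized set:", sorted(minimized))
    print("never killed:", live)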
Procedia PDF Downloads 64
2563 Behavior Study of Concrete-Filled Thin-Walled Square Hollow Steel Stub Columns
Authors: Mostefa Mimoune
Abstract:
Test results on concrete-filled steel tubular stub columns under axial compression are presented. The study was mainly focused on square hollow section (SHS) columns; 27 columns were tested. The main experimental parameters considered were the thickness of the tube, the column length and the cross-section size. Existing design codes and a theoretical model were used to predict the load-carrying capacities of the composite section and to compare the accuracy of the predictions obtained using the recommendations of DTR-BC (Algerian code), CSA (Canadian standard), AIJ, EC4, DBJ, AISC and BS. Experimental results indicate that the studied parameters have a significant influence on both the compressive load capacity and the column failure mode. All codes used in the comparison provide higher resistances compared to those of the tests. An equation method has been suggested to evaluate the axial capacity of the composite section and seems to be in agreement with the tests.
Keywords: axial loading, composite section, concrete-filled steel tubes, square hollow section
Procedia PDF Downloads 379
2562 When English Learners Speak “Non-Standard” English
Authors: Gloria Chen
Abstract:
In the past, when we complimented someone who had a good command of English, we would say ‘She/He speaks/writes standard English,’ or ‘His/Her English is standard.’ However, with English becoming a ‘global language,’ many scholars and English users even create a plural form for English, ‘world Englishes,’ which indicates that national/racial varieties of English not only exist but also are accepted to a certain degree. Now, a question is raised when it comes to English teaching and learning: ‘What variety/varieties of English should be taught?’ This presentation will first explore Braj Kachru’s well-known categorization of the inner circle, the outer circle, and the expanding circle of English users, as well as inner circle varieties such as ‘Ebonics’ and ‘cockney’. The presentation then will discuss the purposes and contexts of English learning and apply different approaches to different purposes and contexts. Three major purposes of English teaching/learning will be emphasized and considered: (1) communicative competence, (2) academic competence, and (3) intercultural competence. This presentation will conclude with the strategies of ‘code switch’ and ‘register switch’ in teaching English to non-standard English speakers in both speaking and writing.
Keywords: world Englishes, standard and non-standard English, inner, outer, expanded circle, communicative, academic, intercultural competence
Procedia PDF Downloads 265
2561 A Dynamic Software Product Line Approach to Self-Adaptive Genetic Algorithms
Authors: Abdelghani Alidra, Mohamed Tahar Kimour
Abstract:
Genetic algorithms must adapt themselves at design time to cope with the specific requirements of the search problem and at runtime to balance exploration and convergence objectives. In a previous article, we have shown that modeling and implementing Genetic Algorithms (GA) using the software product line (SPL) paradigm is very valuable because they constitute a product family sharing a common code base. In the present article we propose to extend the use of the feature model of the genetic algorithms family to model the potential states of the GA in what is called a Dynamic Software Product Line. The objective of this paper is the systematic generation of a reconfigurable architecture that supports the dynamics of the GA and which is easily deduced from the feature model. The resulting GA is able to perform dynamic reconfiguration autonomously to speed up the convergence process while producing better solutions. Another important advantage of our approach is the exploitation of recent advances in the domain of dynamic SPLs to enhance the performance of GAs.
Keywords: self-adaptive genetic algorithms, software engineering, dynamic software product lines, reconfigurable architecture
Procedia PDF Downloads 285
2560 Distributed Energy System - Microgrid Integration of Hybrid Power Systems
Authors: Pedro Esteban
Abstract:
Planning a hybrid power system (HPS) that integrates renewable generation sources, non-renewable generation sources and energy storage involves determining the capacity and size of the various components to be used in the system so that it can supply reliable electricity to the connected load as required. Nowadays it is very common to integrate solar photovoltaic (PV) power plants for renewable generation as part of an HPS. The solar PV system is usually balanced via a second form of generation (renewable, such as wind power, or using fossil fuels, such as a diesel generator) or an energy storage system (such as a battery bank). Hybrid power systems can also provide other forms of power, such as heat, for some applications. Modern hybrid power systems combine power generation and energy storage technologies together with real-time energy management and innovative power quality and energy efficiency improvement functionalities. These systems help customers achieve targets for clean energy generation, they add flexibility to the electrical grid, and they optimize the installation by improving its power quality and energy efficiency.
Keywords: microgrids, hybrid power systems, energy storage, grid code compliance
Procedia PDF Downloads 147
2559 Generic Hybrid Models for Two-Dimensional Ultrasonic Guided Wave Problems
Authors: Manoj Reghu, Prabhu Rajagopal, C. V. Krishnamurthy, Krishnan Balasubramaniam
Abstract:
A thorough understanding of guided ultrasonic wave behavior in structures is essential for the application of existing Non-Destructive Evaluation (NDE) technologies, as well as for the development of new methods. However, the analysis of guided wave phenomena is challenging because of their complex dispersive and multimodal nature. Although numerical solution procedures have proven to be very useful in this regard, the increasing complexity of the features and defects to be considered, as well as the desire to improve the accuracy of inspection, often imposes a large computational cost. Hybrid models that combine numerical solutions for wave scattering with faster alternative methods for wave propagation have long been considered as a solution to this problem. However, usually such models require modification of the base code of the solution procedure. Here we aim to develop Generic Hybrid models that can be directly applied to any two different solution procedures. With this goal in mind, a Numerical Hybrid model and an Analytical-Numerical Hybrid model have been developed. The concept and implementation of these Hybrid models are discussed in this paper.
Keywords: guided ultrasonic waves, Finite Element Method (FEM), Hybrid model
Procedia PDF Downloads 467
2558 Improved Processing Speed for Text Watermarking Algorithm in Color Images
Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari
Abstract:
Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods, and hence, it is adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case it is required by some applications), where the text can be encrypted using any enciphering technique, adding more difficulty for hackers. Experiments resulted in an embedding speed improvement of more than double the speed of other considered systems (such as the least significant bit method and separate color code methods), and a fairly acceptable level of peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
Keywords: steganography, watermarking, time complexity measurements, private keys
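For background, the RGB-to-YIQ conversion used for embedding and the PSNR metric quoted for quality can both be written in a few lines; the Python sketch below uses the standard NTSC transform matrix and a made-up cover image with synthetic embedding noise, not the paper's algorithm or data.

import numpy as np

# NTSC RGB -> YIQ transform matrix (luma Y, chrominance I and Q)
RGB_TO_YIQ = np.array([[0.299, 0.587, 0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523, 0.312]])

def rgb_to_yiq(rgb):
    """Convert an (H, W, 3) RGB image in [0, 1] to the YIQ color model."""
    return rgb @ RGB_TO_YIQ.T

def psnr(original, modified, peak=1.0):
    """Peak signal-to-noise ratio in dB: PSNR = 10 * log10(peak^2 / MSE)."""
    mse = np.mean((original.astype(float) - modified.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    image = rng.random((64, 64, 3))                             # stand-in cover image
    watermarked = np.clip(image + rng.normal(0.0, 0.01, image.shape), 0, 1)
    y = rgb_to_yiq(image)[..., 0]
    print("Y channel range:", round(float(y.min()), 3), "to", round(float(y.max()), 3))
    print("PSNR =", round(psnr(image, watermarked), 1), "dB")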
Procedia PDF Downloads 144
2557 Deployment of Information and Communication Technology (ICT) to Reduce Occurrences of Terrorism in Nigeria
Authors: Okike Benjamin
Abstract:
Terrorism is the use of violence and threats to intimidate or coerce a person, group, society or even a government, especially for political purposes. Terrorism may be a way of resisting government by some group who may feel marginalized. It could also be a way of expressing displeasure over the activities of government. On 26th December, 2009, the US listed Nigeria as a terrorist nation. Recently, the occurrences of terrorism in Nigeria have increased considerably. In Jos, Plateau State, Nigeria, there was a bomb blast which claimed many lives on the eve of Christmas 2010. Similarly, there was another bomb blast in the Mogadishu (Sani Abacha) Barracks mammy market on the eve of the 2011 New Year. For some time now, it is no longer news that bombs explode in some northern parts of Nigeria. About 25 years ago, stopping terrorism in America by the Americans relied on old-fashioned tools such as strict physical security at vulnerable places, intelligence gathering by government agents or individuals, vigilance on the part of all citizens, and a sense of community in which citizens do what could be done to protect each other. Just as technology has virtually been used to better the way many other things are done, so also this powerful new weapon called computer technology can be used to detect and prevent terrorism not only in Nigeria, but all over the world. This paper will examine the possible causes and effects of bomb blasts, which are acts of terrorism, and suggest ways in which Explosive Detection Devices (EDDs) and computer software technology could be deployed to reduce the occurrences of terrorism in Nigeria. This became necessary with the abduction of over 200 schoolgirls from their hostel in Chibok, Borno State, by members of the Boko Haram sect on 14th April, 2014. Presently, Barack Obama and other world leaders have sent some of their military personnel to help rescue those innocent schoolgirls, whose only offence is seeking to acquire western education, which the sect strongly believes is forbidden.
Keywords: terrorism, bomb blast, computer technology, explosive detection devices, Nigeria
Procedia PDF Downloads 263
2556 Vibration of Gamma Graphyne with an Attached Mass
Authors: Win-Jin Chang, Haw-Long Lee, Yu-Ching Yang
Abstract:
Atomic finite element simulation is applied to investigate the vibration frequency of a single-layer gamma graphyne with an attached mass for the CCCC, SSSS, CFCF, and SFSF boundary conditions using the commercial code ANSYS. The fundamental frequencies of the graphyne sheet are compared with the results of the previous study. The results of the comparison are very good in all considered cases. The attached mass causes a shift in the resonant frequency of the graphyne. The frequencies of the single-layer gamma graphyne with an attached mass for different boundary conditions are obtained, and the order based on the boundary condition is CCCC > SSSS > CFCF > SFSF. The highest frequency shift is obtained when the attached mass is located at the center of the graphyne sheet. This is useful for the design of a highly sensitive graphyne-based mass sensor.
Keywords: graphyne, finite element analysis, vibration analysis, frequency shift
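For context, the link between an attached mass and the downward shift of the resonant frequency can be illustrated with the single-degree-of-freedom relation f = (1/2π)√(k/m); the Python sketch below uses purely assumed frequency and mass values and is not derived from the paper's atomic finite element model.

import math

def frequency_with_mass(f0, m_eff, delta_m):
    """Resonant frequency after attaching a small mass: for constant stiffness,
    f = f0 * sqrt(m_eff / (m_eff + delta_m)), so a larger attached mass gives a
    larger downward frequency shift (the mass-sensing principle)."""
    return f0 * math.sqrt(m_eff / (m_eff + delta_m))

if __name__ == "__main__":
    f0 = 80.0e9          # assumed fundamental frequency, Hz
    m_eff = 1.0e-21      # assumed effective mass of the sheet, kg
    for delta_m in (1e-24, 1e-23, 1e-22):
        shift = f0 - frequency_with_mass(f0, m_eff, delta_m)
        print(f"attached mass {delta_m:.0e} kg -> frequency shift {shift / 1e9:.3f} GHz")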
Procedia PDF Downloads 212
2555 Wind Tunnel Tests on Ground-Mounted and Roof-Mounted Photovoltaic Array Systems
Authors: Chao-Yang Huang, Rwey-Hua Cherng, Chung-Lin Fu, Yuan-Lung Lo
Abstract:
Solar energy is one of the alternative choices to reduce the CO2 emissions produced by conventional power plants in modern society. As an island frequently visited by strong typhoons and earthquakes, Taiwan urgently needs to revise its local regulations to strengthen the safety design of photovoltaic systems. Currently, the Taiwanese code for wind resistant design of structures does not clearly address photovoltaic systems, especially when the systems are arranged in an array format. Furthermore, when the arrayed photovoltaic system is mounted on the rooftop, the approaching flow is significantly altered by the building, leading to different pressure patterns in different areas of the photovoltaic system. In this study, an L-shaped arrayed photovoltaic system is mounted on the ground of the wind tunnel and then mounted on the building rooftop. The system consists of 60 panel models. Each panel model is equivalent to a full size of 3.0 m in depth and 10.0 m in length. Six pressure taps are installed on the upper surface of the panel model and another six on the bottom surface to measure the net pressures. The wind attack angle is varied from 0° to 360° in 10° intervals to cover the worst case due to wind direction. The sampling rate of the pressure scanning system is set high enough to precisely estimate the peak pressure, and at least 20 samples are recorded for good ensemble average stability. Each sample is equivalent to a 10-minute duration in full scale. All the scale factors, including the time scale, length scale, and velocity scale, are properly verified by similarity rules in the low-wind-speed wind tunnel environment. The purpose of the L-shaped arrayed system is to understand the pressure characteristics in the corner area. Extreme value analysis is applied to obtain the design pressure coefficient for each net pressure. The commonly utilized Cook-and-Mayne coefficient, 78%, is set as the target non-exceedance probability for design pressure coefficients under the Gumbel distribution. The best linear unbiased estimator method is utilized for the Gumbel parameter identification. A careful moving-average method is also applied in data processing. Results show that when the arrayed photovoltaic system is mounted on the ground, the first row of the panels reveals stronger positive pressure than when it is mounted on the rooftop. Due to the flow separation occurring at the building edge, the first row of the panels on the rooftop is mostly under negative pressure; the last row, on the other hand, shows positive pressures because of the flow reattachment. Different areas also have different pressure patterns, which correspond well to the provisions in ASCE 7-16 describing the area division for design values. Several minor observations are found according to parametric studies, such as the rooftop edge effect, parapet effect, building aspect effect, row interval effect, and so on. General comments are then made for the proposed revision of the Taiwanese code.
Keywords: aerodynamic force coefficient, ground-mounted, roof-mounted, wind tunnel test, photovoltaic
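To illustrate the extreme value step, one can fit a Gumbel distribution to the recorded peak pressure coefficients and read off the value at the 78% non-exceedance probability. The Python sketch below uses a moment-based fit as a simple stand-in for the best linear unbiased estimator and entirely synthetic peak data, so it only illustrates the procedure, not the study's results.

import numpy as np

EULER_GAMMA = 0.5772156649

def gumbel_fit_moments(peaks):
    """Fit a Gumbel (Type I) distribution by the method of moments:
    scale = std * sqrt(6) / pi, location = mean - EULER_GAMMA * scale."""
    peaks = np.asarray(peaks, dtype=float)
    scale = peaks.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = peaks.mean() - EULER_GAMMA * scale
    return loc, scale

def design_coefficient(peaks, non_exceedance=0.78):
    """Design pressure coefficient at the Cook-and-Mayne target non-exceedance
    probability: x_p = loc - scale * ln(-ln(p))."""
    loc, scale = gumbel_fit_moments(peaks)
    return loc - scale * np.log(-np.log(non_exceedance))

if __name__ == "__main__":
    # 20 hypothetical peak net pressure coefficients from repeated wind tunnel runs
    rng = np.random.default_rng(7)
    peaks = rng.gumbel(loc=1.2, scale=0.2, size=20)
    print("design Cp ~", round(float(design_coefficient(peaks)), 2))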
Procedia PDF Downloads 139
2554 Sociolinguistics and Language Change
Authors: Banazzouz Halima
Abstract:
Throughout the ages, language has been viewed not only as a simple code for communicating information but also as the most powerful and versatile medium for maintaining relationships with other people. By the end of the 18th century, scientific investigation of human language began to take place under the scope of “Linguistics”, generally defined as the scientific study of language. Linguistics thus provides a growing body of scientific knowledge about language which can guide the activity of the language teacher and student as well. Moreover, as time passed, linguistics developed into a broadly practiced academic discipline with relationships to other sciences such as psychology, sociology and anthropology. Therefore, “Sociolinguistics” was born during the 1960s. The present abstract is mainly linguistic, falling under the scope of “Sociolinguistics”, and it highlights the processes of linguistic variation and language change to show that all languages change through time and that linguistic systems may vary from one speech community to another, provided there is a sense of vitality whereby people in different parts of the globe can mutually and intelligibly communicate with and comprehend each other.
Keywords: language change, sociolinguistics, social context, speech community, vitality of language, linguistic variation, urban dialectology
Procedia PDF Downloads 629