Search results for: computer code
2403 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics
Authors: Hongliang Zhang
Abstract:
The use of chance procedures and randomizers in poetry-writing can be traced back to surrealist works, which, by appealing to Sigmund Freud's theories, remained logocentric. In the 1960s, random permutation and combination were used extensively by the Oulipo, John Cage and Jackson Mac Low, further deconstructing the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext that is co-authored by its readers. At the same time, classical theories have been updated by cybernetics and media theories: N. Katherine Hayles reworked Jacques Lacan's concept of 'floating signifiers' into 'flickering signifiers', arguing that technology per se has become a part of textual production. This paper offers a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands the dual tasks of interpretation and writing over to readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but also raised the issue of the human-machine relationship. It is also a significant feature of the cybertext that the productive process of the text is full of randomness.
Keywords: cybertext, digital poetry, poetry generator, semiotics
Procedia PDF Downloads 175
2402 Identification of EEG Attention Level Using Empirical Mode Decompositions for BCI Applications
Authors: Chia-Ju Peng, Shih-Jui Chen
Abstract:
This paper proposes a method to discriminate between electroencephalogram (EEG) signals recorded in different concentration states using empirical mode decomposition (EMD). A brain-computer interface (BCI), also called a brain-machine interface, is a direct communication pathway between the brain and an external device that bypasses inherent pathways such as the peripheral nervous system or skeletal muscles. Attention level is a common index used as a control signal in BCI systems. EEG signals acquired from people paying attention or relaxing, respectively, are decomposed into a set of intrinsic mode functions (IMFs) by EMD. Fast Fourier transform (FFT) analysis is then applied to each IMF to obtain its frequency spectrum. By observing the power spectra of the IMFs, the proposed method discriminates EEG attention levels between concentration states better than the original EEG signals do. The band power of IMF3 is the most discriminative, especially in the β band, which corresponds to being fully awake and generally alert. The signal processing method and the results of this experiment pave a new way for BCI robotic systems using an attention-level control strategy. The integrated signal processing method reveals information appropriate for discriminating between attention and relaxation, contributing to enhanced BCI performance.
Keywords: biomedical engineering, brain computer interface, electroencephalography, rehabilitation
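A minimal sketch of the EMD-plus-FFT feature extraction described above, assuming the PyEMD package and a placeholder signal (the sampling rate and band edges are illustrative, not the authors' exact settings):

```python
import numpy as np
from PyEMD import EMD  # pip package "EMD-signal"

fs = 256.0                            # assumed sampling rate in Hz
eeg = np.random.randn(int(10 * fs))   # placeholder for one recorded EEG epoch

imfs = EMD()(eeg)                     # decompose into intrinsic mode functions

def band_power(x, lo, hi):
    """Power of x within [lo, hi) Hz from the FFT power spectrum."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
    return spec[(freqs >= lo) & (freqs < hi)].sum()

# Beta-band (roughly 13-30 Hz) power of IMF3, the feature the paper finds most separable
beta_imf3 = band_power(imfs[2], 13.0, 30.0)
```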
Procedia PDF Downloads 392
2401 A Novel Way to Create Qudit Quantum Error Correction Codes
Authors: Arun Moorthy
Abstract:
Quantum computing promises algorithmic speedups for a number of tasks; however, as in classical computing, effective error-correcting codes are needed. Current quantum computers require costly equipment to control each particle, so having fewer particles to control is ideal. Although traditional quantum computers are built using qubits (2-level systems), qudits (systems with more than two levels) are appealing since they can provide an equivalent computational space using fewer particles. Qudit quantum error-correction codes are currently available for systems of various levels; however, these codes sometimes have overly specific constraints. When building a qudit system, it is important for researchers to have access to many codes that satisfy their requirements. This project addresses two methods to increase the number of quantum error-correcting codes available to researchers. The first method is generating new codes for a given set of parameters. The second method is generating new error-correction codes by using existing codes as a starting point to generate codes for another level (e.g., deriving a 5-level system code from a 2-level system code). To this end, the project builds a website that researchers can use to generate new error-correction codes or codes based on existing ones.
Keywords: qudit, error correction, quantum, qubit
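The qudit generalization of the Pauli operators, which underlies most qudit stabilizer constructions, can be written down explicitly. A minimal numpy sketch (illustrative background, not the project's generator itself):

```python
import numpy as np

def qudit_paulis(d):
    """Shift and clock operators for a d-level system: X|j> = |j+1 mod d>, Z|j> = w^j |j>."""
    omega = np.exp(2j * np.pi / d)
    X = np.roll(np.eye(d), 1, axis=0)
    Z = np.diag(omega ** np.arange(d))
    return X, Z

X, Z = qudit_paulis(5)
assert np.allclose(Z @ X, np.exp(2j * np.pi / 5) * (X @ Z))  # commutation relation ZX = w XZ
```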
Procedia PDF Downloads 161
2400 Worldwide GIS Based Earthquake Information System/Alarming System for Microzonation/Liquefaction and Its Application for Infrastructure Development
Authors: Rajinder Kumar Gupta, Rajni Kant Agrawal, Jaganniwas
Abstract:
One of the most frightening natural phenomena is the occurrence of earthquakes, with their terrible and disastrous effects. Many earthquakes occur every day worldwide, and there is a need for knowledge of the trends in earthquake occurrence worldwide. The recording and interpretation of data obtained from the worldwide network of seismological stations make this possible. From the analysis of recorded earthquake data, earthquake parameters and source parameters can be computed and earthquake catalogues prepared. These catalogues provide information on origin time, epicenter location (in terms of latitude and longitude), focal depth, magnitude and other details of the recorded earthquakes, and they are used for seismic hazard estimation. Manual interpretation and analysis of these data are tedious and time-consuming. A geographical information system (GIS) is a computer-based system designed to store, analyze and display geographic information. The implementation of integrated GIS technology provides an approach that permits rapid evaluation of complex inventory databases under a variety of earthquake scenarios and allows the user to view results interactively, almost immediately. GIS technology provides a powerful tool for displaying outputs and permits users to see the graphical distribution of the impacts of different earthquake scenarios and assumptions. An endeavor has been made in the present study to compile earthquake data for the whole world in Visual Basic on the ArcGIS platform so that it can easily be used for further analysis by earthquake engineers. The basic data on time of occurrence, location and size of earthquakes have been compiled for querying based on various parameters. A preliminary analysis tool is also provided in the user interface to interpret earthquake recurrence in a region. The user interface also includes the seismic hazard information already worked out under the GSHAP program; the seismic hazard, in terms of probability of exceedance over definite return periods, is provided for the world. The seismic zones of the Indian region are included in the user interface from IS 1893-2002, the code for earthquake-resistant design of buildings. City-wise satellite images have been inserted into the map, and based on actual data the following information can be extracted in real time: • analysis of soil parameters and their effects • microzonation information • seismic hazard and strong ground motion • soil liquefaction and its effects in the surrounding area • impacts of liquefaction on buildings and infrastructure • occurrence of future earthquakes and their effect on existing soil • propagation of ground vibration due to the occurrence of an earthquake. The GIS-based earthquake information system has been prepared for the whole world in Visual Basic on the ArcGIS platform and further extended to the micro level based on actual soil parameters. Individual tools have been developed for liquefaction, earthquake frequency, etc. All of this information can be used in real time for present and future infrastructure development, i.e., multi-story structures, irrigation dams and their components, hydro-power plants, etc.
Keywords: GIS based earthquake information system, microzonation, analysis and real time information about liquefaction, infrastructure development
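A minimal sketch of the kind of parameter-based catalogue query the interface supports; the actual tool is Visual Basic on ArcGIS, so this pandas stand-in, its file name and its column names are all assumed:

```python
import pandas as pd

# Assumed catalogue columns: origin_time, latitude, longitude, depth_km, magnitude
catalogue = pd.read_csv("earthquake_catalogue.csv")

# Example query: strong, shallow events, the kind most relevant to liquefaction studies
strong_shallow = catalogue[(catalogue["magnitude"] >= 6.0) & (catalogue["depth_km"] <= 70.0)]
print(strong_shallow[["origin_time", "latitude", "longitude", "magnitude"]].head())
```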
Procedia PDF Downloads 317
2399 First-Principles Calculations and Thermo-Calc Study of the Elastic and Thermodynamic Properties of Ti-Nb-Zr-Ta Alloy for Biomedical Applications
Authors: M. Madigoe, R. Modiba
Abstract:
Highly alloyed beta (β) phase-stabilized titanium alloys are known to have a low elastic modulus comparable to that of human bone (≈30 GPa). The β phase in titanium alloys exhibits an elastic Young's modulus of about 60-80 GPa, nearly half that of the α phase (100-120 GPa). In this work, a theoretical investigation of the structural stability, thermodynamic stability and elastic properties of a quaternary Ti-Nb-Ta-Zr alloy is presented, with an attempt to lower Young's modulus. The structural stability and elastic properties of the alloy were evaluated using the first-principles approach within the density functional theory (DFT) framework implemented in the CASTEP code. The elastic properties include the bulk modulus B, elastic Young's modulus E, shear modulus c′ and Poisson's ratio ν. Thermodynamic stability, as well as the fraction of β phase in the alloy, was evaluated using the Thermo-Calc software package. Thermodynamic properties such as the Gibbs free energy (ΔG⁰) and enthalpy of formation are presented in addition to phase proportion diagrams. The stoichiometric compositions of the alloy are Ti-Nbx-Ta5-Zr5 (x = 5, 10, 20, 30, 40 at.%). An optimum alloy composition must satisfy the Born stability criteria and also possess a low elastic Young's modulus. In addition, the alloy must be thermodynamically stable, i.e., ΔG⁰ < 0.
Keywords: elastic modulus, phase proportion diagram, thermo-calc, titanium alloys
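For reference, the Born stability criteria for a cubic lattice and the standard relations linking the elastic constants to the moduli named above are (textbook forms, not specific to this paper's calibration; G denotes an isotropic shear modulus estimate):

```latex
\begin{align}
  C_{11}-C_{12} &> 0, \qquad C_{11}+2C_{12} > 0, \qquad C_{44} > 0,\\
  B &= \tfrac{1}{3}\left(C_{11}+2C_{12}\right), \qquad
  c' = \tfrac{1}{2}\left(C_{11}-C_{12}\right),\\
  E &= \frac{9BG}{3B+G}, \qquad
  \nu = \frac{3B-2G}{2\left(3B+G\right)}.
\end{align}
```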
Procedia PDF Downloads 187
2398 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea
Authors: Pavel Shcherban, Vlad Golovanov
Abstract:
Currently, active geological exploration and development of the shelf subsoil of the Kaliningrad region is underway. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from discovered deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality, namely its viscosity, density and fractional composition, as accurately as possible. For the works considered, gas chromatography is one of the most capacious methods, allowing rapid generation of a significant amount of initial data. The article considers aspects of applying the gas chromatography method to determine the chemical characteristics of the hydrocarbons of the Kaliningrad shelf fields, together with a correlation-regression analysis of these parameters in comparison with the previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets have been applied, making it possible to evaluate the identity of the deposits, to refine the amount of reserves and to make a number of assumptions about the genesis of the hydrocarbons under analysis.
Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography
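A minimal sketch of the correlation-regression step, assuming tabulated per-sample chromatography results (the file and column names are hypothetical):

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("gc_results.csv")        # assumed per-sample GC measurements

r = df["density"].corr(df["viscosity"])   # Pearson correlation coefficient
fit = stats.linregress(df["density"], df["viscosity"])
print(f"r = {r:.3f}, slope = {fit.slope:.3f}, intercept = {fit.intercept:.3f}")
```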
Procedia PDF Downloads 157
2397 Evaluation of Cyclic Thermo-Mechanical Responses of an Industrial Gas Turbine Rotor
Authors: Y. Rae, A. Benaarbia, J. Hughes, Wei Sun
Abstract:
This paper describes an elasto-visco-plastic computational modelling method that can be used to assess the cyclic plasticity responses of high temperature structures operating under thermo-mechanical loadings. The material constitutive equation used is an improved unified multi-axial Chaboche-Lemaitre model, which takes non-linear kinematic and isotropic hardening into account. The computational methodology is a three-dimensional framework following an implicit formulation and based on a radial return mapping algorithm. The associated user material (UMAT) code is developed and calibrated against isothermal hold-time low cycle fatigue tests on a typical turbine rotor steel for use in a finite element (FE) implementation. The model is applied to a realistic industrial gas turbine rotor, where the study focuses on the deformation heterogeneities and critical high-stress areas within the rotor structure. The potential improvements of such an FE visco-plastic approach are discussed. An integrated life assessment procedure based on R5 and visco-plasticity modelling is also briefly addressed.
Keywords: unified visco-plasticity, thermo-mechanical, turbine rotor, finite element modelling
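For orientation, the standard unified Chaboche-Lemaitre forms combine Norton-type viscoplastic flow with nonlinear kinematic and isotropic hardening (symbols follow common usage, not the authors' calibrated constants):

```latex
\begin{align}
  \dot{p} &= \left\langle \frac{J_2(\boldsymbol{\sigma}-\boldsymbol{\chi})-R-k}{Z} \right\rangle^{n},\\
  \dot{\boldsymbol{\chi}}_i &= \tfrac{2}{3}\,C_i\,\dot{\boldsymbol{\varepsilon}}^{p} - \gamma_i\,\boldsymbol{\chi}_i\,\dot{p}, \qquad
  \dot{R} = b\,(Q-R)\,\dot{p},
\end{align}
```

where χ is the back stress, R the isotropic hardening variable and ⟨·⟩ the Macaulay bracket.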
Procedia PDF Downloads 132
2396 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units
Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani
Abstract:
There are many computationally demanding applications in science and engineering that need efficient algorithms implemented on high-performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention compared to traditional CPU-based hardware and have opened up new improvement venues in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models on both CPU and GPU are discussed and compared. Test problems include an Eulerian model of a two-dimensional incompressible laminar flow case and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a serial C++ code is run on a single core of an Intel quad-core CPU. Up to two orders of magnitude of speed-up are observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited to parallel computation on the GPU, although the Eulerian formulation shows significant speed-up too.
Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation
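A minimal sketch of the kind of GPU offloading involved, using Numba's CUDA backend in Python as an illustrative stand-in for the paper's CUDA/C++ solvers (the Jacobi stencil and launch configuration are illustrative):

```python
import numpy as np
from numba import cuda

@cuda.jit
def jacobi_step(u, u_new):
    # One explicit Jacobi update of the 2-D Laplace equation at an interior point
    i, j = cuda.grid(2)
    if 1 <= i < u.shape[0] - 1 and 1 <= j < u.shape[1] - 1:
        u_new[i, j] = 0.25 * (u[i - 1, j] + u[i + 1, j] + u[i, j - 1] + u[i, j + 1])

u = cuda.to_device(np.zeros((512, 512)))
u_new = cuda.device_array_like(u)
jacobi_step[(32, 32), (16, 16)](u, u_new)   # 32x32 blocks of 16x16 threads cover the grid
```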
Procedia PDF Downloads 418
2395 Research Study on the Environmental Conditions in the Foreign
Authors: Vahid Bairami Rad, Shapoor Norazar, Moslem Talebi Asl
Abstract:
The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing and implementing innovative teaching methods in the classroom. Using teaching methods and technology together yields excellent results, because the global technological scenario has paved the way to new pedagogies in the teaching-learning process. On the other side, methods that focus on students and the ways they learn can demonstrate logical ways of improving student achievement in English as a foreign language in Iran. The sample of the study was 90 students of the 10th grade of a high school located in Ardebil. A pretest-posttest equivalent-group design was used to compare the achievement of the groups. The students were divided into three groups: control-based, computer-based, and method-and-technology-based. A pretest and a posttest, each containing 30 items from the English textbook, were developed and administered, and the obtained data were analyzed. The results showed a significant difference: the performance of the third group was better than that of the other groups. On the basis of this result, combining appropriate teaching methods with technology-based environments is recommended for improving teaching-learning outcomes.
Keywords: method, technology based environment, computer based environment, english as a foreign language, student achievement
Procedia PDF Downloads 474
2394 Functions of Bilingualism in Hong Kong: Comparing the Linguistic Landscape of Tsim Sha Tsui and Tai Wai
Authors: Xinyi Huang
Abstract:
As a former British colony and one of the most famous world financial centers today, Hong Kong attracts countless businesspeople and tourists to visit or settle down every year. Hong Kong is a land where western culture has blossomed in Asia while inheriting the unique charm of traditional Chinese culture. The Chinese-English bilingual phenomenon can be seen everywhere in Hong Kong, and the public presentation, code choice and practical use of these two languages can reflect the economic and social status, population distribution and individual identity construction of a specific area. This paper compares the linguistic landscape of two areas with different social functions in Hong Kong: Tsim Sha Tsui, a large commercial center in Kowloon, and Tai Wai, a residential area in the New Territories. Adopting the walking tour methodology, bilingual data from 75 photos were collected during field trips in the two areas. Through quantitative analysis and linguistic landscape studies, this paper analyzes in depth the similarities and differences in language distribution and the respective social functions of the two languages in the two places.
Keywords: bilingualism, linguistic landscape, identity construction, commodification
Procedia PDF Downloads 158
2393 Tracking Performance Evaluation of Robust Back-Stepping Control Design for a Nonlinear Electro-Hydraulic Servo System
Authors: Maria Ahmadnezhad, Mohammad Reza Soltanpour
Abstract:
Electro-hydraulic servo (EHS) systems have been used in industry in a wide range of applications. Their dynamics are highly nonlinear and also subject to a large extent of model uncertainties and external disturbances. In this thesis, a robust back-stepping control (RBSC) scheme is proposed to overcome the problem of disturbances and system uncertainties effectively and to improve the tracking performance of EHS systems. In order to implement the proposed control scheme, the system uncertainties in EHS systems are taken to be the total leakage coefficient and the effective oil volume. In addition, in order to obtain the virtual controls for stabilizing the system, the update rule for the system uncertainty term is derived from a Lyapunov control function (LCF). To verify the performance and robustness of the proposed control system, computer simulations were executed using Matlab/Simulink. The simulations show that the RBSC system produces the desired tracking performance and is robust to the disturbances and system uncertainties of EHS systems.
Keywords: electro hydraulic servo system, back-stepping control, robust back-stepping control, Lyapunov redesign
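For orientation, a textbook two-step backstepping design for a strict-feedback system (a generic sketch, not the specific EHS model of the paper) runs:

```latex
\begin{align}
  \dot{x}_1 &= x_2, \qquad \dot{x}_2 = f(x) + g(x)\,u,\\
  e_1 &= x_1 - x_d, \qquad \alpha_1 = \dot{x}_d - k_1 e_1, \qquad e_2 = x_2 - \alpha_1,\\
  V &= \tfrac{1}{2}e_1^2 + \tfrac{1}{2}e_2^2, \qquad
  u = \frac{1}{g(x)}\bigl(\dot{\alpha}_1 - f(x) - e_1 - k_2 e_2\bigr)
  \;\Rightarrow\; \dot{V} = -k_1 e_1^2 - k_2 e_2^2 \le 0.
\end{align}
```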
Procedia PDF Downloads 297
2392 The Role of Intermediaries in E-Government Adoption in India: Bridging the Digital Divide
Authors: Rajiv Kumar, Amit Sachan, Arindam Mukherjee
Abstract:
Despite the transparency and benefits of e-government, and its potential to serve citizens better, the diffusion and adoption of e-government services in India is low. Limited access to computers and the internet, a lack of computer and internet skills, low trust in technology, and the risk associated with using e-government services are major hindrances to e-government adoption in India. Although a large number of citizens belong to the non-adopter category, the government has made some services mandatory to access online, leaving citizens no other choice; also, despite the digital divide, a large number of citizens prefer online access to government services. In such cases, intermediaries like common service centers, internet cafés and service agents play a significant role in access to e-government services, and research is needed to explore this. The study aims to investigate the role of intermediaries in citizens' online access to public services. A qualitative research methodology using semi-structured interviews was employed. The results show that intermediaries play an important role in bridging the digital divide. The study also highlights the circumstances under which citizens take the help of these intermediaries. The study then notes its limitations and discusses the scope for future study.
Keywords: adoption, digital divide, e-government, India, intermediaries
Procedia PDF Downloads 298
2391 Algorithmic Generation of Carbon Nanochimneys
Authors: Sorin Muraru
Abstract:
Computational generation of carbon nanostructures is still a very demanding process. This work provides an alternative to manual molecular modeling through an algorithm meant to automate the design of such structures. Specifically, carbon nanochimneys are obtained by bonding a carbon nanotube to the smaller edge of an open carbon nanocone. The methods of connection rely on mathematical, geometrical and chemical properties. Non-hexagonal rings are used in order to correctly connect the dangling bonds. Once obtained, nanochimneys are useful for thermal transport, gas storage or other applications such as gas separation. The carbon nanochimney is meant to produce a less steep connection between structures such as the carbon nanotube and the graphene sheet, as in pillared graphene, but can also provide functionality on its own. The method relies on connecting dangling bonds at the edges of the two carbon nanostructures, employing two different types of auxiliary structures on a case-by-case basis. The code is implemented in Python 3.7 and generates an output file in the .pdb format containing all the system's coordinates.
Acknowledgment: This work was supported by a grant of the Executive Agency for Higher Education, Research, Development and Innovation Funding (UEFISCDI), project number PN-III-P1-1.1-TE-2016-24-2, contract TE 122/2018.
Keywords: carbon nanochimneys, computational, carbon nanotube, carbon nanocone, molecular modeling, carbon nanostructures
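A minimal sketch of emitting coordinates in the .pdb format the generator targets; the ATOM records below are simplified fixed-width approximations, and the coordinates are placeholders rather than output of the actual algorithm:

```python
def write_pdb(path, coords, element="C"):
    """Write (x, y, z) tuples as simplified ATOM records followed by END."""
    with open(path, "w") as f:
        for i, (x, y, z) in enumerate(coords, start=1):
            f.write(f"ATOM  {i:5d}  {element:<3s}MOL A   1    "
                    f"{x:8.3f}{y:8.3f}{z:8.3f}  1.00  0.00          {element:>2s}\n")
        f.write("END\n")

write_pdb("nanochimney.pdb", [(0.0, 0.0, 0.0), (1.42, 0.0, 0.0)])
```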
Procedia PDF Downloads 171
2390 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard
Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane
Abstract:
This paper introduces a high-performance architecture for a fully parallel, field programmable gate array (FPGA) based stochastic Low-Density Parity-Check (LDPC) decoder. The new approach is designed to decrease decoding latency and to reduce FPGA logic utilisation. To accomplish the targeted logic utilisation reduction, the routing of the proposed sub-variable node (VN) internal memory is designed to utilise one slice of distributed RAM. Furthermore, VN initialization using the channel input probability is performed to enhance decoder convergence, without extra resources and without integrating output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with a reduction in FPGA logic utilisation.
Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard
Procedia PDF Downloads 297
2389 Numerical Study for the Estimation of Hydrodynamic Current Drag Coefficients for the Colombian Navy Frigates Using Computational Fluid Dynamics
Authors: Mauricio Gracia, Luis Leal, Bharat Verma
Abstract:
Computational fluid dynamics (CFD) has nowadays become an important tool in the hydrodynamic design of modern ships. CFD is used to model any phenomenon related to fluid flow in a control volume, such as a ship or any offshore structure in the sea. In the present study, the current force drag coefficients for a Colombian Navy frigate in deep and shallow water are estimated through the application of CFD. The study shows the process of simulating the ship current drag coefficients using CFD, conducted with the STAR-CCM+ software package. The Almirante Padilla class frigate ship-scale model is investigated. The results show the ship current drag coefficient calculated for a current speed of 1 knot with a 90° drift angle for the full-scale ship. The predicted results were compared against the current drag coefficients published in the Lloyd's Register OCIMF report. It is shown that the simulation results agree fairly well with the published results and that the STAR-CCM+ code can predict current drag coefficients.
Keywords: CFD, current drag coefficient, STAR-CCM+, OCIMF, bollard pull
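For reference, the transverse current force is commonly non-dimensionalized in the OCIMF style as follows (symbols follow common usage, not necessarily the paper's notation):

```latex
\begin{equation}
  C_{Yc} = \frac{F_{Yc}}{\tfrac{1}{2}\,\rho\,V_c^{2}\,L_{BP}\,T},
\end{equation}
```

where ρ is the water density, V_c the current speed, L_BP the length between perpendiculars and T the draft.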
Procedia PDF Downloads 176
2388 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring video downloads at peak hours. The system consists of a web server belonging to a provider of video content, a provider of internet communications, and a software application running on a client's computer. The client, through the application software, communicates to the video provider a list of the client's future video demands. The video provider calculates which videos are going to be most in demand in the immediate future and requests from the internet provider the most optimal hours for downloading them. The download times are sent to the application software, which uses the pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user's hard disk, accessed only by the application software on the client's computer. When the client is ready to watch a video, the application searches the list of videos currently stored in that area of the hard disk; if the video exists there, it is played directly without the need for internet access. We found that the best way to optimize video download traffic is negotiation between the internet communication provider and the video content provider.
Keywords: internet optimization, video download, future demands, secure storage
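A minimal sketch of the negotiated-schedule idea: rank the predicted demands and assign each video to one of the pre-agreed off-peak slots (the slot list and demand figures are hypothetical):

```python
OFF_PEAK = ["02:00", "03:00", "04:00"]   # hours agreed with the internet provider

def schedule(demands):
    """Map the most-demanded videos to off-peak download slots, round-robin."""
    ranked = sorted(demands, key=demands.get, reverse=True)
    return {video: OFF_PEAK[i % len(OFF_PEAK)] for i, video in enumerate(ranked)}

print(schedule({"video_a": 120, "video_b": 45, "video_c": 300}))
```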
Procedia PDF Downloads 137
2387 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource-intensive during the inference step and take considerable time for processing. An algorithm should take considerably less than 41 milliseconds per frame in order to process a real-time video feed at 24 frames per second, and even less for a video at 30 or 60 frames per second. The paper presents a fast and efficient solution with two main advantages: it has the potential to be used on a real-time video feed, and it can be used in low-compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset was carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model and the processing times are discussed in detail, and the quality of the enhanced images using different methods is shown.
Keywords: low light image enhancement, real-time video, computer vision, machine learning
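A minimal numpy sketch of the three-step pipeline; the power-law brightening map, gray-world color balance and percentile contrast stretch are illustrative stand-ins for the paper's actual functions:

```python
import numpy as np

def enhance(img):
    """img: float RGB array in [0, 1], shape (H, W, 3)."""
    mapped = img ** 0.5                                             # step 1: brightening map
    balanced = mapped * (mapped.mean() / mapped.mean(axis=(0, 1)))  # step 2: gray-world balance
    lo, hi = np.percentile(balanced, (1, 99))                       # step 3: contrast stretch
    return np.clip((balanced - lo) / (hi - lo), 0.0, 1.0)
```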
Procedia PDF Downloads 208
2386 Fast Adjustable Threshold for Uniform Neural Network Quantization
Authors: Alexander Goncharenko, Andrey Denisov, Sergey Alyamkin, Evgeny Terentev
Abstract:
Neural network quantization is a highly desirable procedure to perform before running neural networks on mobile devices. Quantization without fine-tuning leads to an accuracy drop in the model, whereas commonly used training with quantization is done on the full set of labeled data and is therefore both time- and resource-consuming. Real-life applications require a simplified and accelerated quantization procedure that maintains the accuracy of the full-precision neural network, especially for modern mobile architectures like MobileNet-v1, MobileNet-v2 and MNAS. Here we present a method to significantly optimize the training-with-quantization procedure by introducing trained scale factors for the discretization thresholds, separate for each filter. Using the proposed technique, we quantize modern mobile neural network architectures with a training set of only ∼10% of the total ImageNet 2012 sample. Such a reduction in training dataset size, together with the small number of trainable parameters, allows the network to be fine-tuned within several hours while maintaining the high accuracy of the quantized model (the accuracy drop was less than 0.5%). Ready-for-use models and code are available in the GitHub repository.
Keywords: distillation, machine learning, neural networks, quantization
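A minimal PyTorch sketch of uniform fake-quantization with one scale per output filter; only the forward pass and a max-based initialization are shown, not the paper's recipe for training the scales:

```python
import torch

def fake_quantize(w, scale, bits=8):
    """Quantize-dequantize weights w (out_ch, ...) with a per-filter scale."""
    qmax = 2 ** (bits - 1) - 1
    s = scale.view(-1, *([1] * (w.dim() - 1)))        # broadcast scale over each filter
    q = torch.clamp(torch.round(w / s), -qmax - 1, qmax)
    return q * s

w = torch.randn(16, 3, 3, 3)
scale = w.abs().amax(dim=(1, 2, 3)) / 127             # initial per-filter thresholds
w_q = fake_quantize(w, scale)
```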
Procedia PDF Downloads 328
2385 A Novel Breast Cancer Detection Algorithm Using Point Region Growing Segmentation and Pseudo-Zernike Moments
Authors: Aileen F. Wang
Abstract:
Mammography has been one of the most reliable methods for the early detection and diagnosis of breast cancer. However, mammography misses about 17%, and up to 30%, of breast cancers due to the subtle and unstable appearance of breast cancer in its early stages. Recent computer-aided diagnosis (CADx) technology using Zernike moments has improved detection accuracy. However, it has several drawbacks: it uses manual segmentation, Zernike moments are not robust, and it still has a relatively high false negative rate (FNR) of 17.6%. This project focuses on the development of a novel breast cancer detection algorithm that automatically segments the breast mass and further reduces the FNR. The algorithm consists of automatic segmentation of a single breast mass using point region growing segmentation, reconstruction of the segmented breast mass using pseudo-Zernike moments, and classification of the breast mass using the root mean square (RMS). A comparative study of the various algorithms for the segmentation and reconstruction of breast masses was performed on randomly selected mammographic images. The results demonstrated that the newly developed algorithm is the best in terms of accuracy and cost-effectiveness. More importantly, the new RMS classifier has the lowest FNR, at 6%.
Keywords: computer aided diagnosis, mammography, point region growing segmentation, pseudo-Zernike moments, root mean square
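A minimal sketch of point region growing from a seed pixel, using 4-connectivity and a fixed intensity tolerance (both parameters are illustrative):

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol=10.0):
    """Grow a boolean mask from seed=(row, col) over pixels within tol of the seed intensity."""
    mask = np.zeros(img.shape, dtype=bool)
    queue, ref = deque([seed]), float(img[seed])
    while queue:
        y, x = queue.popleft()
        if mask[y, x] or abs(float(img[y, x]) - ref) > tol:
            continue
        mask[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1] and not mask[ny, nx]:
                queue.append((ny, nx))
    return mask
```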
Procedia PDF Downloads 453
2384 Digital Privacy Legislation Awareness
Authors: Henry Foulds, Magda Huisman, Gunther R. Drevin
Abstract:
Privacy is regarded as a fundamental human right, and it is clear that the study of digital privacy is an important field. Digital privacy is influenced by new and constantly evolving technologies, and this continuous change makes it hard to create legislation that protects people's privacy from being exploited through the misuse of these technologies.
This study aims to benefit digital privacy legislation efforts by evaluating the awareness and perceived importance of digital privacy legislation among computer science students. The chosen fixed variables for the population are study year and gamer classification.
The use of location-based services in mobile applications and games is a concern for digital privacy. For this reason, the study focused on computer science students, as they have a high likelihood of using and developing this type of software. Surveys were used to evaluate awareness and perceived importance of digital privacy legislation.
The results of the study show that privacy legislation and awareness of privacy legislation are important to people. The perceived importance of privacy legislation increases with academic experience, and awareness of privacy legislation increases from non-gamers to pro gamers.
Keywords: digital privacy, legislation awareness, gaming, privacy legislation
Procedia PDF Downloads 356
2383 Analysis of Thermal Comfort in Educational Buildings Using Computer Simulation: A Case Study in Federal University of Parana, Brazil
Authors: Ana Julia C. Kfouri
Abstract:
A prerequisite of any building design is to provide security to its users, taking the climate and its physical and physical-geometrical variables into account. It is also important to highlight the relevance of the right material elements, which stand between the person and the environment and must provide improved thermal comfort conditions and low environmental impact. Furthermore, technology is constantly advancing, as are computational simulations for building projects, and these should be used to develop sustainable buildings and to provide a higher quality of life for their users. In relation to comfort, the more satisfied the building users are, the better their intellectual performance will be. On that basis, the study of thermal comfort in educational buildings is particularly relevant, since the thermal characteristics of these environments are of vital importance to all users. Moreover, educational buildings are large constructions, and when they are poorly planned and executed, they have negative impacts on the surrounding environment and on user satisfaction throughout their whole life cycle. In this line of thought, a detailed case study of the thermal comfort situation at the Federal University of Parana (UFPR) was carried out to evaluate university classroom conditions. The main goal of the study is to perform a thermal analysis of three classrooms at UFPR in order to address the subjective and physical variables that influence thermal comfort inside the classroom. For the assessment of the subjective components, a questionnaire was applied in order to evaluate the users' perception of the local thermal conditions. Regarding the physical variables, on-site measurements were carried out, consisting of measurements of air temperature and air humidity, both inside and outside the building, as well as meteorological variables, such as wind speed and direction, solar radiation and rainfall, collected from a weather station. Then, a computer simulation based on the EnergyPlus software was conducted to reproduce the air temperature and air humidity values of the three classrooms studied. The EnergyPlus outputs were analyzed and compared with the on-site measurement results to reach a conclusion about the local thermal conditions. The methodological approach of the study allowed a distinct perspective on an educational building and a better understanding of classroom thermal performance and its causes. Finally, the study induces a reflection on the importance of thermal comfort for educational buildings and proposes thermal alternatives for future projects, as well as a discussion of the significant impact of using computer simulation in engineering solutions to improve the thermal performance of UFPR's buildings.
Keywords: computer simulation, educational buildings, EnergyPlus, humidity, temperature, thermal comfort
Procedia PDF Downloads 388
2382 A Case Study Report on Acoustic Impact Assessment and Mitigation of the Hyprob Research Plant
Authors: D. Bianco, A. Sollazzo, M. Barbarino, G. Elia, A. Smoraldi, N. Favaloro
Abstract:
The activities described in the present paper have been conducted in the framework of the HYPROB-New Program, carried out by the Italian Aerospace Research Centre (CIRA) and promoted and funded by the Italian Ministry of University and Research (MIUR) in order to improve the national background on rocket engine systems for space applications. The Program has the strategic objective of improving national system and technology capabilities in the field of liquid rocket engines (LRE) for future space propulsion systems, with specific regard to LOX/LCH4 technology. The main purpose of the HYPROB program is to design and build a propulsion test facility (HIMP) allowing test activities on liquid thrusters. The development of skills in liquid rocket propulsion can only pass through extensive test campaigns. Following its mission, CIRA has planned the development of new testing facilities and infrastructures for space propulsion, characterized by adequate sizes and instrumentation. The IMP test cell is devoted to testing articles representative of small combustion chambers, fed with oxygen and methane, both in liquid and gaseous phase. This article describes the activities carried out for the evaluation of the acoustic impact and its consequent mitigation. The impact of the simulated acoustic disturbance has been evaluated, first, using an approximate method based on experimental data by Baumann and Coney, included in 'Noise and Vibration Control Engineering' edited by Vér and Beranek. This methodology, used to evaluate the free-field radiation of a jet in an ideal acoustical medium, analyzes the jet noise in detail and assumes the sources act at the same time; it considers the jet mixing noise, caused by the turbulent mixing of the jet gas and the ambient medium, as the principal radiation source. Empirical models allowing a direct calculation of the sound pressure level are commonly used for rocket noise simulation, and the model named after K. Eldred is probably one of the most exploited in this area. In this paper, an improvement of the Eldred standard model has been used for a detailed investigation of the acoustic impact of the HYPROB facility. This new formulation contains an explicit expression for the acoustic pressure of each equivalent noise source, in terms of amplitude and phase, allowing the investigation of source correlation effects and their propagation through wave equations. In order to enhance the evaluation of the facility's acoustic impact, including an assessment of the mitigation strategies to be put in place, a more advanced simulation campaign has been conducted using both an in-house code for noise propagation and scattering and a commercial code for industrial environmental noise impact, CadnaA. The noise prediction obtained with the revised Eldred-based model has then been used to formulate an empirical/BEM (boundary element method) hybrid approach allowing the evaluation of the barrier mitigation effect at the design stage. This approach has been compared with the analogous empirical/ray-acoustics approach implemented within CadnaA using a customized definition of sources and directivity factors. The resulting impact evaluation study is reported here, along with the design-level barrier optimization for noise mitigation.
Keywords: acoustic impact, industrial noise, mitigation, rocket noise
Procedia PDF Downloads 148
2381 Blogging vs Paper-and-Pencil Writing: Evidence from an Iranian Academic L2 Setting
Authors: Mehran Memari, Bita Asadi
Abstract:
Second language (L2) classrooms in academic contexts usually consist of learners with diverse L2 proficiency levels. One solution for managing such heterogeneous classes and addressing the individual needs of students is to improve learner autonomy by using technological innovations such as blogging. The focus of this study is on investigating the effects of blogging on improving the quality of Iranian university students' writing. To this end, twenty-six Iranian university students participated in the study. Students in the experimental group (n=13) were required to blog daily, while students in the control group (n=13) were asked to write a daily schedule using paper and pencil. After a 3-month period of instruction, the last five writings of the students in both groups were rated by two experienced raters. Also, students' attitudes toward the traditional method and blogging were surveyed using a questionnaire and a semi-structured interview. The research results showed evidence in favor of the students who used blogging in their writing program. Also, although students in the experimental group found blogging more demanding than the traditional method, they showed an overall positive attitude toward the use of blogging as a way of improving their writing skills. The findings of the study have implications for the incorporation of computer-assisted learning in L2 academic contexts.
Keywords: blogging, computer-assisted learning, paper and pencil, writing
Procedia PDF Downloads 403
2380 Investigating Teachers' Perceptions about the Use of Technology in Second Language Learning at Universities in Pakistan
Authors: Nadir Ali Mugheri
Abstract:
This study explored the perceptions of English language teaching (ELT) teachers regarding the use of technology in learning English as a second language (L2) at universities in Pakistan. In this regard, 200 ELT teachers from 80 leading universities were selected through a judgmental sampling method. The results established that most of the teachers supported the integration and incorporation of technology in the language classroom so as to teach the L2 in an effective and efficient way. The study found that the teachers regarded the use of technology in learning English as a second language (ESL) as a positive step towards enhancing the learning capabilities and improving the personal traits of the learners. The findings suggest that the integration of technology in language learning makes learners active and enthusiastic within the classroom, and that teachers need to be equipped with the latest knowledge of mobile-assisted language learning (MALL) and computer-assisted language learning (CALL) so that they may ensure the use of this innovative technology in their teaching practices. The results also indicated that technology has proved to be a stimulus for improving language in the ELT milieu, and that its use helps teachers develop themselves professionally. The study found that there are many determinants that make teaching and learning within the classroom efficacious, the use of technology being one of them. Data were collected through a qualitative design in order to obtain a complete depiction: semi-structured interviews were conducted and analyzed through thematic analysis.
Keywords: english language teaching, computer assisted language learning, use of technology, thematic analysis
Procedia PDF Downloads 71
2379 Component Interface Formalization in Robotic Systems
Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers
Abstract:
Components are heavily used in many software systems, including robotic systems. The growth in sophistication and diversity of new capabilities for robotic systems presents new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize their interfaces. A formal approach is desperately needed to specify what the interface of a robotic software component should contain. This study analyzes the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and ArduPilot, and it is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or domain-specific languages (DSLs) developed with tools such as ANTLR, together with automatic code and configuration file generation for frameworks such as ROS and PX4. A case study with ROS2 is presented as a proof of concept for the proposed methodology.
Keywords: CPS, robots, software architecture, interface, ROS, autopilot
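A minimal sketch of the kind of machine-readable interface descriptor such an approach calls for; the field names are hypothetical, not a published schema, though the message types shown are standard ROS types:

```python
import json

descriptor = {
    "component": "altitude_estimator",
    "version": "1.2.0",
    "publishes": [{"topic": "/altitude", "type": "std_msgs/Float64", "rate_hz": 50}],
    "subscribes": [{"topic": "/imu", "type": "sensor_msgs/Imu"}],
    "parameters": [{"name": "filter_gain", "type": "float", "default": 0.98}],
}
print(json.dumps(descriptor, indent=2))  # serialized form a generator or validator would consume
```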
Procedia PDF Downloads 92
2378 Influence of Shield Positions on Thermo/Fluid Performance of Pin Fin Heat Sink
Authors: Ramy H. Mohammed
Abstract:
In heat sinks, the flow within the core exhibits separation and hence does not lend itself to simple analytical boundary-layer or duct-flow analysis of the wall friction. In this paper, I present findings from an experimental and numerical study aimed at obtaining physical insight into the influence of the presence and position of a shield on the hydraulic and thermal performance of a square pin-fin heat sink without top bypass. The variations of the Nusselt number and friction factor are obtained under varied parameters, such as the Reynolds number and the shield position. The numerical code is validated by comparing the numerical results with the available experimental data, and the comparison shows good agreement between the temperature predictions of the model and the experimental data. The results show that in the presence of the shield, the heat transfer of the fin array is enhanced while the flow resistance increases. The surface temperature distribution of the heat sink base is more uniform when the dimensionless shield position equals 1/3 or 2/3. A comprehensive performance evaluation approach based on an identical-pumping-power criterion is adopted and shows that the optimum shield position is at x/l = 0.43, where energy is saved.
Keywords: shield, fin array, performance evaluation, heat transfer, energy
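A standard form of such an identical-pumping-power criterion is the ratio below (a common textbook definition; the paper's exact formulation may differ), where the subscript 0 denotes the unshielded reference case:

```latex
\begin{equation}
  \mathrm{PEC} = \frac{Nu/Nu_0}{\left(f/f_0\right)^{1/3}}.
\end{equation}
```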
Procedia PDF Downloads 307
2377 A Study on Golden Ratio (φ) and Its Implications on Seismic Design Using ETABS
Authors: Vishal A. S. Salelkar, Sumitra S. Kandolkar
Abstract:
The golden ratio (φ), also referred to as the golden mean or golden section, is a proportion often used by architects when conceiving the aesthetics of a structure. The golden ratio is an irrational number that can be rounded to approximately 1.618 and is derived from the quadratic equation x² - x - 1 = 0. Its use can be observed throughout history, as far back as the ancient Egyptians, and it peaked during the Greek golden age. This design technique remains very much prevalent: at present, architects around the world prefer it as one of the primary techniques for deciding aesthetics. In this study, an analysis has been performed to investigate whether using the golden ratio while planning a structure has any effect on its seismic behavior. The structure is modeled and analyzed in ETABS (by Computers and Structures, Inc.) for seismic requirements equivalent to Zone III (region: Goa, India) as per the Indian Standard code IS 1893. The results were compared to those of an identical structure modeled along the lines of the normal design philosophy, without the golden ratio tools, in terms of story shear, story drift, and story displacement readings. An improvement in performance, although slight, was observed. Similar improvements were also observed in subsequent iterations performed using time-acceleration data of previous major earthquakes matched to Zone III as per IS 1893.
Keywords: ETABS, golden ratio, seismic design, structural behavior
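The ratio follows directly from the quadratic cited in the abstract:

```latex
\begin{equation}
  x^{2}-x-1=0 \;\Rightarrow\; \phi=\frac{1+\sqrt{5}}{2}\approx 1.618\,.
\end{equation}
```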
Procedia PDF Downloads 184
2376 Multi-Agent System for Irrigation Using Fuzzy Logic Algorithm and Open Platform Communication Data Access
Authors: T. Wanyama, B. Far
Abstract:
Automatic irrigation systems conveniently protect landscape investment. While conventional irrigation systems are known to be inefficient, automated ones have the potential to optimize water usage. In fact, there is a new generation of irrigation systems that are smart in the sense that they monitor the weather, soil conditions, evaporation and plant water use, and automatically adjust the irrigation schedule. In this paper, we present an agent-based smart irrigation system. The agents are built using a mix of commercial off-the-shelf software, including MATLAB, Microsoft Excel and the KEPServerEX 5 OPC server, together with custom-written code. The Irrigation Scheduler Agent uses fuzzy logic to integrate the information that affects the irrigation schedule. In addition, the multi-agent system uses Open Platform Communications (OPC) technology to share data. OPC technology enables the Irrigation Scheduler Agent to communicate over the Internet, making the system scalable to a municipal or regional agent-based water monitoring, management and optimization system. Finally, this paper presents simulation and pilot installation test results that show the operational effectiveness of our system.
Keywords: community water usage, fuzzy logic, irrigation, multi-agent system
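A minimal sketch of the fuzzy scheduling idea with triangular memberships; the membership shapes, the single rule and the crisp output scaling are illustrative, not the system's actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership of x over the support (a, b, c)."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def irrigation_minutes(soil_moisture_pct, temperature_c):
    dry = max(min((40.0 - soil_moisture_pct) / 40.0, 1.0), 0.0)  # falling ramp: drier -> closer to 1
    hot = tri(temperature_c, 25.0, 40.0, 55.0)
    strength = min(dry, hot)          # rule: IF soil dry AND weather hot THEN irrigate longer
    return 10.0 + 50.0 * strength     # crisp duration scaled by rule strength

print(irrigation_minutes(soil_moisture_pct=20, temperature_c=35))  # -> 35.0 minutes
```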
Procedia PDF Downloads 298
2375 A Comprehensive Approach to Mitigate Return-Oriented Programming Attacks: Combining Operating System Protection Mechanisms and Hardware-Assisted Techniques
Authors: Zhang Xingnan, Huang Jingjia, Feng Yue, Burra Venkata Durga Kumar
Abstract:
This paper proposes a comprehensive approach to mitigating ROP (return-oriented programming) attacks by combining internal operating system protection mechanisms with hardware-assisted techniques. Through an extensive literature review, we identify the effectiveness of ASLR (Address Space Layout Randomization) and the LBR (Last Branch Record) in preventing ROP attacks. We present a process involving buffer overflow detection, hardware-assisted ROP attack detection, and the use of Turing detection technology to monitor control-flow behavior. We envision a specialized tool that reads and analyzes the last branch record, compares the observed control flow with a baseline, and reports the differences in natural language. The tool offers a graphical interface, facilitating the prevention and detection of ROP attacks. The proposed approach and tool provide practical solutions for enhancing software security.
Keywords: operating system, ROP attacks, return-oriented programming attacks, ASLR, LBR, CFI, DEP, code randomization, hardware-assisted CFI
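A minimal sketch of the baseline-comparison step: flag recorded branch pairs absent from a whitelist of legitimate control-flow edges (the record format and the baseline set are hypothetical):

```python
BASELINE = {(0x401000, 0x401200), (0x401200, 0x401400)}  # assumed legitimate CFG edges

def check_lbr(records):
    """Return (source, destination) branch pairs in the LBR trace not present in the baseline."""
    return [(src, dst) for src, dst in records if (src, dst) not in BASELINE]

for src, dst in check_lbr([(0x401000, 0x401200), (0x402a10, 0x401337)]):
    print(f"unexpected branch: {src:#x} -> {dst:#x}")
```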
Procedia PDF Downloads 95
2374 Analysis of the Interventions Performed in Pediatric Cardiology Unit Based on Nursing Interventions Classification (NIC-6th): A Pilot Study
Authors: Ji Wen Sun, Nan Ping Shen, Yi Bei Wu
Abstract:
This study used the Nursing Interventions Classification (NIC-6th) to identify the interventions performed in a pediatric cardiology unit and then analyzed their frequency, time and difficulty, so as to give a brief review of what our nurses have done. The research team selected a 35-bed pediatric cardiology unit and drew all the nursing interventions in the nursing records from our hospital information system (HIS) from 1 October 2015 to 30 November 2015, using NIC-6th to do the matching and then counting their frequencies. Each intervention was then given its own time and difficulty code according to NIC-6th. The results showed that the nurses in the pediatric cardiology unit performed a total of 43 interventions across 5394 record statements, most of which required only an RN (basic) education level and less than 15 minutes. Some interventions needed only a nursing assistant but were performed by nurses, which should prompt nurse managers to think about suitable staffing. Thus, summing the product of frequency, time and difficulty over each nurse's interventions can characterize that nurse's performance.
Acknowledgement: Clinical Management Optimization Project of Shanghai Shen Kang Hospital Development Center (SHDC2014615); Hundred-Talent Program of Construction of Nursing Plateau Discipline (hlgy16073qnhb).
Keywords: nursing interventions, nursing interventions classification, nursing record, pediatric cardiology
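One way to read the abstract's summary measure is as a per-nurse score summing frequency times time times difficulty over the matched interventions (the notation is assumed, not taken from the paper):

```latex
\begin{equation}
  S_{\text{nurse}} = \sum_{i=1}^{n} f_i\, t_i\, d_i\,.
\end{equation}
```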
Procedia PDF Downloads 365