Search results for: real time model

18637 Bidirectional Long Short-Term Memory-Based Signal Detection for Orthogonal Frequency Division Multiplexing With All Index Modulation

Authors: Mahmut Yildirim

Abstract:

This paper proposes bidirectional long short-term memory (Bi-LSTM) network-aided, deep learning (DL)-based signal detection for orthogonal frequency division multiplexing with all index modulation (OFDM-AIM), named Bi-DeepAIM. OFDM-AIM was developed to increase the spectral efficiency of OFDM with index modulation (OFDM-IM), a promising multi-carrier technique for communication systems beyond 5G. In this paper, owing to its strong classification ability, Bi-LSTM is considered as an alternative to the maximum likelihood (ML) algorithm used for signal detection in the classical OFDM-AIM scheme. The performance of Bi-DeepAIM is compared with LSTM network-aided DL-based OFDM-AIM (DeepAIM) and with classical OFDM-AIM using ML-based signal detection, in terms of bit error rate (BER) and computation time. Simulation results show that Bi-DeepAIM achieves better BER performance than DeepAIM and lower signal-detection computation time than the ML-based scheme.
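
For illustration, a minimal sketch of the kind of Bi-LSTM detector described is given below; it is not the authors' exact network, and the subblock length, hidden size, and number of classes are assumed placeholders.

```python
# Illustrative sketch (not the paper's exact network): a bidirectional LSTM that
# maps the received symbols of one OFDM-AIM subblock to a class index, i.e.
# signal detection treated as classification. All dimensions are assumptions.
import torch
import torch.nn as nn

class BiLSTMDetector(nn.Module):
    def __init__(self, subblock_len=4, hidden_size=64, num_classes=16):
        super().__init__()
        # Each received subcarrier is fed as [real, imag] -> input size 2.
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden_size,
                            batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden_size, num_classes)

    def forward(self, y):
        # y: (batch, subblock_len, 2) received symbols after the channel
        out, _ = self.lstm(y)
        # Classify from the last time step of the forward/backward outputs.
        return self.fc(out[:, -1, :])

# Training would minimize cross-entropy against the transmitted subblock index,
# replacing the exhaustive ML search at detection time.
model = BiLSTMDetector()
logits = model(torch.randn(8, 4, 2))   # dummy batch of 8 subblocks
print(logits.shape)                    # torch.Size([8, 16])
```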

Keywords: bidirectional long short-term memory, deep learning, maximum likelihood, OFDM with all index modulation, signal detection

Procedia PDF Downloads 62
18636 Modeling and Simulation of Textile Effluent Treatment Using Ultrafiltration Membrane Technology

Authors: Samia Rabet, Rachida Chemini, Gerhard Schäfer, Farid Aiouache

Abstract:

The textile industry generates large quantities of wastewater, which poses significant environmental problems due to its complex composition and high pollutant load, consisting principally of heavy metals, large amounts of chemical oxygen demand (COD), and dyes. Separation methods are known for their effectiveness in removing contaminants, and membrane separation techniques in particular are a promising process for the treatment of textile effluent due to their versatility, efficiency, and low energy requirements. This study focuses on the modeling and simulation of membrane separation technologies with a cross-flow filtration process for textile effluent treatment. It explores the application of mathematical models and computational simulations, using Aspen Plus software, to predict the separation of a complex, real effluent. The results demonstrate the effectiveness of modeling and simulation techniques in predicting pollutant removal efficiencies, with a global deviation of 1.83% between experimental and simulated results; membrane fouling behavior and overall process performance (hydraulic resistance, membrane porosity) were also estimated, indicating that the membrane loses 10% of its efficiency after 40 min of operation.
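
As background, a minimal sketch of the standard resistance-in-series (Darcy) description of ultrafiltration flux, often used alongside this kind of simulation, is shown below; the parameter values are placeholders, not results from the study.

```python
# Resistance-in-series flux model: J = TMP / (mu * (Rm + Rf)).
# All numbers below are illustrative placeholders.
def permeate_flux(tmp_pa, viscosity_pa_s, r_membrane, r_fouling):
    """Permeate flux in m3/(m2*s) from transmembrane pressure and resistances."""
    return tmp_pa / (viscosity_pa_s * (r_membrane + r_fouling))

# A 10% loss of efficiency corresponds to the fouling resistance growing until
# the flux drops to 90% of its clean-membrane value.
J_clean = permeate_flux(2e5, 1e-3, 1e12, 0.0)
J_fouled = permeate_flux(2e5, 1e-3, 1e12, 0.111e12)
print(J_fouled / J_clean)  # ~0.90
```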

Keywords: membrane separation, ultrafiltration, textile effluent, modeling, simulation

Procedia PDF Downloads 48
18635 Model of Community Management for Sustainable Utilization

Authors: Luedech Girdwichai, Withaya Mekhum

Abstract:

This research aimed to develop a model of community management for sustainable utilization by investigating two population groups: family heads and the community management teams. The first group consisted of family heads from 511 families in 12 areas, who completed questionnaires; 479 sets were returned. The second group consisted of the community management teams of the 12 areas, with one representative from each area interviewed. The questionnaire for the family heads had two main parts: the first collected general information, such as occupation, in checklist form; the second collected data on self-reliant community development based on the 4P framework, i.e., People (human resource) development, Place (area) development, Product (economic and income source) development, and Plan (community plan) development, in the form of rating scales. Data from the first part were analyzed for frequency and percentage, while those from the second part were analyzed for arithmetic mean and standard deviation (SD). Data from the second group, the community management teams, were derived from a focus group to identify factors influencing successful management, together with in-depth interviews, and were analyzed with descriptive statistics. The results showed that the 479 family heads rated the implementation of the community plan for self-reliant community activities, based on the Sufficiency Economy Philosophy and the 4P framework, at an average of 3.28, a moderate level. Considered in detail, the highest-rated aspect was area development, with a mean of 3.71 (high level), followed by human resource development with a mean of 3.44 (moderate level) and economic and income source development with a mean of 3.09 (moderate level). The lowest-rated aspect was community plan development, with a mean of 2.89. The focus group discussion revealed the following factors and guidelines for successful community management: 1) on People (human resource) development, there was a project to support and develop community leaders; 2) on Place (area) development, there was development of conservation tourism areas; 3) on Product (economic and income source) development, the community leaders promoted the setting up of occupational groups, savings groups, and product processing groups; and 4) on Plan (community plan) development, priorities were set through public hearings.

Keywords: model of community management, sustainable utilization, family heads, community management team

Procedia PDF Downloads 332
18634 Current Situation of Maritime Transport and Logistics in Myanmar

Authors: S. N. S. Thein, H. L. Yang, Z. B. Liu

Abstract:

Among the many modes of transport, maritime transport is the major mode for international trade. In the Republic of the Union of Myanmar (Burma), water transport has served as one of the most important modes of transport for the country's exports and imports. Obtaining accurate information and systematic data gathering are essential in any field of study. Therefore, in this research, a historical review of the development of ports in Myanmar and how they have changed has been carried out. All the relevant literature and documents have also been reviewed, studied, and organized. The data were collected from reports, journals, the internet, and the publications of authorized organizations and international associations. To give a better understanding of the real situation of maritime transport and logistics in Myanmar, the current condition of existing ports, expansion and ongoing projects, and future port development plans are described in turn. Hence, the main purpose of this study is to build up a comprehensive picture of maritime transport and logistics, in addition to border trade within ASEAN and Myanmar. It will help academic researchers, decision-makers, and stakeholders in national planning, as well as local and foreign investors, to recognize the current situation of maritime transport and logistics in Myanmar.

Keywords: ASEAN, border trade, logistics, maritime transport, ports of Myanmar

Procedia PDF Downloads 216
18633 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic

Authors: Diogen Babuc

Abstract:

The motivation for this work is the question of how many devote themselves to discovering something in the world of science, where much has been discerned and revealed but, at the same time, much remains unknown. Methods: The algorithm draws on elements of the Playfair, Caesar, and Vigenère ciphering and deciphering algorithms. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. In correlation with the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the list of substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The character x is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it performs better than the other methods in terms of execution time and storage space.
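
For illustration, a sketch of one reading of the procedure described above follows; the alphabet, and the use of 'x' as padding for a short final substring, are assumptions rather than details confirmed by the abstract.

```python
# Sketch of the described procedure (interpretation of the abstract; padding
# with 'x' and a lowercase-only alphabet are assumptions).
import random
import string

ALPHABET = string.ascii_lowercase

def shift(ch, k):
    if ch not in ALPHABET:
        return ch
    return ALPHABET[(ALPHABET.index(ch) + k) % len(ALPHABET)]

def encipher(text, a=3):
    b = a + 3
    k0 = random.randint(a, b)                  # key chosen once; also the chunk size
    chunks = [text[i:i + k0] for i in range(0, len(text), k0)]
    k, out = k0, []
    for chunk in chunks:
        chunk = chunk.ljust(k0, 'x')           # pad the last chunk (assumption)
        out.append(''.join(shift(c, k) for c in chunk))
        k += 1                                 # shift grows by 1 per substring
        if k > b + 1:                          # wrap back to the initial key
            k = k0
    return ''.join(out), k0

def decipher(cipher, k0, a=3):
    b = a + 3
    chunks = [cipher[i:i + k0] for i in range(0, len(cipher), k0)]
    k, out = k0, []
    for chunk in chunks:
        out.append(''.join(shift(c, -k) for c in chunk))
        k += 1
        if k > b + 1:
            k = k0
    return ''.join(out)

ct, key = encipher("latindjokovic")
print(ct, decipher(ct, key))
```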

Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison

Procedia PDF Downloads 98
18632 Experimental Study of the Fiber Dispersion of Pulp Liquid Flow in Channels with Application to Papermaking

Authors: Masaru Sumida

Abstract:

This study explored the feasibility of improving the hydraulic headbox of papermaking machines by studying the flow of wood-pulp suspensions behind a flat plate inserted in parallel and convergent channels. Pulp fiber concentrations in the wake downstream of the plate were investigated by flow visualization and optical measurements. Changes in the time-averaged fiber concentration and its fluctuations along the flow direction were examined. In addition, the control of the flow characteristics in the two channels was investigated. The behavior of the pulp fibers and the wake flow was found to be strongly related to the flow states in the upstream passages partitioned by the plate. The distribution of the fiber concentration was complex because of the formation of a thin water layer on the plate and the generation of Karman vortices at the trailing edge of the plate. Compared with the flow in the parallel channel, fluctuations in the fiber concentration decreased in the convergent channel. However, at low flow velocities, the convergent channel had only a weak effect on equalizing the time-averaged fiber concentration. This shows that a rectangular trailing edge cannot adequately disperse pulp suspensions; thus, at low flow velocities, a convergent channel is ineffective in ensuring uniform fiber concentration.

Keywords: fiber dispersion, headbox, pulp liquid, wake flow

Procedia PDF Downloads 376
18631 Leça da Palmeira Revisited: Sixty-Seven Years of Recurring Work by Álvaro Siza

Authors: Eduardo Jorge Cabral dos Santos Fernandes

Abstract:

Over the last sixty-seven years, Portuguese architect Álvaro Siza Vieira designed several interventions for the Leça da Palmeira waterfront. With this paper, we aim to analyze the history of this set of projects in a chronological approach, seeking to understand the connections that can be established between them. Born in Matosinhos, a fishing and industrial village located near Porto, Álvaro Siza built a remarkable relationship with Leça da Palmeira (a neighboring village located to the north) from a personal and professional point of view throughout his life: it was there that he got married (in the small chapel located next to the Boa Nova lighthouse) and it was there that he designed his first works of great impact, the Boa Nova Tea House and the Ocean Swimming Pool, today classified as national monuments. These two works were the subject of several projects spaced over time, including recent restoration interventions designed by the same author. However, the marks of Siza's intervention in this territory are not limited to these two cases; there were other projects designed for this territory, which we also intend to analyze: the monument to the poet António Nobre (1967-80), the unbuilt project for a restaurant next to Piscina das Marés (presented in 1966 and redesigned in 1993), the reorganization of the Avenida da Liberdade (with a first project, not carried out, in 1965-74, and a reformulation carried out between 1998 and 2006) and, finally, the project for the new APDL facilities, which completes Avenida da Liberdade to the south (1995). Altogether, these interventions are so striking in this territory, from a landscape, formal, functional, and tectonic point of view, that it is difficult to imagine this waterfront without their presence. In all cases, the relationship with the site explains many of the design options. Time after time, the conditions of the pre-existing territory (also affected by the previous interventions of Siza) were considered, so each project created a new circumstance, conditioning the following interventions. This paper is part of a more comprehensive project, which aims to analyze the work of Álvaro Siza in its fundamental relationship with the site.

Keywords: Álvaro Siza, contextualism, Leça da Palmeira, landscape

Procedia PDF Downloads 19
18630 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level

Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown

Abstract:

‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive. Strict ethical requirements must be met for the use of health data to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may have the impact of reducing the capacity of health data to be incorporated into the real-time demands of the Big Data environment. This ‘big data revolution’ is increasingly supported by national governments, who have invested significant funds into initiatives designed to develop and capitalize on big data and methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides essential record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names to overcome important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations.
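
For illustration, a minimal sketch of one widely used privacy-preserving record linkage idea from the literature is given below: encoding name q-grams into Bloom filters and comparing the filters with a Dice coefficient, so records can be matched without exchanging the names themselves. This shows the general technique only; it is not necessarily the PHRN/Centre for Data Linkage implementation, and the hashing scheme is simplified.

```python
# Bloom-filter encoding of name bigrams with Dice-coefficient comparison
# (a common privacy-preserving record linkage technique; simplified hashing).
import hashlib

def bloom_encode(name, num_bits=256, num_hashes=4, q=2):
    name = f"_{name.lower()}_"
    grams = {name[i:i + q] for i in range(len(name) - q + 1)}
    bits = set()
    for g in grams:
        for seed in range(num_hashes):
            h = hashlib.sha256(f"{seed}:{g}".encode()).hexdigest()
            bits.add(int(h, 16) % num_bits)
    return bits

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b))

print(dice(bloom_encode("catherine"), bloom_encode("katherine")))  # high similarity
print(dice(bloom_encode("catherine"), bloom_encode("john")))       # low similarity
```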

Keywords: data integration, data linkage, health planning, health services research

Procedia PDF Downloads 214
18629 Spectrophotometric Determination of Photohydroxylated Products of Humic Acid in the Presence of Salicylate Probe

Authors: Julide Hizal Yucesoy, Batuhan Yardimci, Aysem Arda, Resat Apak

Abstract:

Humic substances produce reactive oxygen species such as hydroxyl, phenoxy, and superoxide radicals upon oxidation over a wide pH and reduction potential range. Hydroxyl radicals, produced by reducing agents such as antioxidants and/or peroxides, attack the salicylate probe and form 2,3-dihydroxybenzoate, 2,4-dihydroxybenzoate, and 2,5-dihydroxybenzoate species. These species are quantitatively determined by HPLC. Humic substances undergo photodegradation under UV radiation, and as a result of their antioxidant properties, they produce hydroxyl radicals. In this study, humic acid was photodegraded in a photoreactor at 254 nm (400 W), and the hydroxyl radicals formed were trapped by the salicylate probe. The total concentration of hydroxylated salicylate species was measured using the spectrophotometric CUPRAC method. In addition, using the results of time-dependent experiments, the kinetics of photohydroxylation were determined at different pH values. This method has been applied for the first time to measure the concentration of hydroxylated products; it allows the results to be obtained more easily than with the HPLC method.

Keywords: CUPRAC method, humic acid, photohydroxylation, salicylate probe

Procedia PDF Downloads 199
18628 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being affected and transformed by rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design. The integration of design into the computational environment has revolutionized architecture and opened new perspectives. The history of architecture shows the various technological developments and changes through which architecture has transformed over time. Therefore, analyzing the integration between technology and the history of the architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period arising from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments and changes in technology, and the use of technology in architecture over the years, are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported by a detailed literature review and consolidated by examining focal points of 20th-century architecture under the headings of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research between past and present that developments in architecture cannot keep up with advancements in technology; recent developments in technology overshadow architecture, and technology even decides the direction of architecture. As a result, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 181
18627 Understanding Evolutionary Algorithms through Interactive Graphical Applications

Authors: Javier Barrachina, Piedad Garrido, Manuel Fogue, Julio A. Sanguesa, Francisco J. Martinez

Abstract:

It is very common to observe, especially in Computer Science studies, that students have difficulty correctly understanding how some mechanisms based on Artificial Intelligence work. In addition, the scope and limitations of most of these mechanisms are usually presented by professors only in a theoretical way, which does not help students to understand them adequately. In this work, we focus on the problems found when teaching Evolutionary Algorithms (EAs), which imitate the principles of natural evolution, as a method to solve parameter optimization problems. Although this kind of algorithm can be very powerful for solving relatively complex problems, students often have difficulty understanding how they work and how to apply them to solve problems in real cases. In this paper, we present two interactive graphical applications which have been specially designed with the aim of making Evolutionary Algorithms easy for students to understand. Specifically, we present: (i) TSPS, an application able to solve the Traveling Salesman Problem, and (ii) FotEvol, an application able to reconstruct a given image by using Evolution Strategies. The main objective is that students learn how these techniques can be implemented and the great possibilities they offer.
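
For illustration, a minimal evolutionary loop for the Traveling Salesman Problem, in the spirit of what TSPS visualizes, is sketched below; this is an illustrative example, not the application's own code.

```python
# Minimal (1+1)-style evolutionary loop for TSP: mutate a tour by swapping two
# cities and keep the shorter tour (illustrative sketch only).
import math
import random

def tour_length(tour, cities):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def evolve(cities, generations=5000):
    best = list(range(len(cities)))
    random.shuffle(best)
    best_len = tour_length(best, cities)
    for _ in range(generations):
        child = best[:]
        i, j = random.sample(range(len(child)), 2)   # swap mutation
        child[i], child[j] = child[j], child[i]
        child_len = tour_length(child, cities)
        if child_len < best_len:                     # survivor selection
            best, best_len = child, child_len
    return best, best_len

cities = [(random.random(), random.random()) for _ in range(20)]
tour, length = evolve(cities)
print(round(length, 3))
```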

Keywords: education, evolutionary algorithms, evolution strategies, interactive learning applications

Procedia PDF Downloads 330
18626 Supercomputer Simulation of Magnetic Multilayers Films

Authors: Vitalii Yu. Kapitan, Aleksandr V. Perzhu, Konstantin V. Nefedev

Abstract:

The necessity of studying magnetic multilayer structures is explained by the prospects of their practical application as a technological basis for creating new storage media. Magnetic multilayer films have many unique features that contribute to increasing the density of information recording and the speed of storage devices. Multilayer structures are structures of alternating magnetic and nonmagnetic layers. Within the framework of the classical Heisenberg model, lattice spin systems with direct short- and long-range exchange interactions were investigated by Monte Carlo methods. The thermodynamic characteristics of multilayer structures, such as the temperature behavior of magnetization, energy, and heat capacity, were investigated, as were the processes of magnetization reversal of multilayer structures in external magnetic fields. The developed software is based on the new, promising programming language Rust, an experimental language developed by Mozilla and positioned as an alternative to C and C++. For the Monte Carlo simulation, the Metropolis algorithm and its parallel implementation using MPI, as well as the Wang-Landau algorithm, were used. We plan to study magnetic multilayer films with asymmetric Dzyaloshinskii-Moriya (DM) interaction, interface effects, and skyrmion textures. This work was supported by state task no. 3.7383.2017/8.9 of the Ministry of Education and Science of Russia.
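
For illustration, a textbook Metropolis sweep for a single classical Heisenberg layer with nearest-neighbour exchange is sketched below; the study's own code base is in Rust and also uses MPI and the Wang-Landau algorithm, so this Python sketch only shows the kind of update involved.

```python
# Metropolis Monte Carlo for a classical Heisenberg monolayer:
# E = -J * sum over nearest-neighbour pairs of S_i . S_j (illustrative sketch).
import numpy as np

L, J, T = 16, 1.0, 1.5                      # lattice size, exchange, temperature
rng = np.random.default_rng(0)

def random_spins(n):
    v = rng.normal(size=(n, n, 3))
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

spins = random_spins(L)

def local_field(s, i, j):
    return (s[(i + 1) % L, j] + s[(i - 1) % L, j] +
            s[i, (j + 1) % L] + s[i, (j - 1) % L])

def metropolis_sweep(s):
    for _ in range(L * L):
        i, j = rng.integers(L, size=2)
        new = rng.normal(size=3)
        new /= np.linalg.norm(new)
        dE = -J * np.dot(new - s[i, j], local_field(s, i, j))
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            s[i, j] = new

for _ in range(100):
    metropolis_sweep(spins)
print("magnetization:", np.linalg.norm(spins.mean(axis=(0, 1))))
```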

Keywords: Monte Carlo methods, Heisenberg model, multilayer structures, magnetic skyrmion

Procedia PDF Downloads 163
18625 Recognizing Customer Preferences Using Review Documents: A Hybrid Text and Data Mining Approach

Authors: Oshin Anand, Atanu Rakshit

Abstract:

The vast increase in e-commerce ventures makes this area a prominent research stream. Besides several quantified parameters, the textual content of reviews is a storehouse of information that can educate companies and help them earn profit. This study is an attempt in this direction. The article categorizes data based on a computed metric that quantifies the influencing capacity of reviews, rendering two categories of highly and weakly influential reviews. Further, each of these documents is studied to derive several product feature categories. Each of these categories, along with the computed metric, is converted to linguistic identifiers and used in an association mining model. The article makes a novel attempt to combine feature extraction with the quantified metric to categorize review text and finally provide frequent patterns that depict customer preferences. Frequent mentions in highly influential reviews depict customer likes or preferred features in the product, whereas prominent patterns in weakly influential reviews highlight what is not important to customers. This is achieved using a hybrid approach of text mining for feature and term extraction, sentiment analysis, a multi-criteria decision-making technique, and an association mining model.
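
For illustration, a minimal sketch of the association-mining step is given below: each review is reduced to linguistic identifiers (an influence level plus the feature categories it mentions) and frequent itemsets are counted. The labels are invented placeholders, not the categories derived in the paper.

```python
# Frequent-itemset counting over review "transactions" of linguistic identifiers
# (illustrative sketch; labels are invented).
from itertools import combinations
from collections import Counter

reviews = [
    {"high_influence", "battery", "camera"},
    {"high_influence", "battery", "price"},
    {"low_influence", "packaging"},
    {"high_influence", "battery"},
    {"low_influence", "packaging", "price"},
]

def frequent_itemsets(transactions, min_support=0.4, max_len=3):
    counts = Counter()
    for t in transactions:
        for k in range(1, max_len + 1):
            for combo in combinations(sorted(t), k):
                counts[combo] += 1
    n = len(transactions)
    return {c: v / n for c, v in counts.items() if v / n >= min_support}

for itemset, support in sorted(frequent_itemsets(reviews).items()):
    print(itemset, round(support, 2))
# Pairs such as ('battery', 'high_influence') surface preferred features, while
# ('low_influence', 'packaging') flags what matters little to customers.
```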

Keywords: association mining, customer preference, frequent pattern, online reviews, text mining

Procedia PDF Downloads 384
18624 Artificial Neurons Based on Memristors for Spiking Neural Networks

Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi

Abstract:

Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits and use it to realize single-layer, fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method for constructing hardware neuromorphic computing systems.
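
For reference, a software analogue of the leaky integrate-and-fire behaviour that the TS memristor emulates in hardware is sketched below; the time constant, resistance, and threshold are illustrative values, not device parameters from the paper.

```python
# Discrete-time leaky integrate-and-fire neuron (illustrative parameters).
import numpy as np

def lif(current, dt=1e-4, tau=2e-3, r=1e6, v_rest=0.0, v_th=1.0, v_reset=0.0):
    v, spikes, trace = v_rest, [], []
    for i in current:
        v += dt / tau * (-(v - v_rest) + r * i)   # leaky integration
        if v >= v_th:                             # threshold crossing -> spike
            spikes.append(True)
            v = v_reset                           # reset after firing
        else:
            spikes.append(False)
        trace.append(v)
    return np.array(trace), np.array(spikes)

t = np.arange(0, 0.05, 1e-4)
i_in = 1.5e-6 * (t > 0.01)                        # step input current
v, s = lif(i_in)
print("spike count:", s.sum())
```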

Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spike-timing-dependent plasticity

Procedia PDF Downloads 126
18623 An Analysis of Non-Elliptic Curve Based Primality Tests

Authors: William Wong, Zakaria Alomari, Hon Ching Lai, Zhida Li

Abstract:

Modern-day information security depends on implementing Diffie-Hellman, which requires the generation of prime numbers. Because the number of primes is infinite, it is impractical to store prime numbers for use, and therefore, primality tests are indispensable in modern-day information security. A primality test determines whether a number is prime or composite. There are two types of primality tests: deterministic and probabilistic. Deterministic tests adopt algorithms that provide a definite answer as to whether a given number is prime or composite, while probabilistic tests provide a result with a degree of uncertainty. In this paper, we review three probabilistic tests, the Fermat primality test, the Miller-Rabin test, and the Baillie-PSW test, as well as one deterministic test, the Agrawal-Kayal-Saxena (AKS) test, and we analyze these tests. None of the tests reviewed is based on elliptic curves. The analysis demonstrates that, in the majority of real-world scenarios, the Baillie-PSW test's favorability stems from its typical operational complexity of O(log³ n) and its capacity to deliver accurate results for numbers below 2^64.
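
For illustration, a standard implementation of the Miller-Rabin probabilistic test reviewed in the paper is sketched below: it either proves a number composite or declares it "probably prime".

```python
# Miller-Rabin probabilistic primality test (textbook implementation).
import random

def miller_rabin(n, rounds=20):
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:                 # write n-1 = d * 2^r with d odd
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False              # a is a witness: n is composite
    return True                       # probably prime

print(miller_rabin(2**61 - 1))        # True (a Mersenne prime)
print(miller_rabin(2**61 + 1))        # False (divisible by 3)
```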

Keywords: primality tests, Fermat’s primality test, Miller-Rabin primality test, Baillie-PSW primality test, AKS primality test

Procedia PDF Downloads 82
18622 Three-Dimensional CFD Modeling of Flow Field and Scouring around Bridge Piers

Authors: P. Deepak Kumar, P. R. Maiti

Abstract:

In recent years, sediment scour near bridge piers and abutments has been a serious problem causing nationwide concern, because it has resulted in more bridge failures than any other cause. Scour is the formation of a scour hole around a structure mounted on or embedded in an erodible channel bed, due to the erosion of soil by flowing water. The formation of the scour hole around a structure depends upon the shape and size of the pier, the depth of flow, the angle of attack of the flow, and the sediment characteristics. The flow characteristics around these structures change due to the man-made obstruction in the natural flow path, which changes the kinetic energy of the flow around them. Excessive scour affects the stability of the foundation of the structure through the removal of bed material. Accurate estimation of the scour depth around a bridge pier is very difficult, and the foundations of bridge piers have to be taken deeper to provide the anchorage length required for foundation stability. In this study, simulations using a 3D computational fluid dynamics (CFD) model were conducted to examine the mechanism of scour around a cylindrical pier. Subsequently, the flow characteristics around these structures are presented for different flow conditions. The mechanism of the scouring phenomenon, the formation of vortices, and their consequent effects are discussed for a straight channel. An effort was made to estimate the scour depth around bridge piers under different flow conditions.

Keywords: bridge pier, computational fluid dynamics, multigrid, pier shape, scour

Procedia PDF Downloads 283
18621 Eradicating Micronutrient Deficiency through Biofortification

Authors: Ihtasham Hamza

Abstract:

In the contemporary world, where the West is afflicted by diseases of excess nutrition, much of the rest of the globe suffers from hunger. A troubling constituent of hunger is micronutrient deficiency, also called hidden hunger. Heavy dependence on calorie-rich diets and low diet diversification are responsible for high malnutrition rates, especially in African and Asian countries. But the dilemma is not without solutions. With the substantial cause being sole dependence on staples for food, biofortification has emerged as a novel tool to confront the widespread threat of hidden hunger. Biofortification promises better nutritional accessibility for communities, overcoming various difficulties and reaching the doorstep. Biofortified crops offer a rural-based intervention that, by design, primarily reaches the more remote populations, which comprise a majority of the malnourished in many countries, and then reaches urban populations as production surpluses are marketed. Initial investments in agricultural research at a central location can generate high recurrent benefits at low cost as adapted biofortified cultivars become widely available across countries over time, in contrast to supplementation, which is comparatively expensive and requires continued financing over time that may be imperilled by fluctuating political interest.

Keywords: biofortified crops, hunger, malnutrition, agricultural practices

Procedia PDF Downloads 280
18620 An Evaluation of the Impact of E-Banking on Operational Efficiency of Banks in Nigeria

Authors: Ibrahim Rabiu Darazo

Abstract:

This research examines the impact of e-banking on the operational efficiency of banks in Nigeria, taking as a case study three selected banks (Diamond Bank Plc, GTBank Plc, and Fidelity Bank Plc). It is a quantitative study that uses both primary and secondary sources of data. Questionnaires were used to obtain data: 150 questionnaires were distributed among staff and customers of the three banks, and the data collected were analysed using the chi-square test, while the secondary data were obtained from relevant textbooks, journals, and websites. It is clear from the findings that the use of e-banking has improved the efficiency of these banks: services are provided to customers electronically through internet banking, telephone banking, and ATMs; the time taken to serve customers is reduced; new customers can open an account online; customers have access to their accounts at all times (24/7); customer information can be accessed from the database; and the costs of cheques and postage are eliminated. The recommendations at the end of the research include the following: the banks should keep their electronic equipment up to date; e-fraud (internal and external) should be controlled; banks should employ qualified manpower; and biometric ATMs should be introduced to reduce fraud involving ATM cards, as is done in other countries such as the USA.

Keywords: banks, electronic banking, operational efficiency of banks, biometric ATMs

Procedia PDF Downloads 321
18619 Hyperelastic Constitutive Modelling of the Male Pelvic System to Understand the Prostate Motion, Deformation and Neoplasms Location with the Influence of MRI-TRUS Fusion Biopsy

Authors: Muhammad Qasim, Dolors Puigjaner, Josep Maria López, Joan Herrero, Carme Olivé, Gerard Fortuny

Abstract:

Computational modeling of the human pelvis using the finite element (FE) method has become extremely important for understanding the mechanics of prostate motion and deformation when a transrectal ultrasound (TRUS)-guided biopsy is performed. The number of reliable and validated hyperelastic constitutive FE models of the male pelvic region is limited, and the available models do not precisely describe the anatomical behavior of the pelvic organs, mainly the prostate and the location of its neoplasms. The motion and deformation of the prostate during TRUS-guided biopsy make it difficult to know the location of potential lesions in advance. When using this procedure, practitioners can only provide rough estimates of the lesion locations; consequently, multiple biopsy samples are required to target a single lesion. In this study, the whole pelvis model (comprising the rectum, bladder, pelvic muscles, prostate transitional zone (TZ), and peripheral zone (PZ)) is used for the simulations. An isotropic hyperelastic approach (the Signorini model) was used for all the soft tissues except the vesical muscles, which are assumed to have a linear elastic behavior due to the lack of experimental data needed to determine the constants involved in hyperelastic models. The tissue and organ geometry used for the 3D meshes is taken from the existing literature, and the biomechanical parameters were obtained from the different testing techniques described in the literature. The acquired parameter values for uniaxial stress/strain data are used in the Signorini model to describe the anatomical behavior of the pelvis model. Five mesh nodes representing small prostate lesions are selected prior to biopsy, and each lesion's final position is determined when a TRUS probe force of 30 N is applied to the inner rectum wall. The open-source software Code_Aster is used for the numerical simulations. Moreover, the overall deformation of the pelvic organs when a TRUS-guided biopsy is performed is demonstrated. The deformation of the prostate and the displacement of the neoplasms showed that assigning appropriate material properties to the organs altered the resulting lesion migration parametrically; as a result, the distance traveled by these lesions ranged between 3.77 and 9.42 mm. The lesion displacement and organ deformation are compared and analyzed with respect to our previous study, in which linear elastic properties were used for all pelvic organs. Furthermore, axial and sagittal slices from magnetic resonance imaging (MRI) and TRUS images are compared visually with those of our preliminary study.
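
For illustration, a small sketch of the isotropic hyperelastic (Signorini) law mentioned above is given below, in the polynomial form commonly quoted, e.g., for Code_Aster: W = C10(I1 - 3) + C01(I2 - 3) + C20(I1 - 3)², evaluated for an incompressible uniaxial stretch. The coefficients are placeholders, not the calibrated values of the study, and the exact form used there is not confirmed by the abstract.

```python
# Signorini-type strain-energy evaluation for an incompressible uniaxial stretch
# (illustrative sketch; coefficients are placeholders).
def invariants_uniaxial(stretch):
    """Principal invariants I1, I2 for an incompressible uniaxial stretch."""
    l1, l2 = stretch, stretch ** -0.5     # lateral stretches l2 = l3 = 1/sqrt(l1)
    i1 = l1**2 + 2 * l2**2
    i2 = 2 * l1**2 * l2**2 + l2**4
    return i1, i2

def signorini_energy(stretch, c10, c01, c20):
    i1, i2 = invariants_uniaxial(stretch)
    return c10 * (i1 - 3) + c01 * (i2 - 3) + c20 * (i1 - 3) ** 2

for lam in (1.0, 1.1, 1.2):
    print(lam, signorini_energy(lam, c10=1.0e4, c01=2.0e3, c20=5.0e3))
```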

Keywords: code-aster, magnetic resonance imaging, neoplasms, transrectal ultrasound, TRUS-guided biopsy

Procedia PDF Downloads 82
18618 Application of ANN for Estimation of Power Demand of Villages in Sulaymaniyah Governorate

Authors: A. Majeed, P. Ali

Abstract:

Before designing an electrical system, load estimation is necessary for unit sizing and demand-generation balancing. The system could be a stand-alone system for a village, a grid-connected system, or renewable energy integrated into the grid, which is especially relevant as there are non-electrified villages in developing countries. In the classical model, the energy demand is found by estimating the household appliances multiplied by their ratings and the duration of their operation; in this paper, however, information that exists for electrified villages is used to predict the demand, as villages have almost the same lifestyle. This paper describes a method used to predict the average energy consumed in each two-month period by every consumer living in a village, using an artificial neural network (ANN). The input data were collected using a regional survey of samples of consumers representing typical living conditions, household appliances, and energy consumption, and the output data were collected from the administration office of Piramagrun for each corresponding consumer. The results of this study show that the average demand of different consumers from four villages in different months throughout the year is approximately 12 kWh/day, and the model estimates the average daily demand for every consumer with a mean absolute percentage error of 11.8%. The MathWorks software package MATLAB, version 7.6.0, which contains the Neural Network Toolbox, was used.
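
The study fitted the ANN in MATLAB's Neural Network Toolbox; for illustration, the sketch below shows the same idea with scikit-learn. The survey features and the synthetic data are invented placeholders, not the Sulaymaniyah survey data.

```python
# ANN regression of consumption from survey features, evaluated by MAPE
# (illustrative sketch with synthetic data).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 300
# Assumed survey features: household size, number of appliances, dwelling area (m2).
X = np.column_stack([rng.integers(2, 10, n),
                     rng.integers(3, 20, n),
                     rng.uniform(60, 250, n)])
# Synthetic bi-monthly consumption (kWh), roughly of the order of 12 kWh/day.
y = 500 + 25 * X[:, 0] + 12 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0, 60, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)

mape = np.mean(np.abs((model.predict(X_te) - y_te) / y_te)) * 100
print(f"mean absolute percentage error: {mape:.1f}%")
```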

Keywords: artificial neural network, load estimation, regional survey, rural electrification

Procedia PDF Downloads 117
18617 Moral Dilemmas, Difficulties in the Digital Games

Authors: YuPei Chang

Abstract:

In recent years, moral judgement tasks have served as an increasingly popular plot mechanism in digital gameplay. As moral agents, players making choices and judgements in digital games shuttle between the real world and the game world. The purpose of this research is to explore the moral difficulties created by the interactive mechanisms of games and the moral choices of players. At the theoretical level, this research tries to combine moral disengagement, moral foundations theory, and gameplay as an aesthetic experience. At the methodological level, it combines text analysis, the diary method, and in-depth interviews. Three research problems will be addressed in three stages. In the first stage, this project will explore how moral dilemmas are represented in game mechanics. In the second stage, it will analyze the appearance of, and conflicts within, moral dilemmas in game mechanics based on the five aspects of moral foundations theory. In the third stage, it will try to understand the choices players make when they face moral dilemmas, as well as their explanations and reflections after making these decisions.

Keywords: morality, moral disengagement, moral foundations theory, PC game, gameplay, moral dilemmas, player

Procedia PDF Downloads 72
18616 Evaluation of Traffic Noise Level: A Case Study in Residential Area of Ishbiliyah, Kuwait

Authors: Jamal Almatawah, Hamad Matar, Abdulsalam Altemeemi

Abstract:

The World Health Organization (WHO) has recognized environmental noise as a harmful pollutant that causes adverse psychosocial and physiological effects on human health, and motor vehicles are considered one of its main sources. Noise pollution is a universal phenomenon, and it has grown to the point that it has become a major concern for both the public and policymakers. The aim of this paper, therefore, is to investigate traffic noise levels and the factors that affect them, such as traffic volume, heavy-vehicle speed, and meteorological conditions, in Ishbiliyah as a sample residential area in Kuwait. Three types of roads were selected in Ishbiliyah: an expressway, a major arterial, and a collector street. Other noise sources that interfere with traffic noise were also considered in this study. The traffic noise level was measured and analyzed using a Bruel & Kjaer outdoor sound level meter 2250-L (2250 Light). A Count-Cam2 video camera was used to collect the peak and off-peak traffic counts, an Ambient Weather WM-5 handheld weather station was used for meteorological factors such as temperature, humidity, and wind speed, and spot speeds were obtained using a Decatur Genesis model GHD-KPH speed radar. All measurements were taken simultaneously. The results showed that the traffic noise level is above the allowable limit on all types of roads: the average equivalent noise level (LAeq) for the expressway, the major arterial, and the collector street was 74.3 dB(A), 70.47 dB(A), and 60.84 dB(A), respectively. In addition, positive correlation coefficients were obtained between traffic noise and traffic volume and between traffic noise and the 85th percentile speed, whereas no significant relation was found with meteorological factors. Abnormal vehicle noise due to poor maintenance or user-enhanced exhausts was found to be one of the largest factors affecting the overall traffic noise readings.
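
For reference, the equivalent continuous level LAeq is an energy average rather than an arithmetic one, LAeq = 10 log10(mean(10^(Li/10))); the sketch below computes it and a correlation coefficient on invented sample values, not the Ishbiliyah measurements.

```python
# LAeq (energy average of A-weighted levels) and a noise-volume correlation
# (all numbers are invented for illustration).
import numpy as np

def laeq(levels_db):
    levels_db = np.asarray(levels_db, dtype=float)
    return 10 * np.log10(np.mean(10 ** (levels_db / 10)))

samples = [72.1, 74.3, 69.8, 76.0, 73.5]         # short-term samples, dB(A)
print("LAeq:", round(laeq(samples), 1), "dB(A)")

noise = [60.8, 70.5, 74.3]                        # collector, arterial, expressway
volume = [450, 2100, 5200]                        # vehicles per hour (invented)
print("r:", round(np.corrcoef(noise, volume)[0, 1], 2))
```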

Keywords: traffic noise, residential area, pollution, vehicle noise

Procedia PDF Downloads 57
18615 The Use of Digital Stories in the Development of Critical Literacy

Authors: Victoria Zenotz

Abstract:

For Fairclough (1989), critical literacy is a tool that enables readers and writers to build up meaning in discourse. More recently, other authors (Leu et al., 2004) have included the new technological context in their definition of literacy. In their view, being literate nowadays means to “successfully use and adapt to the rapidly changing information and communication technologies and contexts that continuously emerge in our world and influence all areas of our personal and professional lives” (Leu et al., 2004: 1570). In this presentation, the concept of critical literacy is related to the creation of digital stories. In the first part of the presentation, concepts such as literacy and critical literacy are examined; we consider that real social practices may help learners improve their literacy level. Accordingly, we present research conducted at a secondary school in the north of Spain (2013-2014) to illustrate how the “writing” of digital stories may contribute to the development of critical literacy. The use of several instruments allowed the collection of data at the different stages of the creative process, including watching and commenting on model digital stories, planning a storyboard, creating and selecting images, adding voices and background sounds, and editing and sharing the final product. The results offer some valuable insights into learners' literacy progress.

Keywords: literacy, computer-assisted language learning, ESL

Procedia PDF Downloads 394
18614 Hard Disk Failure Predictions in Supercomputing System Based on CNN-LSTM and Oversampling Technique

Authors: Yingkun Huang, Li Guo, Zekang Lan, Kai Tian

Abstract:

Hard disk drive (HDD) failures in an exascale supercomputing system may lead to service interruption, invalidate previous calculations, and cause permanent data loss. Therefore, initiating corrective actions before hard drive failures materialize is critical to the continued operation of jobs. In this paper, a highly accurate analysis model based on CNN-LSTM and an oversampling technique is proposed, which can correctly predict the necessity of a disk replacement even ten days in advance. Learning-based methods generally perform poorly on training datasets with a long-tail distribution, and fault prediction is a classic example of this situation because of the scarcity of failure data. To overcome this problem, a new oversampling method was employed to augment the data, and an improved CNN-LSTM with a shortcut connection was then built to learn more effective features. The shortcut transmits the output of the previous CNN layer, which is used as the input of the LSTM model after weighted fusion with the output of the next layer. Finally, a detailed empirical comparison of six prediction methods on a public evaluation dataset is presented and discussed. The experiments indicate that the proposed method predicts disk failure with 0.91 precision, 0.91 recall, 0.91 F-measure, and 0.90 MCC for a 10-day prediction horizon. Thus, the proposed algorithm is an efficient algorithm for predicting HDD failure in supercomputing.
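
For illustration, a sketch of the described architecture is given below: two 1-D convolution layers over a window of daily SMART attributes, with a shortcut that fuses the first convolution's output with the second's through a learnable weight before the LSTM. Layer sizes, the number of SMART features, and the exact fusion scheme are assumptions, not the paper's configuration.

```python
# CNN-LSTM with a weighted-fusion shortcut for per-drive failure prediction
# (illustrative sketch; dimensions and fusion details are assumptions).
import torch
import torch.nn as nn

class CNNLSTMShortcut(nn.Module):
    def __init__(self, n_features=12, channels=32, hidden=64):
        super().__init__()
        self.conv1 = nn.Conv1d(n_features, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.alpha = nn.Parameter(torch.tensor(0.5))     # fusion weight
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)                  # healthy vs. will fail

    def forward(self, x):
        # x: (batch, days, n_features) window of daily SMART attributes
        x = x.transpose(1, 2)                             # -> (batch, features, days)
        h1 = torch.relu(self.conv1(x))
        h2 = torch.relu(self.conv2(h1))
        fused = self.alpha * h1 + (1 - self.alpha) * h2   # shortcut fusion
        out, _ = self.lstm(fused.transpose(1, 2))
        return self.head(out[:, -1, :])

model = CNNLSTMShortcut()
print(model(torch.randn(4, 10, 12)).shape)                # torch.Size([4, 2])
```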

Keywords: HDD replacement, failure, CNN-LSTM, oversampling, prediction

Procedia PDF Downloads 75
18613 Molecular Dynamics Simulation Studies of Thermal Effects Created by High-Intensity, Ultra-Short Pulses Induced Cell Membrane Electroporation

Authors: Jiahui Song

Abstract:

The use of electric fields with high intensity (~100 kV/cm or higher) and ultra-short pulse durations (nanosecond range) is a recent development. Most studies of electroporation have ignored possible thermal effects because of the short duration of the applied voltage pulses; however, membrane temperature gradients ranging from 0.2×10⁹ to 10⁹ K/m have been predicted. This research focuses on thermal effects as a driver of electroporative enhancement, even though the actual temperature values might not change appreciably from their equilibrium levels. The dynamics of pore formation under an externally applied electric field are studied on the basis of molecular dynamics (MD) simulations using the GROMACS package. MD simulations of a lipid membrane with a constant electric field strength of 0.5 V/nm at 25 °C and 47 °C are carried out to probe the relevant thermal effects. GROMACS provides the force fields for the lipid membrane, which is taken to comprise dipalmitoyl-phosphatidylcholine (DPPC) molecules, and a water model mimics the aqueous environment surrounding the membrane. Velocities of the water and membrane molecules are generated randomly at each simulation run according to a Maxwellian distribution. A high background electric field is typically used in MD simulations to probe electroporation; it serves as an accelerated test of the pore formation process, since low electric fields would require inordinately long simulation times. The MD simulations show that no pore is formed in a 1-ns snapshot for a DPPC membrane at a temperature of 25 °C after a 0.5 V/nm electric field is applied. A nano-sized pore is clearly seen in a 0.75-ns snapshot of the same geometry with the membrane surfaces kept at 47 °C, and the pore grows by 1 ns. The MD simulation results suggest that the increase in temperature can result in different degrees of electrically stimulated bio-effects and point to the role of thermal effects in facilitating and accelerating the electroporation process.
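
For reference, the Maxwellian velocity initialization mentioned above amounts to drawing each Cartesian velocity component from a normal distribution with variance kB*T/m; the sketch below illustrates this and checks that the sampled kinetic temperature matches the target (values are illustrative, not taken from the simulations).

```python
# Maxwell-Boltzmann velocity initialization and kinetic-temperature check.
import numpy as np

KB = 1.380649e-23            # Boltzmann constant, J/K

def maxwell_velocities(n_atoms, mass_kg, temperature_k, seed=0):
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(KB * temperature_k / mass_kg)
    return rng.normal(0.0, sigma, size=(n_atoms, 3))

m_water = 18.015e-3 / 6.022e23                     # kg per water molecule
v = maxwell_velocities(100_000, m_water, 320.0)    # ~47 °C
t_kinetic = m_water * np.mean(np.sum(v**2, axis=1)) / (3 * KB)
print(round(t_kinetic, 1))                         # ~320 K
```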

Keywords: high-intensity, ultra-short, electroporation, thermal effects, molecular dynamics

Procedia PDF Downloads 46
18612 Role of Information and Communication Technology in Pharmaceutical Innovation: Case of Firms in Developing Countries

Authors: Ilham Benali, Nasser Hajji, Nawfel Acha

Abstract:

The pharmaceutical sector faces various constraints related to research and development (R&D) costs, patent expiry, demand pressure, regulatory requirements, and the development of generics, which drive leading firms in the sector to undergo technological change and shift to the biotechnological paradigm. Based on a large literature review, we present a background of the innovation trajectory in the pharmaceutical industry and the reasons behind this technological transformation. Then we investigate the role that Information and Communication Technology (ICT) is playing in this revolution. To situate pharmaceutical firms in developing countries within this trajectory, and to examine the degree of their involvement in the innovation process, we did not find any previous empirical work or data sources that would allow us to analyze this phenomenon. Therefore, for the case of Morocco, we gathered the relevant data of the last five years from scratch, from different sources. As a result, only about 4% of the innovative drugs that gained access to the local market in that period were made locally, which substantiates that the industrial model of the pharmaceutical sector in developing countries is based on the 'license model'. Finally, we present an alternative, based on ICT use and big data tools, that can allow developing countries to shift from the status of simple consumers to that of active actors in the innovation process.

Keywords: biotechnologies, developing countries, innovation, information and communication technology, pharmaceutical firms

Procedia PDF Downloads 147
18611 The Relationship between Spanish Economic Variables: Evidence from the Wavelet Techniques

Authors: Concepcion Gonzalez-Concepcion, Maria Candelaria Gil-Fariña, Celina Pestano-Gabino

Abstract:

We analyze six relevant economic and financial variables for the period 2000M1-2015M3 in the context of the Spanish economy: a financial index (IBEX35), a commodity (crude oil price in euros), a foreign exchange rate (EUR/USD), a bond (the Spanish 10-year bond), the Spanish national debt, and the Consumer Price Index. The goal of this paper is to analyze the main relations between them by computing the wavelet power spectrum and the cross-wavelet coherency associated with Morlet wavelets. Using a dedicated toolbox in MATLAB, we focus our interest on the period variable. We decompose the time-frequency effects and improve the interpretation of the results for users who are not experts in wavelet theory. The empirical evidence shows certain instability periods and reveals various changes and breaks in the causality relationships in the sample data. The variables were also individually analyzed with Daubechies wavelets to visualize high-frequency variance, seasonality, and trend. The results are included in the Proceedings of the 20th International Academic Conference, 2015, International Institute of Social and Economic Sciences (IISES), Madrid.
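
The paper uses a MATLAB wavelet toolbox; for illustration, an open-source analogue of the wavelet power spectrum with a complex Morlet wavelet is sketched below using PyWavelets, on a synthetic monthly series rather than the Spanish data.

```python
# Continuous wavelet transform (complex Morlet) and wavelet power spectrum
# on a synthetic monthly series of the same length as 2000M1-2015M3.
import numpy as np
import pywt

months = np.arange(183)                               # 183 months in 2000M1-2015M3
signal = (np.sin(2 * np.pi * months / 12)             # annual cycle
          + 0.3 * np.random.default_rng(0).normal(size=months.size))

scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(signal, scales, "cmor1.5-1.0", sampling_period=1.0)
power = np.abs(coeffs) ** 2                           # wavelet power spectrum

periods = 1.0 / freqs                                 # periods in months
print("dominant period (months):", round(periods[power.mean(axis=1).argmax()], 1))
```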

Keywords: economic and financial variables, Spain, time-frequency domain, wavelet coherency

Procedia PDF Downloads 233
18610 Spatial Disparity in Education and Medical Facilities: A Case Study of Barddhaman District, West Bengal, India

Authors: Amit Bhattacharyya

Abstract:

The economic scenario of a region alone does not give the full picture for measuring overall development; economic development must be accompanied by social development to allow an assessment of the level of development. Spatial variation in social development is discussed here taking into account the quality of functioning of the social system in a specific area. In this paper, an attempt has been made to study the spatial distribution of social infrastructural facilities and analyze the magnitude of regional disparities at the inter-block level in Barddhaman district. The paper starts with a detailed account of the process of selecting social infrastructure indicators and describes the methodology employed in the empirical analysis. Analyzing block-level data, it tries to identify the disparity among the blocks in levels of social development. The results are then explained using both statistical analysis and geospatial techniques. The paper reveals that social development is not proceeding at the same rate in every part of the district. Health and educational facilities are concentrated at selected points, so overall development activities have come to be concentrated in a few centres, and disparity is seen across the blocks.

Keywords: disparity, inter-block, social development, spatial variation

Procedia PDF Downloads 165
18609 Portrayal of Pak-US Relations in Perspective of Bin Laden Killing by the Leading American and British Newspapers: A Content Analysis of the Guardian, the Telegraph, Washington Post and the New York Times

Authors: Shahzad Ali

Abstract:

This article examines the coverage of Pak-US relations in the context of the killing of Osama Bin Laden in four leading newspapers of the UK and the US, namely The Telegraph, The Guardian, The New York Times, and The Washington Post, over a specific period of three months (April-June 2011). The study is framed around Operation Neptune Spear and is theoretically linked with Herman and Chomsky's propaganda model and Shoemaker and Reese's theory of content effects, specifically the impact of foreign policy as a driving instrument influencing the nature and treatment of the coverage of Pak-US relations. The relations between Pakistan and the USA are of great importance in the context of Bin Laden: Pak-US relations were less aggravated and hostile in the pre-OBL period, but the post-OBL period worsened them. It is also worth noting that the policies of these newspapers were dependent on the foreign policy of their countries. It was found that the news coverage was favorable when relations between Pakistan and the US or UK were cordial and smooth, and turned negative and unfavorable when relations were strained and deteriorating, endorsing earlier findings that the Western mainstream media tend to be nationalistic and to project the official stance in their foreign coverage instead of adhering to universal canons of journalism such as objectivity, fairness, and neutrality. Overall, 219 news items from the four selected newspapers regarding Pak-US relations before and after the OBL killing operation were analyzed. The ratio of negative slants was found to be higher than that of positive and neutral slants. Moreover, the ratio of news items with unfavorable frames was higher after the OBL operation than before it in the coverage of Pak-US relations in the leading British and American newspapers.

Keywords: Osama Bin Laden, Pakistan, USA, UK, relations, Guardian, Washington Post, Telegraph, New York Times, Operation Neptune Spear

Procedia PDF Downloads 166
18608 The Effect of Artificial Intelligence on Communication and Information Systems

Authors: Sameh Ibrahim Ghali Hanna

Abstract:

Information systems (IS) are crucial to the operation of private and public establishments in developing and developed countries. Developing countries are saddled with many project failures during the implementation of information systems, yet successful information systems are greatly needed in developing nations in order to enhance their economies. This paper is particularly important in view of the high failure rate of information systems in developing countries, which needs to be reduced to acceptable levels through recommended interventions. The paper centers on a review of IS development in developing countries. It presents evidence of IS successes and failures in developing countries and posits a model to address the IS failures. The proposed model can then be used by developing countries to reduce their IS project implementation failure rate. A comparison is drawn between IS development in developing countries and developed countries. The paper provides valuable information to assist in reducing IS failure and in developing IS models and theories on IS development for developing countries.

Keywords: research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization, artificial intelligence, AI, enterprise information system, EIS, integration, developing countries, information systems, IS development, information systems failure, information systems success, information systems success model

Procedia PDF Downloads 7