Search results for: dual-SLA indicators for computing force applications
10010 Intelligent Recognition Tools for Industrial Automation
Authors: Amin Nazerzadeh, Afsaneh Nouri Houshyar, Azadeh Noori Hoshyar
Abstract:
With the rapid growth of information technology, industry and manufacturing systems are becoming more automated. Therefore, achieving highly accurate automatic systems with reliable security is becoming more critical. Biometrics, which refers to identifying individuals based on physiological or behavioral traits, provides unique identifiers that offer high reliability and security in different industrial systems. As biometric traits cannot easily be transferred between individuals or copied, they have received extensive attention. Given the importance of security applications, this paper provides an overview of biometrics and discusses the background, types and applications of biometrics as an effective tool for industrial applications.
Keywords: industrial and manufacturing applications, intelligence and security, information technology, recognition, security technology, biometrics
Procedia PDF Downloads 155
10009 Artificial Neurons Based on Memristors for Spiking Neural Networks
Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi
Abstract:
Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power consumption, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components is still a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits and use it to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient method to construct hardware neuromorphic computing systems.
Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spiking-time-dependent-plasticity
Procedia PDF Downloads 134
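As an illustration of the leaky integrate-and-fire dynamics that the TiO₂ threshold-switching memristor above emulates in hardware, the following is a minimal software sketch of an LIF neuron; all parameter values (membrane time constant, threshold, input current) are illustrative assumptions and do not come from the paper.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_reset=0.0, v_threshold=1.0, r_m=1.0):
    """Simulate a leaky integrate-and-fire neuron driven by an input current trace.

    Returns the membrane-potential trace and the time steps at which spikes are emitted.
    """
    v = v_rest
    v_trace, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: the membrane potential decays toward rest
        # while integrating the (scaled) input current.
        v += (-(v - v_rest) + r_m * i_in) * dt / tau
        if v >= v_threshold:      # threshold crossing -> fire
            spikes.append(step)
            v = v_reset           # reset after the spike
        v_trace.append(v)
    return np.array(v_trace), spikes

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stimulus = rng.uniform(0.0, 2.5, size=1000)   # arbitrary input current
    trace, spike_steps = simulate_lif(stimulus)
    print(f"{len(spike_steps)} spikes emitted over {len(stimulus)} steps")
```

In a memristor implementation, the leaky integration and threshold firing are provided intrinsically by the device physics rather than by explicit arithmetic as in this sketch.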
10008 Architecture of a Preliminary Course on Computational Thinking
Authors: Mintu Philip, Renumol V. G.
Abstract:
The introductory programming course is a major challenge in computing education. Many introductory programming courses fail because students concentrate mainly on writing programs in a programming language rather than engaging in problem solving. Computational thinking is a general approach to solving problems. This paper proposes a new preliminary course that aims to develop computational thinking skills in students, which may help them become good programmers. The proposed course is designed around the four basic components of computational thinking: abstract thinking, logical thinking, modeling thinking and constructive thinking. In this course, students are engaged in hands-on problem solving activities using a new problem solving model proposed in this paper.
Keywords: computational thinking, computing education, abstraction, constructive thinking, modelling thinking
Procedia PDF Downloads 456
10007 Design and Assessment of Base Isolated Structures under Spectrum-Compatible Bidirectional Earthquakes
Authors: Marco Furinghetti, Alberto Pavese, Michele Rinaldi
Abstract:
Concave Surface Slider devices are increasingly used in real applications for the seismic protection of both bridge and building structures. Several research activities have been carried out to investigate the lateral response of this typology of device, and a reasonably high level of knowledge has been reached. If a radial analysis is performed, the frictional force is always aligned with the restoring force, whereas under bidirectional seismic events a bi-axial interaction of the directions of motion occurs, due to the step-wise projection of the main frictional force, which is assumed to be aligned with the trajectory of the isolator. Nonetheless, if non-linear time history analyses have to be performed, standard codes provide precise rules for the definition of an averagely spectrum-compatible set of accelerograms in radial conditions, whereas for bidirectional motions different combinations of the single-component spectra can be found. Moreover, software for the adjustment of natural accelerograms is nowadays available, which leads to a higher quality of spectrum-compatibility and to a smaller dispersion of results for radial motions. In this work a simplified design procedure is defined for building structures base-isolated by means of Concave Surface Slider devices. Different case study structures have been analyzed. In a first stage, the capacity curve has been computed by means of non-linear static analyses on the fixed-base structures: inelastic fiber elements have been adopted and different direction angles of lateral forces have been studied. Thanks to these results, a linear elastic Finite Element Model has been defined, characterized by the same global stiffness as the linear elastic branch of the non-linear capacity curve. Then, non-linear time history analyses have been performed on the base-isolated structures by applying seven bidirectional seismic events. The spectrum-compatibility of bidirectional earthquakes has been studied by considering different combinations of single components and adjusting single records: thanks to the proposed procedure, results have shown a small dispersion and good agreement with the assumed design values.
Keywords: concave surface slider, spectrum-compatibility, bidirectional earthquake, base isolation
Procedia PDF Downloads 292
10006 Non-Linear Load-Deflection Response of Shape Memory Alloys-Reinforced Composite Cylindrical Shells under Uniform Radial Load
Authors: Behrang Tavousi Tehrani, Mohammad-Zaman Kabir
Abstract:
Shape memory alloys (SMA) are often implemented in smart structures as the active components. Their ability to recover large displacements has been used in many applications, including structural stability/response enhancement and active structural acoustic control. SMA wires or fibers can be embedded in composite cylinders to increase their critical buckling load, improve their load-deflection behavior, and reduce the radial deflections under various thermo-mechanical loadings. This paper presents a semi-analytical investigation of the non-linear load-deflection response of SMA-reinforced composite circular cylindrical shells. The cylindrical shells are under a uniform external pressure load. Based on first-order shear deformation shell theory (FSDT), the equilibrium equations of the structure are derived. The one-dimensional simplified Brinson model is used for determining the SMA recovery force due to its simplicity and accuracy. The Airy stress function and the Galerkin technique are used to obtain non-linear load-deflection curves. The results are verified by comparing them with those in the literature. Several parametric studies are conducted in order to investigate the effect of SMA volume fraction, SMA pre-strain value, and SMA activation temperature on the response of the structure. It is shown that suitable usage of SMA wires results in a considerable enhancement in the load-deflection response of the shell due to the generation of the SMA tensile recovery force.
Keywords: airy stress function, cylindrical shell, Galerkin technique, load-deflection curve, recovery stress, shape memory alloy
Procedia PDF Downloads 188
10005 Integrated Performance Management System a Conceptual Design for PT. XYZ
Authors: Henrie Yunianto, Dermawan Wibisono
Abstract:
PT. XYZ is a family business (private company) in Indonesia that provides educational programs and consultation services. Since its establishment in 2011, the company has run without any strategic management system implemented, though it has survived until now. The management of PT. XYZ sees that the business opportunity for such a product is huge: even though the targeted market is very specific (niche), the volume is large (due to the large population of Indonesia) and the number of competitors is currently low. It can be said that the product life cycle is between the 'introduction' and 'growth' stages. It is observed that new entrants (competitors) are now increasing, thus PT. XYZ considers reacting to the intense business rivalry by conducting the business in an appropriate manner. A performance management system is important to implement for the sake of business sustainability and growth. The framework chosen is the Integrated Performance Management System (IPMS). The IPMS framework has the advantages of simplicity and of linkage between its business variables and indicators, whereby the company can see the connections between all factors measured. The IPMS framework consists of the perspectives: (1) Business Results, (2) Internal Processes, (3) Resource Availability. Variables and indicators were examined through deep analysis of the business external and internal environments, Strength-Weakness-Opportunity-Threat (SWOT) analysis, and Porter's five forces analysis. Analytical Hierarchy Process (AHP) analysis was then used to quantify the weight of each variable/indicator. AHP is needed since, in this study of PT. XYZ, data on existing performance indicators were not available. Later, when the IPMS is implemented, the real measured data can be examined to determine the weight factor of each indicator using correlation analysis (or other methods). In this IPMS design for PT. XYZ, the analysis shows that with the current company goals, along with the AHP methodology, the critical indicators for each perspective are: (1) Business Results: customer satisfaction and employee satisfaction; (2) Internal Processes: marketing performance, supplier quality, production quality, continuous improvement; (3) Resource Availability: leadership and company culture and values, personal competences, productivity. Companies and organizations require a performance management system to help them achieve their vision and mission. Company strategy will be effectively defined and addressed by using a performance management system. The Integrated Performance Management System (IPMS) framework and AHP analysis help in quantifying the factors which influence the expected business output.
Keywords: analytical hierarchy process, business strategy, differentiation strategy, integrated performance management system
Procedia PDF Downloads 307
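For readers unfamiliar with how the Analytical Hierarchy Process turns expert judgements into indicator weights, the sketch below computes AHP priority weights and a consistency ratio from a pairwise comparison matrix; the example judgements for the three IPMS perspectives are hypothetical and are not taken from the PT. XYZ study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Return AHP priority weights and the consistency ratio for a pairwise comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)                    # principal eigenvalue (Perron root)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                   # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)           # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # Saaty's random index
    cr = ci / ri if ri else 0.0                    # consistency ratio (should be < 0.10)
    return w, cr

# Hypothetical pairwise judgements for the three IPMS perspectives
# (business results, internal processes, resource availability).
matrix = [[1,   3,   5],
          [1/3, 1,   2],
          [1/5, 1/2, 1]]
weights, cr = ahp_weights(matrix)
print("weights:", weights.round(3), "consistency ratio:", round(cr, 3))
```

The same eigenvector computation applies at the indicator level; once real performance data become available, the AHP weights can be cross-checked against correlation-based weights as the abstract suggests.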
10004 Simulations of NACA 65-415 and NACA 64-206 Airfoils Using Computational Fluid Dynamics
Authors: David Nagy
Abstract:
This paper exemplifies the influence of the purpose of an aircraft on the aerodynamic properties of its airfoil. In particular, the research takes into consideration two types of aircraft, namely cargo aircraft and military high-speed aircraft, and compares their airfoil characteristics using their NACA airfoils as well as computational fluid dynamics. The results show that airfoils of aircraft designed for cargo have a heavier focus on maintaining a large lift force, whereas speed-oriented airplanes focus on minimizing the drag force.
Keywords: aerodynamic simulation, aircraft, airfoil, computational fluid dynamics, lift to drag ratio, NACA 64-206, NACA 65-415
Procedia PDF Downloads 388
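The lift-to-drag comparison above rests on the standard non-dimensional coefficients; for reference, these are the usual aerodynamics definitions, not formulas specific to this paper:

```latex
\[
  C_L = \frac{L}{\tfrac{1}{2}\rho V^{2} S}, \qquad
  C_D = \frac{D}{\tfrac{1}{2}\rho V^{2} S}, \qquad
  \frac{L}{D} = \frac{C_L}{C_D},
\]
```

where L and D are the lift and drag forces, ρ the air density, V the free-stream velocity and S the reference wing area; a cargo-oriented airfoil favours a high C_L, while a high-speed airfoil favours a low C_D.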
10003 Study of Drawing Characteristics due to Friction between the Materials by FEM
Authors: Won Jin Ryu, Mok Tan Ahn, Hyeok Choi, Joon Hong Park, Sung Min Kim, Jong Bae Park
Abstract:
Pipes for offshore plants require specifications that satisfy both high strength and high corrosion resistance. Therefore, clad pipes are currently used in offshore plants. Clad pipes can be made using either overlay welding or clad plates. The present study was intended to figure out, using FEM, the effects of friction between the two materials, which is a factor affecting the drawing of the two materials, so that clad pipes can be made through heterogeneous-material drawing instead of the two methods mentioned above. Therefore, FEM simulations were conducted in which the friction was varied while all other variables were fixed. The results showed that the pullout force increases along with increases in the friction at the boundary layer.
Keywords: clad pipe, FEM, friction, pullout force
Procedia PDF Downloads 494
10002 Perceptions on Development of the Deaf in Higher Education Level: The Case of Special Education Students in Tiaong, Quezon, Philippines
Authors: Ashley Venerable, Rosario Tatlonghari
Abstract:
This study identified how deaf college students of the Bartimaeus Center for Alternative Learning in Tiaong, Quezon, Philippines view development, using visual communication techniques and generating themes from their responses. Complete enumeration was employed. Guided by the constructivist theory of perception, the study assumed that past experiences and stored information influence perception. These themes of development emerged: social development; pleasant environment; interpersonal relationships; availability of resources; employment; infrastructure development; values; and peace and security. Using the National Economic and Development Authority development indicators, findings showed that the deaf students' views on development were similar to the mainstream views. Responses also became more meaningful through visual communication techniques.
Keywords: deaf, development, perception, development indicators, visual communication
Procedia PDF Downloads 431
10001 Extracellular Enzymes as Promising Soil Health Indicators: Assessing Response to Different Land Uses Using Long-Term Experiments
Authors: Munisath Khandoker, Stephan Haefele, Andy Gregory
Abstract:
Extracellular enzymes play a key role in soil organic carbon (SOC) decomposition and nutrient cycling and are known indicators of soil health; however, it is not well understood how these enzymes respond to different land uses, and their relationships to other soil properties have not been extensively reviewed. The relationships among the activities of three soil enzymes, β-glucosaminidase (NAG), phosphomonoesterase (PHO) and β-glucosidase (GLU), were examined. The impact of soil organic amendments, soil types and land management on soil enzyme activities was reviewed, and it was hypothesized that soils with increased SOC have increased enzyme activity. Long-term experiments at the Rothamsted Research Woburn and Harpenden sites in the UK were used to evaluate how different management practices affect enzyme activity involved in carbon (C) and nitrogen (N) cycling in the soil. Samples were collected from soils with different organic treatments such as straw, farmyard manure (FYM), compost additions, cover crops and permanent grass cover to assess whether SOC can be linked with increased levels of enzymatic activity and what influence, if any, enzymatic activity has on total C and N in the soil. Investigating the interactions of important enzymes with soil characteristics and SOC can help to better understand the health of soils. Studies on long-term experiments with known histories and large datasets can better help with this. SOC tends to decrease during land use changes from natural ecosystems to agricultural systems; therefore, it is imperative that agricultural lands find ways to increase and/or maintain SOC in the soil.
Keywords: biological soil health indicators, extracellular enzymes, soil health, soil, microbiology
Procedia PDF Downloads 72
10000 3D Steady and Transient Centrifugal Pump Flow within Ansys CFX and OpenFOAM
Authors: Clement Leroy, Guillaume Boitel
Abstract:
This paper presents a comparative benchmarking review of steady and transient three-dimensional (3D) flow computations in a centrifugal pump using commercial (Ansys CFX) and open source (OpenFOAM) computational fluid dynamics (CFD) software. In a centrifugal rotor-dynamic pump, the fluid enters the impeller along the rotating axis and is accelerated in order to increase the pressure, flowing radially outward into another stage, a vaned diffuser or a volute casing, from where it finally exits into a downstream pipe. Simulations are carried out at the best efficiency point (BEP) and at part load, for single-phase flow with several turbulence models. The results are compared with the overall performance report from experimental data. The use of CFD technology in industry is still limited by the high computational costs, and even more by the high cost of commercial CFD software and high-performance computing (HPC) licenses. The main objective of the present study is to define an OpenFOAM methodology for high-quality 3D steady and transient turbomachinery CFD simulation, in order to conduct a thorough time-accurate performance analysis. In addition, a detailed comparison between the computational methods and features of the latest Ansys release 18 and OpenFOAM is carried out to assess the accuracy and industrial applicability of those solvers. Finally, an automated connected workflow (IoT) for turbine blade applications is presented.
Keywords: benchmarking, CFX, internet of things, openFOAM, time-accurate, turbomachinery
Procedia PDF Downloads 204
9999 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality
Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan
Abstract:
Currently, the content entertainment industry is dominated by mobile devices. As the trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimizations. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload the work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol. 1) In-flight compression: The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round trip time is directly proportional to the amount of data transmitted. This can therefore be reduced by compressing the frames before sending. Using standard compression algorithms like JPEG can result in only a minor size reduction. Since the images to be compressed are consecutive camera frames, there won't be a lot of changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but the implementation of WebGL limits the precision of floating point numbers to 16 bits on most devices. This can introduce noise to the image due to rounding errors, which will add up eventually. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame. This eliminates the need for floating point subtraction, thereby cutting down on noise. The change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference. The kernel weights for this comparison can be fine-tuned to match the type of image to be compressed. 2) Dynamic load distribution: Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach can cause a hit on bandwidth and server costs. The most optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and network conditions. The protocol will be responsible for dynamically partitioning the tasks. Special flags will be used to communicate the workload fraction between the client and the server and will be updated at a constant interval of time (or frames). The whole protocol is designed so that it can be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to effectively spread the load and thereby scale horizontally. This is achieved by isolating client connections into different processes.
Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application
Procedia PDF Downloads 72
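Below is a toy grayscale sketch of the inter-frame compression idea described in part 1: a weighted-average difference over a small neighbourhood flags changed pixels, only those pixels are transmitted, and the decoder reuses unchanged pixels from the previous frame. The 3×3 kernel weights and the threshold are illustrative assumptions, and the paper's WebGL implementation is not reproduced here.

```python
import numpy as np

def changed_mask(prev, curr, threshold=8.0):
    """Flag pixels whose weighted-average difference to the previous frame exceeds a threshold."""
    # 3x3 weights favouring the centre pixel; purely illustrative values.
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], dtype=float)
    kernel /= kernel.sum()
    diff = np.abs(curr.astype(float) - prev.astype(float))
    padded = np.pad(diff, 1, mode="edge")
    # Weighted average of the absolute difference over each 3x3 neighbourhood.
    weighted = sum(kernel[i, j] * padded[i:i + diff.shape[0], j:j + diff.shape[1]]
                   for i in range(3) for j in range(3))
    return weighted > threshold

def encode_frame(prev, curr):
    """Inter-frame 'compression': transmit only the pixels flagged as changed."""
    mask = changed_mask(prev, curr)
    return mask, curr[mask]

def decode_frame(prev, mask, values):
    """Reconstruct the frame by reusing unchanged pixels from the previous frame."""
    out = prev.copy()
    out[mask] = values
    return out

prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
curr[40:60, 50:90] = 200                      # a moving object between frames
mask, values = encode_frame(prev, curr)
assert np.array_equal(decode_frame(prev, mask, values), curr)
print(f"changed pixels transmitted: {mask.sum()} of {mask.size}")
```

Because unchanged pixels are copied rather than recomputed, no floating point subtraction is needed at reconstruction time, which is consistent with the noise argument made in the abstract.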
9998 Burnishing of Aluminum-Magnesium-Graphite Composites
Authors: Mohammed T. Hayajneh, Adel Mahmood Hassan, Moath AL-Qudah
Abstract:
Burnishing is increasingly used as a finishing operation to improve surface roughness and surface hardness. This can be achieved by applying a hard ball or roller onto metallic surfaces under pressure, in order to obtain many advantages in the metallic surface. In the present work, the feed rate, speed and force have been considered as the basic burnishing parameters to study the surface roughness and surface hardness of metallic matrix composites. The considered metal matrix composites were made from Aluminum-Magnesium-Graphite with five different weight percentages of graphite. The effects of both the burnishing parameters mentioned above and the graphite percentage on the surface hardness and surface roughness of the metallic matrix composites were studied. The results of this investigation showed that the surface hardness of the metallic composites increases with the increase of the burnishing force and decreases with the increase in the burnishing feed rate and burnishing speed. The surface roughness of the metallic composites decreases with increasing burnishing force, feed rate, and speed up to certain values, then it starts to increase. On the other hand, an increase in the weight percentage of the graphite in the considered composites causes a decrease in the surface hardness and an increase in the surface roughness.
Keywords: burnishing process, Al-Mg-Graphite composites, surface hardness, surface roughness
Procedia PDF Downloads 485
9997 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment
Authors: Seun Mayowa Sunday
Abstract:
Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only with strictly regulated rules, it cannot aid customers beyond these limitations. The application of machine learning (ML) techniques is therefore required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning (ML) libraries, models are created and evaluated using the appropriate measuring metrics. From the findings, the random forest performs with the highest accuracy of 80.17%, and it was later implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the top four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper which combines the model development and the Django framework with deployment onto the Alibaba cloud computing platform.
Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud
Procedia PDF Downloads 135
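A minimal sketch of the model-comparison step described above is given below using scikit-learn; it runs on a synthetic stand-in dataset because the loan dataset used in the paper is not reproduced here, so the printed accuracies will not match the paper's 80.17% figure.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for a loan repayment dataset (the paper's data are not bundled here).
X, y = make_classification(n_samples=2000, n_features=12, n_informative=6, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "decision tree": DecisionTreeClassifier(random_state=42),
    "k-nearest neighbour": KNeighborsClassifier(n_neighbors=5),
    "logistic regression": LogisticRegression(max_iter=1000),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name:>20s}: accuracy = {acc:.4f}")

# The best-performing model would then be serialized (e.g. with joblib.dump) and loaded
# inside a Django view for real-time predictions, mirroring the deployment path described above.
```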
9996 A Framework for an Automated Decision Support System for Selecting Safety-Conscious Contractors
Authors: Rawan A. Abdelrazeq, Ahmed M. Khalafallah, Nabil A. Kartam
Abstract:
Selection of competent contractors for construction projects is usually accomplished through competitive bidding or negotiated contracting in which the contract bid price is the basic criterion for selection. The evaluation of contractor safety performance is still not a typical criterion in the selection process, despite the existence of various safety prequalification procedures. There is a critical need for practical and automated systems that enable owners and decision makers to evaluate contractor safety performance, among other important contractor selection criteria. These systems should ultimately favor the selection of safety-conscious contractors by virtue of their past good safety records and current safety programs. This paper presents an exploratory sequential mixed-methods approach to develop a framework for an automated decision support system that evaluates contractor safety performance based on a multitude of indicators and metrics that have been identified through a comprehensive review of construction safety research, and a survey distributed to domain experts. The framework is developed in three phases: (1) determining the indicators that depict contractor current and past safety performance; (2) soliciting input from construction safety experts regarding the identified indicators, their metrics, and relative significance; and (3) designing a decision support system using relational database models to integrate the identified indicators and metrics into a system that assesses and rates the safety performance of contractors. The proposed automated system is expected to hold several advantages including: (1) reducing the likelihood of selecting contractors with poor safety records; (2) enhancing the odds of completing the project safely; and (3) encouraging contractors to exert more effort to improve their safety performance and practices in order to increase their bid-winning opportunities, which can lead to significant safety improvements in the construction industry. This should prove useful to decision makers and researchers alike, and should help improve the safety record of the construction industry.
Keywords: construction safety, contractor selection, decision support system, relational database
Procedia PDF Downloads 280
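To make the relational-database phase more concrete, the following is a minimal sketch of how weighted safety indicators and per-contractor scores could be stored and aggregated into a rating; the table layout, indicator names, weights and scores are invented for illustration and are not the framework's actual schema.

```python
import sqlite3

# Hypothetical, minimal relational layout: contractors, safety indicators with
# expert-assigned weights, and normalized per-contractor indicator scores.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE contractor (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE indicator  (id INTEGER PRIMARY KEY, name TEXT, weight REAL);
CREATE TABLE score      (contractor_id INTEGER REFERENCES contractor(id),
                         indicator_id  INTEGER REFERENCES indicator(id),
                         value REAL);  -- normalized to the range 0..1
""")
conn.executemany("INSERT INTO contractor VALUES (?, ?)", [(1, "Alpha Co"), (2, "Beta Ltd")])
conn.executemany("INSERT INTO indicator VALUES (?, ?, ?)",
                 [(1, "Past safety record", 0.40),
                  (2, "Current safety program", 0.35),
                  (3, "Safety training coverage", 0.25)])
conn.executemany("INSERT INTO score VALUES (?, ?, ?)",
                 [(1, 1, 0.9), (1, 2, 0.7), (1, 3, 0.8),
                  (2, 1, 0.6), (2, 2, 0.9), (2, 3, 0.5)])

# Weighted safety rating per contractor, highest first.
rows = conn.execute("""
    SELECT c.name, ROUND(SUM(i.weight * s.value), 3) AS rating
    FROM score s
    JOIN contractor c ON c.id = s.contractor_id
    JOIN indicator  i ON i.id = s.indicator_id
    GROUP BY c.id
    ORDER BY rating DESC
""").fetchall()
print(rows)
```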
9995 Single Ion Transport with a Single-Layer Graphene Nanopore
Authors: Vishal V. R. Nandigana, Mohammad Heiranian, Narayana R. Aluru
Abstract:
Graphene material has found tremendous applications in water desalination, DNA sequencing and energy storage. Multiple nanopores are etched to create openings for water desalination and energy storage applications. The nanopores created are of the order of 3-5 nm, allowing multiple ions to transport through the pore. In this paper, we present for the first time a molecular dynamics study of single ion transport, where only one ion passes through the graphene nanopore. The diameter of the graphene nanopore is of the same order as the hydration layers formed around each ion. A behavior analogous to single electron transport, resulting from ionic transport, is observed for the first time. The current-voltage characteristics of such a device are similar to single electron transport in quantum dots. The current is blocked until a critical voltage, as the ions are trapped inside a hydration shell. The trapped ions face a high energy barrier compared to the applied input electrical voltage, preventing the ion from breaking free from the hydration shell. This region is called the “Coulomb blockade region”. In this region, we observe zero transport of ions inside the nanopore. However, when the electrical voltage is beyond the critical voltage, the ion has sufficient energy to overcome the energy barrier created by the hydration shell and to enter the pore. Thus, the input voltage can control the transport of the ion inside the nanopore. The device therefore acts as a binary storage unit, storing 0 when no ion passes through the pore and storing 1 when a single ion passes through the pore. We therefore postulate that the device can be used for fluidic computing applications in chemistry and biology, mimicking a computer. Furthermore, the trapped ion stores a finite charge in the Coulomb blockade region; hence the device also acts as a supercapacitor.
Keywords: graphene nanomembrane, single ion transport, Coulomb blockade, nanofluidics
Procedia PDF Downloads 321
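A simplified way to express the blocking condition described above, in our own schematic notation rather than the paper's exact model, is:

```latex
\[
  z e V \;\gtrsim\; \Delta E_{\mathrm{hyd}}
  \quad\Longrightarrow\quad
  V_{\mathrm{critical}} \;\approx\; \frac{\Delta E_{\mathrm{hyd}}}{z e},
\]
```

where ze is the ionic charge and ΔE_hyd is the energy barrier associated with partially shedding the hydration shell; below V_critical the current is blocked (the Coulomb-blockade-like region), while above it a single ion gains enough energy to enter the pore.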
9994 A New Protocol Ensuring Users' Privacy in Pervasive Environment
Authors: Mohammed Nadir Djedid, Abdallah Chouarfia
Abstract:
Transparency of the system and its integration into the natural environment of the user are some of the important features of pervasive computing. But these characteristics, which are considered the strongest points of pervasive systems, are also their weak points in terms of the user's privacy. Privacy in pervasive systems involves more than the confidentiality of communications and concealing the identity of virtual users. The physical presence and behavior of the user in the pervasive space cannot be completely hidden and can reveal the secret of his/her identity and affect his/her privacy. This paper shows that the application of major techniques for protecting the user's privacy is still insufficient. A new solution named the Shadow Protocol is proposed, which allows users to authenticate and interact with the surrounding devices within a ubiquitous computing environment while preserving their privacy.
Keywords: pervasive systems, identification, authentication, privacy
Procedia PDF Downloads 482
9993 A Hazard Rate Function for the Time of Ruin
Authors: Sule Sahin, Basak Bulut Karageyik
Abstract:
This paper introduces a hazard rate function for the time of ruin to calculate the conditional probability of ruin over very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and the conditional expected time of ruin from the exact finite time ruin probability with exponential claim amounts. Then we introduce the FoR, which gives the conditional probability of ruin given that ruin has not occurred by time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain the FoR under the excess of loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance
Procedia PDF Downloads 405
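In hazard-rate notation, the force of ruin described above can be written in terms of the finite time ruin probability ψ(u, t) for initial surplus u; this is our notation, and the authors' exact formulation may differ:

```latex
\[
  \mathrm{FoR}(t)
  \;=\; \lim_{h \to 0^{+}} \frac{\Pr\bigl(t < T \le t + h \mid T > t\bigr)}{h}
  \;=\; \frac{\partial \psi(u, t)/\partial t}{1 - \psi(u, t)},
\]
```

where T denotes the time of ruin, so that Pr(T ≤ t) = ψ(u, t); multiplying FoR(t) by a small interval length gives the conditional probability of ruin over that interval, given that ruin has not occurred by time t.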
9992 US Airlines Performance and Its Connection with Service Quality
Authors: Nicole Kalemba, Fernando Campa-Planas, Ana-Beatriz Hernández-Lara, Maria Victória Sánchez-Rebull
Abstract:
The purpose of this paper is to determine the effects of service quality on US airlines' economic performance. In order to address this goal, four different indexes of service quality in the air transportation industry have been considered, together with two indicators of economic performance, revenues and return on investment (ROI). Data from American airline companies over the period from 2006 to 2013 have been used in order to determine whether airlines' profitability increases when service quality improves. Considering the effects on airlines' profitability, the results confirm the positive and significant influence of service quality on the ROI of the companies in our study. Meanwhile, a non-significant effect was found for airline revenues related to quality. No previous research in this area has been done, and these findings could encourage airline companies to invest in quality insofar as this policy can have a return on their profitability.
Keywords: airlines, economic performance, key performance indicators, quality
Procedia PDF Downloads 473
9991 Investigating the Relationship between Bank and Cloud Provider
Authors: Hatim Elhag
Abstract:
Banking and financial service institutions are possibly the most advanced in terms of technology adoption and use it as a key differentiator. With high levels of business process automation, maturity in the functional portfolio, straight-through processing and proven technology outsourcing benefits, the banking sector stands to benefit significantly from cloud computing capabilities. Additionally, with complex compliance and regulatory policies, combined with expansive products and geographic coverage, the business impact is even greater. While the benefits are exponential, there are also significant challenges in adopting this model, including legal, security, performance, reliability, transformation complexity, operating control and governance, and most importantly proof of the promised cost benefits. However, a new architecture should be designed and implemented to align with this approach.
Keywords: security, cloud, banking sector, cloud computing
Procedia PDF Downloads 499
9990 Soliton Interaction in Multi-Core Optical Fiber: Application to WDM System
Authors: S. Arun Prakash, V. Malathi, M. S. Mani Rajan
Abstract:
The analytical bright two-soliton solution of the 3-coupled nonlinear Schrödinger equations with variable coefficients in birefringent optical fiber is obtained by the Darboux transformation method. For the design of ultra-high-speed optical devices, soliton interaction and control in birefringent fiber are investigated. A Lax pair is constructed for the N-coupled NLS system through the AKNS method. Using the two-soliton solution, we demonstrate different interaction behaviors of solitons in birefringent fiber depending on the choice of control parameters. Our results show that interactions of optical solitons have some specific applications, such as the construction of logic gates, optical computing, soliton switching, and soliton amplification in wavelength division multiplexing (WDM) systems.
Keywords: optical soliton, soliton interaction, soliton switching, WDM
Procedia PDF Downloads 505
9989 Translation Directionality: An Eye Tracking Study
Authors: Elahe Kamari
Abstract:
Research on translation processes has been conducted for more than 20 years, investigating various issues and using different research methodologies. Most recently, researchers have started to use eye tracking to study translation processes. They believe that the observable, measurable data that can be gained from eye tracking are indicators of unobservable cognitive processes happening in the translators' minds during translation tasks. The aim of this study was to investigate directionality in translation processes through eye tracking. The following hypotheses were tested: 1) processing the target text requires more cognitive effort than processing the source text, in both directions of translation; 2) L2 translation tasks on the whole require more cognitive effort than L1 tasks; 3) the cognitive resources allocated to the processing of the source text are higher in L1 translation than in L2 translation; 4) the cognitive resources allocated to the processing of the target text are higher in L2 translation than in L1 translation; and 5) in both directions non-professional translators invest more cognitive effort in translation tasks than do professional translators. The performance of a group of 30 male professional translators was compared with that of a group of 30 male non-professional translators. All the participants translated two comparable texts, one into their L1 (Persian) and the other into their L2 (English). The eye tracker measured gaze time, average fixation duration, total task length and pupil dilation. These variables are assumed to measure the cognitive effort allocated to the translation task. The data derived from eye tracking only confirmed the first hypothesis. This hypothesis was confirmed by all the relevant indicators: gaze time, average fixation duration and pupil dilation. The second hypothesis, that L2 translation tasks require the allocation of more cognitive resources than L1 translation tasks, was not confirmed by all four indicators. The third hypothesis, that source text processing requires more cognitive resources in L1 translation than in L2 translation, and the fourth hypothesis, that target text processing requires more cognitive effort in L2 translation than in L1 translation, were not confirmed. It seems that source text processing in L2 translation can be just as demanding as in L1 translation. The final hypothesis, that non-professional translators allocate more cognitive resources for the same translation tasks than do professionals, was partially confirmed. One of the indicators, average fixation duration, indicated higher cognitive effort-related values for professionals.
Keywords: translation processes, eye tracking, cognitive resources, directionality
Procedia PDF Downloads 463
9988 Smart Structures for Cost Effective Cultural Heritage Preservation
Authors: Tamara Trček Pečak, Andrej Mohar, Denis Trček
Abstract:
This article investigates the latest technological means that deploy smart structures based on (advanced) wireless sensor technologies and ubiquitous computing in general in order to support the above-mentioned decision making. Based on two years of in-field research experience, it analyses these technologies for such purposes and provides appropriate architectures and architectural solutions. Moreover, directions for future research are stated, because these technologies are currently the most promising ones to enable cost-effective preservation of cultural heritage, not only in uncontrolled places but also in general.
Keywords: smart structures, wireless sensors, sensors networks, green computing, cultural heritage preservation, monitoring, cost effectiveness
Procedia PDF Downloads 446
9987 SPARK: An Open-Source Knowledge Discovery Platform That Leverages Non-Relational Databases and Massively Parallel Computational Power for Heterogeneous Genomic Datasets
Authors: Thilina Ranaweera, Enes Makalic, John L. Hopper, Adrian Bickerstaffe
Abstract:
Data are the primary asset of biomedical researchers, and the engine for both discovery and research translation. As the volume and complexity of research datasets increase, especially with new technologies such as large single nucleotide polymorphism (SNP) chips, so too does the requirement for software to manage, process and analyze the data. Researchers often need to execute complicated queries and conduct complex analyses of large-scale datasets. Existing tools to analyze such data, and other types of high-dimensional data, unfortunately suffer from one or more major problems. They typically require a high level of computing expertise, are too simplistic (i.e., do not fit realistic models that allow for complex interactions), are limited by computing power, do not exploit the computing power of large-scale parallel architectures (e.g. supercomputers, GPU clusters etc.), or are limited in the types of analysis available, compounded by the fact that integrating new analysis methods is not straightforward. Solutions to these problems, such as those developed and implemented on parallel architectures, are currently available to only a relatively small portion of medical researchers with access and know-how. The past decade has seen a rapid expansion of data management systems for the medical domain. Much attention has been given to systems that manage phenotype datasets generated by medical studies. The introduction of heterogeneous genomic data for research subjects that reside in these systems has highlighted the need for substantial improvements in software architecture. To address this problem, we have developed SPARK, an enabling and translational system for medical research, leveraging existing high performance computing resources and analysis techniques currently available or being developed. It builds these into The Ark, an open-source web-based system designed to manage medical data. SPARK provides a next-generation biomedical data management solution that is based upon a novel Micro-Service architecture and Big Data technologies. The system serves to demonstrate the applicability of Micro-Service architectures for the development of high performance computing applications. When applied to high-dimensional medical datasets such as genomic data, relational data management approaches with normalized data structures suffer from unfeasibly high execution times for basic operations such as insert (i.e. importing a GWAS dataset) and the queries that are typical of the genomics research domain. SPARK resolves these problems by incorporating non-relational NoSQL databases that have been driven by the emergence of Big Data. SPARK provides researchers across the world with user-friendly access to state-of-the-art data management and analysis tools while eliminating the need for high-level informatics and programming skills. The system will benefit health and medical research by eliminating the burden of large-scale data management, querying, cleaning, and analysis. SPARK represents a major advancement in genome research technologies, vastly reducing the burden of working with genomic datasets, and enabling cutting edge analysis approaches that have previously been out of reach for many medical researchers.
Keywords: biomedical research, genomics, information systems, software
Procedia PDF Downloads 270
9986 Development of Value Based Planning Methodology Incorporating Risk Assessment for Power Distribution Network
Authors: Asnawi Mohd Busrah, Au Mau Teng, Tan Chin Hooi, Lau Chee Chong
Abstract:
This paper describes a value based planning (VBP) methodology incorporating risk assessment as an enhanced and more practical approach to evaluate distribution network projects in Peninsular Malaysia. Assessment indicators associated with economics, performance and risks are formulated to evaluate distribution projects and to quantify their benefits against investment. The developed methodology is implemented in a web-based software tool customized to capture investment and network data, compute the assessment indicators and rank the proposed projects according to their benefits. The value based planning approach addresses economic factors in the power distribution planning assessment, so as to minimize the cost of the solution to the power utility while at the same time providing maximum benefits to customers.
Keywords: value based planning, distribution network, value of loss load (VoLL), energy not served (ENS)
Procedia PDF Downloads 480
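As an indication of how the economic indicators named in the keywords typically enter such an assessment, a generic formulation (not necessarily the exact one adopted in this methodology) is:

```latex
\[
  \text{Reliability benefit}
  \;=\; \mathrm{VoLL} \times \bigl( \mathrm{ENS}_{\text{before}} - \mathrm{ENS}_{\text{after}} \bigr),
  \qquad
  \text{Benefit--cost ratio}
  \;=\; \frac{\sum_{t=1}^{N} B_t / (1 + r)^{t}}{\text{Investment}},
\]
```

where VoLL is the value of lost load (currency per kWh), ENS the expected energy not served (kWh per year), B_t the total annual benefit of the project and r a discount rate; projects can then be ranked by the resulting benefit figures.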
9985 Statistic Regression and Open Data Approach for Identifying Economic Indicators That Influence e-Commerce
Authors: Apollinaire Barme, Simon Tamayo, Arthur Gaudron
Abstract:
This paper presents a statistical approach to identify explanatory variables linearly related to e-commerce sales. The proposed methodology allows specifying a regression model in order to quantify the relevance between openly available data (economic and demographic) and national e-commerce sales. The proposed methodology consists of collecting data, preselecting input variables, performing regressions for choosing variables and models, testing and validating. The usefulness of the proposed approach is twofold: on the one hand, it allows identifying the variables that influence e-commerce sales with an accessible approach. On the other hand, it can be used to model future sales from the input variables. Results show that e-commerce is linearly dependent on 11 economic and demographic indicators.
Keywords: e-commerce, statistical modeling, regression, empirical research
Procedia PDF Downloads 226
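A minimal sketch of the collect, preselect and regress workflow described above is given below; the indicator names and the synthetic data are placeholders for the openly available economic and demographic data, which are not bundled here.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Synthetic stand-in for openly available indicators (the real open data are not included here).
rng = np.random.default_rng(0)
n = 120
data = pd.DataFrame({
    "internet_penetration": rng.normal(70, 10, n),        # hypothetical indicator names
    "median_income":        rng.normal(30_000, 5_000, n),
    "urban_population_pct": rng.normal(60, 15, n),
})
# Hypothetical linear relationship plus noise, standing in for national e-commerce sales.
data["ecommerce_sales"] = (0.8 * data["internet_penetration"]
                           + 0.002 * data["median_income"]
                           + rng.normal(0, 5, n))

# Preselect inputs by absolute correlation with the target, then fit an OLS model.
corr = data.corr()["ecommerce_sales"].drop("ecommerce_sales").abs()
selected = corr[corr > 0.2].index.tolist()
X = sm.add_constant(data[selected])
model = sm.OLS(data["ecommerce_sales"], X).fit()
print(model.summary())   # coefficients, p-values and R-squared for testing and validation
```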
9984 Fabrication and Characterization of Dissolvable Microneedle Patches Using Different Compositions and Ratios of Hyaluronic Acid and Zinc Oxide Nanoparticles
Authors: Dada Kolawole Segun
Abstract:
Transdermal drug delivery has gained popularity as a non-invasive method for controlled drug release compared to traditional delivery routes. Dissolvable transdermal patches have emerged as a promising platform for delivering a variety of drugs due to their ease of use. The objective of this research was to create and characterize dissolvable transdermal patches using various compositions and ratios of hyaluronic acid and zinc oxide nanoparticles. A micromolding technique was utilized to fabricate the patches, which were subsequently characterized using scanning electron microscopy, atomic force microscopy, and tensile strength testing. In vitro drug release studies were conducted to evaluate the drug release kinetics of the patches. The study found that the mechanical strength and dissolution properties of the patches were influenced by the hyaluronic acid and zinc oxide nanoparticle ratios used in the fabrication process. Moreover, the patches demonstrated controlled delivery of model drugs through the skin, highlighting their potential for transdermal drug delivery applications. The results suggest that dissolvable transdermal patches can be tailored to meet specific requirements for drug delivery applications using different compositions and ratios of hyaluronic acid and zinc oxide nanoparticles. This development has the potential to improve treatment outcomes and patient compliance in various therapeutic areas.
Keywords: transdermal drug delivery, characterization, skin permeation, biodegradable materials
Procedia PDF Downloads 90
9983 Compensatory Neuro-Fuzzy Inference (CNFI) Controller for Bilateral Teleoperation
Abstract:
This paper presents a new adaptive neuro-fuzzy controller equipped with compensatory fuzzy control (CNFI) in order not only to adjust membership functions but also to optimize the adaptive reasoning by using a compensatory learning algorithm. The proposed control structure includes two CNFI controllers: one is used to control the master robot in force, and the second one controls the slave robot in position. The experimental results obtained show fairly high accuracy in terms of position and force tracking under free-space motion and hard-contact motion, which highlights the effectiveness of the proposed controllers.
Keywords: compensatory fuzzy, neuro-fuzzy, control adaptive, teleoperation
Procedia PDF Downloads 324
9982 Nanoparticles Modification by Grafting Strategies for the Development of Hybrid Nanocomposites
Authors: Irati Barandiaran, Xabier Velasco-Iza, Galder Kortaberria
Abstract:
Hybrid inorganic/organic nanostructured materials based on block copolymers are of considerable interest in the field of nanotechnology, taking into account that these nanocomposites combine the properties of the polymer matrix with the unique properties of the added nanoparticles. The use of block copolymers as templates offers the opportunity to control the size and the distribution of inorganic nanoparticles. This research is focused on the surface modification of inorganic nanoparticles to reach a good interface between nanoparticles and polymer matrices and to hinder nanoparticle aggregation. The aim of this work is to obtain a good and selective dispersion of Fe3O4 magnetic nanoparticles into different types of block copolymers, such as poly(styrene-b-methyl methacrylate) (PS-b-PMMA), poly(styrene-b-ε-caprolactone) (PS-b-PCL), poly(isoprene-b-methyl methacrylate) (PI-b-PMMA) or poly(styrene-b-butadiene-b-methyl methacrylate) (SBM), by using different grafting strategies. Fe3O4 magnetic nanoparticles have been surface-modified with polymer or block copolymer brushes following different grafting methods (grafting to, grafting from and grafting through) to achieve a selective location of nanoparticles in the desired domains of the block copolymers. The morphology of the fabricated hybrid nanocomposites was studied by means of atomic force microscopy (AFM), and, with the aim of reaching well-ordered nanostructured composites, different annealing methods were used. Additionally, the nanoparticle amount was also varied in order to investigate the effect of the nanoparticle content on the morphology of the block copolymer. Different characterization methods are nowadays used in order to investigate the magnetic properties of nanometer-scale electronic devices. In particular, two such techniques have been used to characterize the synthesized nanocomposites. First, magnetic force microscopy (MFM) was used to investigate the magnetic properties qualitatively, taking into account that this technique allows distinguishing magnetic domains on the sample surface. Second, magnetic characterization was carried out by vibrating sample magnetometry and superconducting quantum interference device measurements. This characterization demonstrated that the magnetic properties of the nanoparticles have been transferred to the nanocomposites, which exhibit superparamagnetic behavior similar to that of the maghemite nanoparticles at room temperature. The obtained advanced nanostructured materials could find possible applications in the field of dye-sensitized solar cells and electronic nanodevices.
Keywords: atomic force microscopy, block copolymers, grafting techniques, iron oxide nanoparticles
Procedia PDF Downloads 262
9981 Women Unemployment in India: Comparative Analysis of Indian States Having Low and High Women Participation in Labour Force
Authors: Anesha Atul Shende
Abstract:
When we are aiming at high goals for economic development, such as sustainable growth and development of the economy, poverty reduction, reduction in inequality, etc., we must not forget to include everyone in society in the process of achieving these goals. This study particularly addresses women's participation in economic activities. The analysis is primarily done with a special focus on Indian states. The study analyses the female labour force participation rate across Indian states. It compares the states having low female labour force participation with the states that have comparatively high female labour force participation. In the beginning, data are provided on the current state of gender bias in employment. It has been found that the male workforce is dominant all across India. Further, the study highlights the major reasons for low women's participation in economic activities in some of the backward states in India, like Bihar. These reasons concern the economic, cultural, and social factors that are responsible for women's unemployment. Afterwards, it analyses the reasons behind comparatively higher women's participation in other states in India. The cases of the states of Telangana and Tamil Nadu have been analysed in brief. These states show improvements in female labour force participation over a few decades. This is because of the government policies that have been adopted, women-friendly workplaces, the availability of quality jobs for women, etc. Organizations like UN Women have recognized the social and economic benefits of having an active female labour force in a country. If women's unemployment declines, it will improve the growth rate of the nation as well as the welfare of society. The study discusses the reasons why an economy must try to increase women's workforce participation. It further provides suggestions to improve the conditions in backward states in India, where the female unemployment rate is high. One must understand that policy interventions and government schemes are a few of the ways to recognize this issue and work on it. However, the conditions will improve only when changes happen at the ground level, with social and moral support for women.
Keywords: women unemployment, labour force participation, women empowerment, economic growth and development, gender disparity
Procedia PDF Downloads 83