Search results for: tanner EDA tool write access time and retention time
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8475


8385 WormHex: A Volatile Memory Analysis Tool for Retrieval of Social Media Evidence

Authors: Norah Almubairik, Wadha Almattar, Amani Alqarni

Abstract:

Social media applications are increasingly used in our everyday communications. These applications utilise end-to-end encryption mechanisms, which make them suitable tools for criminals to exchange messages. These messages are preserved in volatile memory until the device is restarted. Therefore, volatile memory forensics has become an important branch of digital forensics. In this study, the WormHex tool was developed to inspect memory dump files from Windows- and Mac-based workstations. The tool supports digital investigators by enabling them to extract valuable data written in Arabic and English through the web-based WhatsApp and Twitter applications. The results confirm that social media applications write their data into memory regardless of the operating system running the application, with no major differences between Windows and Mac.
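
As an illustration of the regex-driven carving described above, the following sketch scans a raw memory dump for message-like JSON fragments. The pattern, field name and file name are illustrative assumptions, not taken from WormHex itself.

```python
import re

# Hypothetical pattern: web clients often keep chat fragments in memory as JSON-like
# text, so we look for a "message" field followed by a printable payload. The byte
# class also admits UTF-8 encoded Arabic text.
PATTERN = re.compile(rb'"message"\s*:\s*"([^"\x00-\x1F]{1,200})"')

def carve_messages(dump_path):
    """Scan a raw memory dump in chunks and return candidate message strings."""
    hits = []
    chunk_size = 1 << 20   # 1 MiB chunks keep memory use bounded
    overlap = 4096         # re-scan a short tail so matches on chunk borders are not lost
    with open(dump_path, "rb") as dump:
        tail = b""
        while True:
            chunk = dump.read(chunk_size)
            if not chunk:
                break
            data = tail + chunk
            hits.extend(m.group(1).decode("utf-8", errors="replace")
                        for m in PATTERN.finditer(data))
            tail = data[-overlap:]  # note: matches inside the overlap may be reported twice
    return hits

if __name__ == "__main__":
    for msg in carve_messages("memdump.raw"):  # file name is illustrative
        print(msg)
```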

Keywords: Volatile memory, REGEX, digital forensics, memory acquisition

PDF Downloads: 841
8384 Finding Pareto Optimal Front for the Multi-Mode Time, Cost Quality Trade-off in Project Scheduling

Authors: H. Iranmanesh, M. R. Skandari, M. Allahverdiloo

Abstract:

Project managers are ultimately responsible for the overall characteristics of a project, i.e. they should deliver the project on time with minimum cost and maximum quality. It is vital for any manager to decide on a trade-off between these conflicting objectives, and they would benefit from any scientific decision-support tool. Our work tries to determine a set of optimal solutions (rather than a single optimal solution) from which the project manager can select the preferred choice for running the project. In this paper, the problem in project scheduling notated as (1,T|cpm,disc,mu|curve:quality,time,cost) is studied. The problem is multi-objective and the purpose is finding the Pareto optimal front of time, cost and quality of a project (curve:quality,time,cost), whose activities belong to a start-to-finish activity relationship network (cpm) and can be done in different possible modes (mu) which are non-continuous or discrete (disc), each mode having a different cost, time and quality. The project is constrained by a non-renewable resource, i.e. money (1,T). Because the problem is NP-hard, a meta-heuristic is developed to solve it, based on a version of the genetic algorithm specially adapted to multi-objective problems, namely FastPGA. A sample project with 30 activities is generated and then solved by the proposed method.
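
The Pareto front the authors search for is defined by simple dominance tests. The sketch below is not the FastPGA implementation; it only filters a set of candidate schedules scored by (time, cost, quality), with quality negated so that all three objectives are minimised.

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one.
    Objectives are (time, cost, -quality), all to be minimised."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of (time, cost, -quality) tuples."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other is not s)]

# Illustrative candidate schedules: (duration in days, cost in $k, -quality score)
candidates = [(120, 450, -0.80), (110, 500, -0.78), (130, 420, -0.85), (125, 470, -0.70)]
print(pareto_front(candidates))   # the last schedule is dominated by the first
```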

Keywords: FastPGA, Multi-Execution Activity Mode, Pareto Optimality, Project Scheduling, Time-Cost-Quality Trade-Off.

PDF Downloads: 1639
8382 Performance Evaluation of One and Two Dimensional Prime Codes for Optical Code Division Multiple Access Systems

Authors: Gurjit Kaur, Neena Gupta

Abstract:

In this paper, we have analyzed and compared the performance of various coding schemes. The basic 1D prime sequence codes are unique in only one dimension, i.e. time slots, whereas 2D coding techniques are unique not only in their time slots but also in their wavelengths. In this research, we have evaluated and compared the performance of 1D and 2D coding techniques constructed using the prime sequence coding pattern for an Optical Code Division Multiple Access (OCDMA) system on a single platform. Analysis shows that 2D prime codes support a smaller number of active users than 1D codes, but they have a larger code family and are the most secure codes compared to the others. The performance of all these codes is analyzed on the basis of the number of active users supported at a Bit Error Rate (BER) of 10⁻⁹.
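
For reference, a 1D prime sequence code over a prime p maps each of the p sequences s_i(j) = i·j mod p onto a binary word of length p² with one pulse per block. This is the standard textbook construction, not code from the paper's simulation platform.

```python
def prime_codes(p):
    """Generate the p one-dimensional prime-sequence code words of length p*p.
    Code i places a single '1' in block j at offset (i*j) mod p."""
    codes = []
    for i in range(p):
        word = [0] * (p * p)
        for j in range(p):
            word[j * p + (i * j) % p] = 1
        codes.append(word)
    return codes

if __name__ == "__main__":
    for i, w in enumerate(prime_codes(5)):   # p = 5 gives 5 codes of length 25 and weight 5
        print(i, "".join(map(str, w)))
```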

Keywords: CDMA, OCDMA, BER, OOC, PC, EPC, MPC, 2-D PC/PC, λc, λa.

PDF Downloads: 1079
8381 Effect of Temperature on Specific Retention Volumes of Selected Volatile Organic Compounds Using the Gas-Liquid Chromatographic Technique Revisited

Authors: Edison Muzenda, Ayo S. Afolabi

Abstract:

This paper is a continuation of our interest in the influence of temperature on specific retention volumes and the resulting infinite dilution activity coefficients. This has a direct effect on the design of absorption and stripping columns for the abatement of volatile organic compounds. The interaction of 13 volatile organic compounds (VOCs) with polydimethylsiloxane (PDMS) at varying temperatures was studied by gas-liquid chromatography (GLC). Infinite dilution activity coefficients and specific retention volumes obtained in this study were found to be in agreement with those obtained from static headspace and group contribution methods by the authors, as well as with literature values for similar systems. Temperature variation also allows for transport calculations for different seasons. The results of this work confirm that PDMS is well suited for the scrubbing of VOCs from waste gas streams. Plots of specific retention volumes against temperature gave linear van't Hoff plots.
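
The linear van't Hoff behaviour reported here amounts to regressing ln V_g on 1/T. A short sketch with purely illustrative (not measured) numbers, assuming the usual sign convention for the sorption enthalpy:

```python
import numpy as np

# Illustrative data only: specific retention volumes Vg (mL/g) at several temperatures (K)
T = np.array([303.15, 313.15, 323.15, 333.15])
Vg = np.array([850.0, 610.0, 450.0, 340.0])

# van't Hoff form: ln Vg = a * (1/T) + b, so fit ln Vg against 1/T
slope, intercept = np.polyfit(1.0 / T, np.log(Vg), 1)
R = 8.314  # J/(mol K)
print(f"slope = {slope:.1f} K, intercept = {intercept:.3f}")
print(f"apparent sorption enthalpy = {-slope * R / 1000:.1f} kJ/mol")
```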

Keywords: Specific retention volume, Waste gas streams, specific retention, infinite dilution, abatement, transport.

PDF Downloads: 1914
8380 A CTL Specification of Serializability for Transactions Accessing Uniform Data

Authors: Rafat Alshorman, Walter Hussak

Abstract:

Existing work in temporal logic on representing the execution of infinitely many transactions uses linear-time temporal logic (LTL) and models only two-step transactions. In this paper, we use the comparatively efficient branching-time computational tree logic CTL and extend the transaction model to a class of multi-step transactions, by introducing distinguished propositional variables to represent the read and write steps of n multi-step transactions accessing m data items infinitely many times. We prove that the well-known correspondence between acyclicity of conflict graphs and serializability for finite schedules extends to infinite schedules. Furthermore, in the case of transactions accessing the same set of data items in (possibly) different orders, serializability corresponds to the absence of cycles of length two. This result is used to give an efficient encoding of the serializability condition into CTL.
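
The correspondence the authors rely on, that a finite schedule is conflict-serializable exactly when its conflict graph is acyclic, is easy to check directly. A minimal sketch, independent of the CTL encoding in the paper:

```python
def is_conflict_serializable(edges, transactions):
    """edges: set of (Ti, Tj) conflict edges; returns True iff the conflict graph is acyclic."""
    graph = {t: [] for t in transactions}
    for a, b in edges:
        graph[a].append(b)

    WHITE, GREY, BLACK = 0, 1, 2
    colour = {t: WHITE for t in transactions}

    def has_cycle(node):
        colour[node] = GREY
        for nxt in graph[node]:
            if colour[nxt] == GREY:                     # back edge found: cycle
                return True
            if colour[nxt] == WHITE and has_cycle(nxt):
                return True
        colour[node] = BLACK
        return False

    return not any(colour[t] == WHITE and has_cycle(t) for t in transactions)

# T1 wrote x before T2 read x, and T2 wrote y before T1 read y: a cycle of length two
print(is_conflict_serializable({("T1", "T2"), ("T2", "T1")}, ["T1", "T2"]))   # False
```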

Keywords: computational tree logic, serializability, multi-step transactions.

PDF Downloads: 1128
8379 Decontamination of Cr(VI) Polluted Wastewater by use of Low Cost Industrial Wastes

Authors: Marius Gheju, Rodica Pode

Abstract:

The reduction of hexavalent chromium by scrap iron was investigated in a continuous system, using long-term column experiments, for aqueous Cr(VI) solutions having low buffering capacities, over the Cr(VI) concentration range of 5 – 40 mg/L. The results showed that the initial Cr(VI) concentration significantly affects the reduction capacity of scrap iron. The maximum reduction capacity of scrap iron was observed at the beginning of the column experiments; the lower the Cr(VI) concentration, the greater the experiment duration with maximum scrap iron reduction capacity. However, due to passivation of the active surface, the scrap iron reduction capacity continuously decreased over time, especially after Cr(VI) breakthrough. The experimental results showed that the highest reduction capacity recorded until Cr(VI) breakthrough was 22.8 mg Cr(VI)/g scrap iron, at CI = 5 mg/L, and decreased with increasing Cr(VI) concentration. In order to assure total reduction of greater Cr(VI) concentrations for a longer period of time, either the mass of the scrap iron filling or the hydraulic retention time should be increased.

Keywords: hexavalent chromium, heavy metals, scrap iron, reduction capacity, wastewater treatment.

PDF Downloads: 1796
8378 An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)

Authors: Rosziati Ibrahim, Siow Yen Yen

Abstract:

The system development life cycle (SDLC) is a process used during the development of any system. SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, the context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important in smoothing the development process of a system. However, a manual consistency check from the context diagram to the lower-level data flow diagrams using a checklist is a time-consuming process. At the same time, the limited human ability to validate the errors is one of the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.
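
One of the DFD rules such a checker enforces is balancing: every data flow entering or leaving a parent process must reappear in its child diagram. A minimal sketch of that single rule, using hypothetical data structures rather than the paper's tool:

```python
def check_balancing(parent_flows, child_flows):
    """parent_flows / child_flows: dicts with 'in' and 'out' sets of data-flow names.
    Returns a list of inconsistency messages; an empty list means the diagrams balance."""
    problems = []
    for direction in ("in", "out"):
        missing = parent_flows[direction] - child_flows[direction]
        extra = child_flows[direction] - parent_flows[direction]
        if missing:
            problems.append(f"{direction}-flows {sorted(missing)} missing from child diagram")
        if extra:
            problems.append(f"{direction}-flows {sorted(extra)} not declared on parent process")
    return problems

# Context-diagram process vs. its level-1 decomposition (illustrative flow names)
parent = {"in": {"order", "payment"}, "out": {"receipt"}}
child = {"in": {"order"}, "out": {"receipt", "report"}}
print(check_balancing(parent, child))
```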

Keywords: Data Flow Diagram, Context Diagram, Consistency Check, Syntax and Semantic Rules

PDF Downloads: 3401
8377 On-Line Geometrical Identification of Reconfigurable Machine Tool using Virtual Machining

Authors: Alexandru Epureanu, Virgil Teodor

Abstract:

One of the main research directions in the CAD/CAM machining area is the reduction of machining time. Feedrate scheduling is one of the advanced techniques that allows keeping the uncut chip area constant and, as a consequence, keeping the main cutting force constant. There are two main ways of feedrate optimization. The first consists in cutting force monitoring, which presumes the use of complex equipment for force measurement and, after this, setting the feedrate according to the cutting force variation. The second way is to optimize the feedrate by keeping the material removal rate constant with regard to the cutting conditions. In this paper a new approach is proposed, using an extended database that replaces the system model. The feedrate scheduling is determined based on the identification of the reconfigurable machine tool, and the feed value is determined with regard to the uncut chip section area, the contact length between tool and blank, and the geometrical roughness. The first stage consists in monitoring the blank and the tool to determine the actual profiles. The next stage is the determination of the programmed tool path that allows obtaining the piece target profile. The graphic representation environment models the tool and blank regions and, after this, the tool model is positioned against the blank model according to the programmed tool path. For each of these positions the geometrical roughness value, the uncut chip area and the contact length between tool and blank are calculated. Each of these parameters is compared with the admissible values and, according to the result, the feed value is established. We consider that this approach has the following advantages: in the case of complex cutting processes the prediction of the cutting force is possible; the real cutting profile, which has deviations from the theoretical profile, is considered; limiting the blank-tool contact length is possible; and the programmed tool path can be corrected so that the target profile is obtained. Applying this method, data sets are obtained which allow feedrate scheduling so that the uncut chip area is constant and, as a result, the cutting force is constant, which allows the machine tool to be used more efficiently and the machining time to be reduced.
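
The scheduling idea, comparing the computed uncut chip area, contact length and roughness at each programmed tool position against admissible values and scaling the feed accordingly, can be sketched as below. The limits and the simple proportional rule are assumptions for illustration, not the authors' database-driven implementation.

```python
def schedule_feed(positions, f_nominal, a_max, l_max, rz_max):
    """positions: list of dicts with the geometric quantities obtained from the
    virtual-machining model at each tool position; returns one feed value per position."""
    feeds = []
    for p in positions:
        scale = 1.0
        if p["chip_area"] > a_max:       # keep the uncut chip area (and cutting force) bounded
            scale = min(scale, a_max / p["chip_area"])
        if p["contact_len"] > l_max:     # limit the tool-blank contact length
            scale = min(scale, l_max / p["contact_len"])
        if p["roughness"] > rz_max:      # respect the geometrical roughness target
            scale = min(scale, rz_max / p["roughness"])
        feeds.append(round(f_nominal * scale, 3))
    return feeds

path = [{"chip_area": 0.8, "contact_len": 3.0, "roughness": 4.0},
        {"chip_area": 1.4, "contact_len": 5.5, "roughness": 6.5}]
print(schedule_feed(path, f_nominal=0.25, a_max=1.0, l_max=5.0, rz_max=6.0))  # illustrative units
```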

Keywords: Reconfigurable machine tool, system identification, uncut chip area, cutting conditions scheduling.

PDF Downloads: 1411
8376 What the Future Holds for Social Media Data Analysis

Authors: P. Wlodarczak, J. Soar, M. Ally

Abstract:

The dramatic rise in the use of Social Media (SM) platforms such as Facebook and Twitter provide access to an unprecedented amount of user data. Users may post reviews on products and services they bought, write about their interests, share ideas or give their opinions and views on political issues. There is a growing interest in the analysis of SM data from organisations for detecting new trends, obtaining user opinions on their products and services or finding out about their online reputations. A recent research trend in SM analysis is making predictions based on sentiment analysis of SM. Often indicators of historic SM data are represented as time series and correlated with a variety of real world phenomena like the outcome of elections, the development of financial indicators, box office revenue and disease outbreaks. This paper examines the current state of research in the area of SM mining and predictive analysis and gives an overview of the analysis methods using opinion mining and machine learning techniques.
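
A common form of the predictive analysis surveyed here is correlating a sentiment indicator with a real-world time series at several lead times. A small sketch on synthetic data, purely to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)
days = 200
sentiment = rng.normal(size=days).cumsum()                             # synthetic daily sentiment index
indicator = np.roll(sentiment, 3) + rng.normal(scale=0.5, size=days)   # series that lags sentiment by 3 days

# Which lead time (in days) gives the strongest correlation?
for lag in range(7):
    r = np.corrcoef(sentiment[:days - lag], indicator[lag:])[0, 1]
    print(f"lead {lag} days: r = {r:.2f}")
```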

Keywords: Social Media, text mining, knowledge discovery, predictive analysis, machine learning.

PDF Downloads: 3794
8375 A DEA Model for Performance Evaluation in The Presence of Time Lag Effect

Authors: Yanshuang Zhang, Byungho Jeong

Abstract:

Data Envelopment Analysis (DEA) is a methodology that computes efficiency values for decision making units (DMU) in a given period by comparing the outputs with the inputs. In many cases, there is some time lag between the consumption of inputs and the production of outputs. For a long-term research project, it is hard to avoid the production lead time phenomenon. This time lag effect should be considered in evaluating the performance of organizations. This paper suggests a model to calculate efficiency values for the performance evaluation problem with time lag. In the experimental part, the proposed methods are compared with the CCR model and an existing time lag model using the data set of the 21st Century Frontier R&D Program, which is a long-term national R&D program of Korea.
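
For reference, the baseline CCR model that the time-lag model is compared against can be solved DMU by DMU as a small linear program (input-oriented multiplier form). The data below are made up; this is a generic sketch, not the paper's experiment.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k.
    X: (n_dmu, n_inputs) inputs, Y: (n_dmu, n_outputs) outputs.
    Variables z = [v, u]; maximise u'y_k subject to v'x_k = 1,
    u'y_j - v'x_j <= 0 for all j, and u, v >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([np.zeros(m), -Y[k]])              # linprog minimises, so negate
    A_eq = np.concatenate([X[k], np.zeros(s)]).reshape(1, -1)
    A_ub = np.hstack([-X, Y])                             # rows encode u'y_j - v'x_j <= 0
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

# Illustrative data: 4 DMUs, 2 inputs (staff, budget), 1 output (publications)
X = np.array([[3.0, 5.0], [4.0, 3.0], [5.0, 6.0], [2.0, 4.0]])
Y = np.array([[8.0], [7.0], [9.0], [5.0]])
for k in range(len(X)):
    print(f"DMU {k}: efficiency = {ccr_efficiency(X, Y, k):.3f}")
```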

Keywords: DEA, Efficiency, Time Lag

PDF Downloads: 1846
8374 Forecasting Enrollment Model Based on First-Order Fuzzy Time Series

Authors: Melike Şah, Konstantin Y. Degtiarev

Abstract:

This paper proposes a novel improvement of a forecasting approach based on using time-invariant fuzzy time series. In contrast to traditional forecasting methods, fuzzy time series can also be applied to problems in which the historical data are linguistic values. It is shown that the proposed time-invariant method improves the performance of the forecasting process. Further, the effect of using different numbers of fuzzy sets is tested as well. As in most of the cited papers, the historical enrollment of the University of Alabama is used in this study to illustrate the forecasting process. Subsequently, the performance of the proposed method is compared with existing time-invariant fuzzy time series models on the basis of forecasting accuracy. It reveals a certain performance superiority of the proposed method over the methods described in the literature.
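
A minimal sketch of a first-order, time-invariant fuzzy time series forecast in the spirit of the enrollment studies (a simplified group-averaging variant, not the exact model proposed in the paper):

```python
import numpy as np

def fts_forecast(series, n_sets=7):
    """Partition the universe of discourse into n_sets equal intervals, fuzzify each value
    to the interval containing it, collect fuzzy logical relationships A_i -> A_j, and
    forecast the next value as the mean midpoint of the consequents of the current group."""
    lo, hi = min(series) - 1, max(series) + 1
    edges = np.linspace(lo, hi, n_sets + 1)
    mids = (edges[:-1] + edges[1:]) / 2
    labels = np.clip(np.searchsorted(edges, series, side="right") - 1, 0, n_sets - 1)

    groups = {}                                     # A_i -> list of observed consequents A_j
    for a, b in zip(labels[:-1], labels[1:]):
        groups.setdefault(a, []).append(b)

    forecasts = [None]                              # no forecast for the first observation
    for a in labels[:-1]:
        forecasts.append(float(np.mean(mids[groups.get(a, [a])])))
    return forecasts

enrollment = [13055, 13563, 13867, 14696, 15460, 15311, 15603, 15861, 16807, 16919]
print(fts_forecast(enrollment))
```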

Keywords: Forecasting, fuzzy time series, linguistic values, student enrollment, time-invariant model.

PDF Downloads: 2183
8373 Finite-time Stability Analysis of Fractional-order Systems with Multi-state Time Delay

Authors: Liqiong Liu, Shouming Zhong

Abstract:

In this paper, the finite-time stabilization of a class of fractional-order systems with multi-state time delay is investigated. First, we define finite-time stability for the fractional-order system. Second, by using the generalized Gronwall approach and inequality methods, we obtain some conditions for finite-time stability of the fractional system with multi-state delay. Finally, a numerical example is given to illustrate the result.
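
For reference, a standard form of the generalized (fractional) Gronwall inequality used in such finite-time stability arguments is stated below; this is quoted from the general literature, not from this paper.

```latex
% Generalized Gronwall inequality (standard form). Assume \beta > 0, a(t) is nonnegative
% and locally integrable on [0, T), and g(t) is nonnegative, nondecreasing and bounded.
% If  u(t) \le a(t) + g(t) \int_0^t (t - s)^{\beta - 1} u(s)\, ds  on [0, T), then
u(t) \;\le\; a(t) + \int_0^t
  \left[ \sum_{n=1}^{\infty} \frac{\bigl(g(t)\,\Gamma(\beta)\bigr)^{n}}{\Gamma(n\beta)}
         (t - s)^{n\beta - 1} a(s) \right] ds .
```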

Keywords: Finite-time stabilization, fractional-order system, Gronwall inequality.

PDF Downloads: 1854
8372 A Visualized Framework for Representing Uncertain and Incomplete Temporal Knowledge

Authors: Yue Wang, Jixin Ma, Brian Knight

Abstract:

This paper presents a visualized, computer-aided tool for non-experts, called Visual Time, for representing and reasoning about incomplete and uncertain temporal information. It is both expressive and versatile, allowing logical conjunctions and disjunctions of both absolute and relative temporal relations, such as "Before", "Meets", "Overlaps", "Starts", "During", and "Finishes". In terms of a visualized framework, Visual Time provides a user-friendly environment for describing scenarios with rich temporal structure in natural language, which can be formatted as structured temporal phrases and modeled in terms of Temporal Relationship Diagrams (TRD). A TRD can be automatically and visually transformed into a corresponding Time Graph, supported by an automatic consistency checker that derives a verdict to confirm whether a given scenario is temporally consistent or inconsistent.
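
The absolute relations listed above can be decided directly from interval endpoints. A minimal sketch using the standard Allen-style definitions, not the Visual Time implementation:

```python
def relation(a, b):
    """Return the basic temporal relation of interval a = (start, end) to interval b."""
    (a1, a2), (b1, b2) = a, b
    if a2 < b1:
        return "Before"
    if a2 == b1:
        return "Meets"
    if a1 == b1 and a2 == b2:
        return "Equals"
    if a1 == b1 and a2 < b2:
        return "Starts"
    if a1 > b1 and a2 < b2:
        return "During"
    if a2 == b2 and a1 > b1:
        return "Finishes"
    if a1 < b1 < a2 < b2:
        return "Overlaps"
    return "inverse"          # b relates to a by one of the relations above

print(relation((1, 3), (3, 6)))   # Meets
print(relation((2, 4), (1, 6)))   # During
```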

Keywords: Time Visualization, Uncertainty, Incompleteness, Consistency Checking.

PDF Downloads: 1461
8371 Visual Inspection of Work Piece with a Complex Shape by Means of Robot Manipulator

Authors: A. Y. Bani Hashim, N. S. A. Ramdan

Abstract:

Inconsistency in manual inspection is real because humans get tired after some time. Recent trends show that automatic inspection is more appealing for mass-production inspections. In such a case, a robot manipulator seems the best candidate to run a dynamic visual inspection. The purpose of this work is to estimate the optimum workspace in which a robot manipulator would perform a visual inspection process on a work piece, with a camera attached to the end effector. The pseudo-code for the planned path is derived from the number of tool transit points, the delay time at the transit points, the process cycle time, and the configuration space that defines the distance between the tool and the work piece. It is observed that an express start and a swift end are acceptable in a robot program because applicable work is usually in existence during these moments. However, during the mid-range of the cycle, there are always practical tasks programmed to be executed. For that reason, it is advisable to program the robot such that speedy alteration of actuator displacement is avoided. A dynamic visual inspection system using a robot manipulator seems practical for a work piece with a complex shape.

Keywords: Robot manipulator, Visual inspection, Work piece, Trajectory planning.

PDF Downloads: 1620
8370 A Failure Analysis Tool for HDD Analysis

Authors: C. Kumjeera, T. Unchim, B. Marungsri, A. Oonsivilai

Abstract:

The study of piezoelectric material in the past was in the T-Domain form; however, no one has studied piezoelectric material in the S-Domain form. This paper presents the piezoelectric material in the transfer function, or S-Domain, model. The S-Domain is a well-known mathematical model used for analyzing the stability of the material and determining the stability limits. Using the S-Domain to test the stability of piezoelectric material provides a new tool for the scientific world to study this material in various forms.

Keywords: Hard disk drive, failure analysis, tool, time

PDF Downloads: 2698
8369 C-LNRD: A Cross-Layered Neighbor Route Discovery for Effective Packet Communication in Wireless Sensor Network

Authors: K. Kalaikumar, E. Baburaj

Abstract:

One of the problems to be addressed in wireless sensor networks is the set of issues related to cross-layer communication. A cross-layer architecture shares information across the layers, ensuring Quality of Service (QoS). With this shared information, the MAC protocol adapts effective functionality maintenance, such as route selection, in a changeable sensor network environment. However, time slot assignment and the neighbour route selection time duration for the cross layer have not been addressed. The time-varying physical layer communication over the cross layer causes a high traffic load in the sensor network. Although the traffic load was reduced using a cross-layer optimization procedure, the computational cost is high. To improve communication efficacy in the sensor network, a self-determined time slot based Cross-Layered Neighbour Route Discovery (C-LNRD) method is presented in this paper. In the presented work, the initial process is to discover the route in the sensor network using Dynamic Source Routing based Medium Access Control (MAC) sub-layers. This process considers MAC layer operation with dynamic route neighbour table discovery. Then, the discovered route path for packet communication employs a Broad Route Distributed Time Slot Assignment method on the Cross-Layered Sensor Network system. Broad Route means time slotting over route paths of varying length. During packet communication in this sensor network, the transmission of packets is adjusted over different times with varying ranges for controlling the traffic rate. Finally, a Rayleigh fading model is developed in C-LNRD to identify the performance of the sensor network communication structure. The main task of Rayleigh fading is to measure the power level of each communication under the MAC sub-layer. The minimized power level helps to easily reduce the computational cost of packet communication in the sensor network. Experiments are conducted on factors such as power factor in packet communication, neighbour route discovery time, and information (i.e., packet) propagation speed.

Keywords: Medium access control, neighbour route discovery, wireless sensor network, Rayleigh fading, distributed time slot assignment

PDF Downloads: 725
8368 An IM-COH Algorithm Neural Network Optimization with Cuckoo Search Algorithm for Time Series Samples

Authors: Wullapa Wongsinlatam

Abstract:

The back propagation algorithm (BP) is a widely used technique in artificial neural networks and has been used as a tool for solving time series problems, with goals such as decreasing training time, avoiding falling into local minima, and optimizing the sensitivity to the initial weights and bias. This paper proposes an improvement of the BP technique which is called the IM-COH algorithm (IM-COH). By combining the IM-COH algorithm with the cuckoo search algorithm (CS), the result is the cuckoo search improved control output hidden layer algorithm (CS-IM-COH). This new algorithm has a better ability in optimizing the sensitivity to the initial weights and bias than the original BP algorithm. In this research, the CS-IM-COH algorithm is compared with the original BP, the IM-COH, and the original BP with CS (CS-BP). Furthermore, the selected benchmarks, four time series samples, are shown in this research for illustration. The research shows that the CS-IM-COH algorithm gives the best forecasting results on the selected samples.

Keywords: Artificial neural networks, back propagation algorithm, time series, local minima problem, metaheuristic optimization.

PDF Downloads: 1034
8367 Florida’s Groundwater and Surface Water System Reliability in Terms of Climate Change and Sea-Level Rise

Authors: Rahman Davtalab, Saba Ghotbi

Abstract:

Florida is one of the most vulnerable states to natural disasters among the 50 states of the USA. The state is exposed to tropical storms, hurricanes, storm surges, landslides, etc. Besides these natural phenomena, global warming, sea-level rise, and other anthropogenic environmental changes create a very complicated and unpredictable system for decision-makers. In this study, we tried to highlight the effects of climate change and sea-level rise on surface water and groundwater systems for three different geographical locations in Florida: the Main Canal of Jacksonville Beach in the northeast of Florida, adjacent to the Atlantic Ocean; Grace Lake in central Florida, far away from the surrounding coastline; and Mc Dill in Florida, adjacent to Tampa Bay and the Gulf of Mexico. An integrated hydrologic and hydraulic model was developed and simulated for all three cases, including surface water, groundwater, or a combination of both. For the Main Canal-Jacksonville Beach case study, the investigation showed that a 76 cm sea-level rise by the 2060 time horizon could increase the flow velocity of the tide cycle at the main canal's outlet and headwater. This case also revealed how sea-level rise could change the tide duration, potentially affecting the coastal ecosystem. As expected, sea-level rise can raise the groundwater level. Therefore, for the Mc Dill case, the effect of groundwater rise on soil storage and the performance of stormwater retention ponds is investigated. The study showed that sea-level rise increased the pond's seasonal high water by up to 40 cm by the 2060 time horizon. The reliability of the retention pond drops from 99% for the current condition to 54% in the future. The results also proved that the retention pond could not retain and infiltrate the designed treatment volume within 72 hours, which is a significant indication of increasing pollutants in the future. The Grace Lake case study investigates the effects of climate change on groundwater recharge. Using dynamically downscaled climate data, this study showed that groundwater recharge can decline by up to 24% by the mid-21st century.

Keywords: groundwater, surface water, Florida, retention pond, tide, sea-level rise

PDF Downloads: 513
8366 System for Monitoring Marine Turtles Using Unstructured Supplementary Service Data

Authors: Luís Pina

Abstract:

The conservation of marine biodiversity keeps ecosystems in balance and ensures the sustainable use of resources. In this context, technological resources have been used for monitoring marine species to allow biologists to obtain data in real time. There are different mobile applications developed for data collection for monitoring purposes, but these systems are designed to be utilized only on third-generation (3G) phones or smartphones with Internet access, and in rural parts of developing countries Internet services and smartphones are scarce. Thus, the objective of this work is to develop a system to monitor marine turtles using Unstructured Supplementary Service Data (USSD), which users can access through basic mobile phones. The system aims to improve the data collection mechanism and enhance the effectiveness of current systems in monitoring sea turtles using any type of mobile device without Internet access. The system will be able to report information related to the biological activities of marine turtles. Also, it will be used as a platform to assist marine conservation entities in receiving reports of illegal sales of sea turtles. The system can also be utilized as an educational tool for communities, providing knowledge and allowing the inclusion of communities in the process of monitoring marine turtles. Therefore, this work may contribute information to decision-making and the implementation of contingency plans for marine conservation programs.

Keywords: GSM, marine biology, marine turtles, USSD.

PDF Downloads: 875
8365 A Mark-Up Approach to Add Value

Authors: Ivaylo I. Atanasov, Evelina N. Pencheva

Abstract:

This paper presents a mark-up approach to service creation in Next Generation Networks. The approach allows deriving added value from network functions exposed by Parlay/OSA (Open Service Access) interfaces. With OSA interfaces, service logic scripts might be executed on both call-related and call-unrelated events. To illustrate the approach, XML-based language constructions for data and method definitions, flow control, time measuring and supervision, and database access are given, and an example of an OSA application is considered.

Keywords: Service creation, mark-up approach.

PDF Downloads: 1642
8364 An Experimental Study on the Effect of Operating Parameters during the Micro-Electro-Discharge Machining of Ni Based Alloy

Authors: Asma Perveen, M. P. Jahan

Abstract:

Ni alloys have managed to cover a wide range of applications such as the automotive, oil and gas, and aerospace industries. However, these alloys impose challenges when conventional machining technologies are used. On the other hand, micro-electro-discharge machining (micro-EDM) is a non-conventional machining method that uses controlled spark energy to remove material irrespective of the material's hardness. There has always been huge interest from industry in developing an optimum methodology and parameters in order to enhance the productivity of micro-EDM in terms of reducing machining time and tool wear for different alloys. Therefore, the aim of this study is to investigate the effects of the micro-EDM process parameters, in order to find their optimal values. The input process parameters include voltage, capacitance, and electrode rotational speed, whereas the output parameters considered are machining time, entrance diameter of the hole, overcut, tool wear, and crater size. The surface morphology and element characterization are also investigated with the use of SEM and EDX analysis. The experimental results indicate a reduction of machining time with the increment of discharge energy. Discharge energy also contributes to the enlargement of the entrance diameter as well as the overcut. In addition, tool wear shows a reduction with the increase of discharge energy. Moreover, crater size is found to increase along with the increment of discharge energy.
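
The discharge energy driving the observed trends is, for an RC-type pulse generator, fixed by the charging voltage and capacitance (E = ½CV²). A tiny worked sketch with nominal values, not the settings used in the paper:

```python
def discharge_energy(voltage_v, capacitance_f):
    """Energy per spark for an RC-type micro-EDM generator: E = 0.5 * C * V^2."""
    return 0.5 * capacitance_f * voltage_v ** 2

# Nominal micro-EDM settings (illustrative only): 80-120 V and 0.1-10 nF
for v in (80, 100, 120):
    for c_nf in (0.1, 1.0, 10.0):
        e_uj = discharge_energy(v, c_nf * 1e-9) * 1e6
        print(f"V = {v:>3} V, C = {c_nf:>4} nF -> E = {e_uj:.3f} uJ")
```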

Keywords: Micro EDM, Ni alloy, discharge energy, micro-holes.

PDF Downloads: 1290
8363 Surgery Scheduling Using Simulation with Arena

Authors: J. A. López, C.I. López, J.E. Olguín, C. Camargo, J. M. López

Abstract:

Institutions seek to improve their performance and quality of service so that their patients are satisfied. This research project aims to conduct a time study in the area of gynecological surgery, to determine the current level of capacity and to optimize the scheduling time in order to respond adequately to demand. The system is analyzed using waiting lines and simulated in ARENA to evaluate proposals for improving and optimizing the scheduling time of each of the surgeries.
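
For a single operating room modelled as an M/M/1 waiting line, the queueing quantities such a study targets follow from the standard formulas. The arrival and service rates below are illustrative assumptions, not the hospital's data:

```python
def mm1_metrics(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics; rates in surgeries per day, requires arrival < service."""
    lam, mu = arrival_rate, service_rate
    assert lam < mu, "queue is unstable"
    rho = lam / mu                 # utilisation of the operating room
    lq = rho ** 2 / (1 - rho)      # average number of patients waiting
    wq = lq / lam                  # average waiting time in days (Little's law)
    return {"utilisation": rho, "avg_queue_length": lq, "avg_wait_days": wq}

print(mm1_metrics(arrival_rate=3.0, service_rate=4.0))   # e.g. demand of 3/day against capacity of 4/day
```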

Keywords: Time study, waiting lines, reducing time, simulation.

PDF Downloads: 2708
8362 How to Use E-Learning to Increase Job Satisfaction in Large Commercial Bank in Bangkok

Authors: Teerada Apibunyopas, Nithinant Thammakoranonta

Abstract:

Many organizations bring e-Learning into use as a tool in their training and human development departments. It is becoming more popular because it provides easy access to knowledge at any time and also provides rich content, which can develop the employees' skills efficiently. This study focuses on the factors that affect using e-Learning efficiently, so that it increases job satisfaction. Questionnaires were sent to employees of large commercial banks located in Bangkok which use e-Learning; the results from multiple linear regression analysis showed that employees' characteristics, characteristics of e-Learning, and learning and growth have an influence on job satisfaction.

Keywords: e-Learning, Job Satisfaction, Learning and growth.

PDF Downloads: 2344
8361 Numerical Analysis and Experimental Validation of a Downhole Stress/Strain Measurement Tool

Authors: Abhay Bodake, Ping Sui, Hafeez Syed, Ratish Kadam

Abstract:

Real-time measurement of applied forces, like tension, compression, torsion, and bending moment, identifies the transferred energies being applied to the bottomhole assembly (BHA). These forces are highly detrimental to measurement/logging-while-drilling tools and downhole equipment. Real-time measurement of the dynamic downhole behavior, including weight, torque, bending on bit, and vibration, establishes a real-time feedback loop between the downhole drilling system and the drilling team at the surface. This paper describes the numerical analysis of the strain data acquired by the measurement tool at different locations on the strain pockets. The strain values obtained by FEA for various loading conditions (tension, compression, torque, and bending moment) are compared against experimental results obtained from an identical experimental setup. The numerical analysis results agree with the experimental data to within 8% and, therefore, substantiate and validate the FEA model. This FEA model can be used to analyze the combined loading conditions that reflect the actual drilling environment.

Keywords: FEA, M/LWD, Oil & Gas, Strain Measurement.

PDF Downloads: 2538
8360 Low Cost Real-Time Communication Braille Hand-Glove for Visually Impaired Using Slot Sensors and Vibration Motors

Authors: Mukul Bandodkar, Virat Chourasia

Abstract:

Visually impaired people find it extremely difficult to acquire basic and vital information necessary for their living. Therefore, they are at a very high risk of being socially excluded as a result of poor access to information. In recent years, several attempts have been made at improving the communication methods for visually impaired people, involving tactile sensation such as finger Braille, manual alphabets and the print-on-palm method, as well as several electronic devices. However, some problems arise in such methods, such as lack of privacy and lack of compatibility with computer environments. This paper describes a low-cost Braille hand glove for blind people using slot sensors and vibration motors, with the help of which they can read and write emails and text messages and read e-books. This glove allows the person to type characters based on different Braille combinations using six slot sensors. Vibration at six different positions of the glove, matching the Braille code, allows them to read characters.
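
The read/write idea reduces to the six-dot Braille cell: the six slot sensors encode a typed cell, and the same six bits drive the vibration motors for reading. A minimal sketch of that mapping for a few letters (the dot patterns are standard Braille; the glove hardware interface itself is not modelled):

```python
# Six-dot Braille cells for a few letters (dot numbers 1-6: dots 1-3 form the left
# column top to bottom, dots 4-6 the right column).
BRAILLE = {"a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
           "h": {1, 2, 5}, "i": {2, 4}}

def to_motor_pattern(char):
    """Return the 6-element on/off pattern that would drive the vibration motors."""
    dots = BRAILLE[char.lower()]
    return [1 if d in dots else 0 for d in range(1, 7)]

def from_slot_sensors(active_slots):
    """Decode the set of triggered slot sensors (numbered 1-6) back to a character."""
    for ch, dots in BRAILLE.items():
        if dots == set(active_slots):
            return ch
    return "?"

print(to_motor_pattern("d"))         # [1, 0, 0, 1, 1, 0]
print(from_slot_sensors({1, 2, 5}))  # h
```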

Keywords: Braille, Braille Hand-Glove, Slot sensors, Vibration motors.

PDF Downloads: 4135
8359 An Investigation into Kanji Character Discrimination Process from EEG Signals

Authors: Hiroshi Abe, Minoru Nakayama

Abstract:

The frontal area of the brain is known to be involved in behavioral judgement. Because a Kanji character can be discriminated visually and linguistically from other characters, we hypothesized that in Kanji character discrimination, frontal event-related potential (ERP) waveforms reflect two discrimination processes in separate time periods: one based on visual analysis and the other based on lexical access. To examine this hypothesis, we recorded ERPs while subjects performed a Kanji lexical decision task. In this task, either a known Kanji character, an unknown Kanji character or a symbol was presented, and the subject had to report whether the presented character was a Kanji character known to them or not. The same response was required for unknown Kanji trials and symbol trials. As a preprocessing step, we examined the performance of a method using independent component analysis for artifact rejection, found it effective, and therefore used it. In the ERP results, there were two time periods in which the frontal ERP waveforms were significantly different between the unknown Kanji trials and the symbol trials: around 170 ms and around 300 ms after stimulus onset. This result supported our hypothesis. In addition, the result suggests that Kanji character lexical access may be fully completed by around 260 ms after stimulus onset.
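
The ICA-based artifact rejection found effective here can be approximated with a generic pipeline: decompose the multichannel EEG, drop components dominated by blink-like spikes, and reconstruct. The sketch below uses sklearn's FastICA on synthetic data; the paper does not specify this particular toolchain, and the kurtosis threshold is an assumption.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_samples, n_channels = 2000, 8
t = np.arange(n_samples) / 500.0                      # 500 Hz sampling, 4 s of data

neural = np.sin(2 * np.pi * 10 * t)                   # synthetic 10 Hz "neural" source
blink = (rng.random(n_samples) < 0.005) * 50.0        # sparse, large eye-blink-like source
mixing = rng.normal(size=(n_channels, 2))
eeg = np.column_stack([neural, blink]) @ mixing.T + rng.normal(scale=0.1, size=(n_samples, n_channels))

ica = FastICA(n_components=n_channels, random_state=0)
sources = ica.fit_transform(eeg)                      # shape (samples, components)

# Reject components with extreme kurtosis (spiky, blink-like activity), then reconstruct.
kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
keep = kurt < 10
cleaned = sources[:, keep] @ ica.mixing_[:, keep].T + ica.mean_

print("components kept:", int(keep.sum()), "of", n_channels)
```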

Keywords: Character discrimination, Event-related Potential, Independent Component Analysis, Kanji, Lexical access.

PDF Downloads: 1747
8358 A Study of Lean Principles Implementation in the Libyan Healthcare and Industry Sectors

Authors: Nasser M. Amaitik, Ngwan F. Elsagzli

Abstract:

The lean technique is very important in the service and industrial fields. It is defined as an effective tool to eliminate waste. In lean, waste is defined as anything which does not add value to the end product. Some wastes can be avoided, but others are unavoidable for many reasons.

The present study aims to apply the principles of lean in two different sectors, healthcare and industry. Two case studies have been selected for the experimental work. The first case was Al-Jalaa Hospital, while the second case study was the Technical Company of Aluminum Sections in Benghazi, Libya. In both case studies the Value Stream Map (VSM) of the current state has been constructed. The proposed plans have been implemented by merging or eliminating procedures or processes.

The results obtained from both case studies showed improvement in capacity, idle time and utilized time.

Keywords: Healthcare service delivery, Idle time, Lean principles, Utilized time, Value stream mapping, Wastes.

PDF Downloads: 2274
8357 Finite Time Symplectic Synchronization between Two Different Chaotic Systems

Authors: Chunming Xu

Abstract:

In this paper, the finite-time symplectic synchronization between two different chaotic systems is investigated. Based on the finite-time stability theory, a simple adaptive feedback scheme is proposed to realize finite-time symplectic synchronization for the Lorenz and Lü systems. Numerical examples are provided to show the effectiveness of the proposed method.

Keywords: Chaotic systems, symplectic synchronization, finite-time synchronization, adaptive controller.

PDF Downloads: 911
8356 Discovery of Time Series Event Patterns based on Time Constraints from Textual Data

Authors: Shigeaki Sakurai, Ken Ueno, Ryohei Orihara

Abstract:

This paper proposes a method that discovers time series event patterns from textual data with time information. The patterns are composed of sequences of events and each event is extracted from the textual data, where an event is characteristic content included in the textual data such as a company name, an action, and an impression of a customer. The method introduces 7 types of time constraints based on the analysis of the textual data. The method also evaluates these constraints when the frequency of a time series event pattern is calculated. We can flexibly define the time constraints for interesting combinations of events and can discover valid time series event patterns which satisfy these conditions. The paper applies the method to daily business reports collected by a sales force automation system and verifies its effectiveness through numerical experiments.
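
The core counting step, checking whether an event sequence supports a pattern under a time constraint, can be sketched as follows. A single "gap no larger than delta" constraint stands in here for the seven constraint types introduced in the paper:

```python
from datetime import datetime, timedelta

def supports(events, pattern, max_gap):
    """events: list of (timestamp, event) sorted by time; pattern: tuple of events.
    Returns True if the pattern occurs in order with every consecutive gap <= max_gap."""
    idx, last_time = 0, None
    for ts, ev in events:
        if ev == pattern[idx] and (last_time is None or ts - last_time <= max_gap):
            last_time = ts
            idx += 1
            if idx == len(pattern):
                return True
        # a fuller implementation would backtrack to consider alternative occurrences
    return False

report = [(datetime(2024, 5, 1), "visit:CompanyA"),
          (datetime(2024, 5, 3), "complaint"),
          (datetime(2024, 5, 20), "order")]
print(supports(report, ("visit:CompanyA", "complaint"), timedelta(days=7)))  # True
print(supports(report, ("complaint", "order"), timedelta(days=7)))           # False: the gap is 17 days
```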

Keywords: Text mining, sequential mining, time constraints, daily business reports.

PDF Downloads: 1441