Search results for: Software testability
1473 Design and Application of NFC-Based Identity and Access Management in Cloud Services
Authors: Shin-Jer Yang, Kai-Tai Yang
Abstract:
In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag. This type of verification does not support mobile device login or user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) addressing this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time in identification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM App (application software) and back-end system to be developed and deployed on mobile devices will support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
Keywords: Cloud service, multi-tenancy, NFC, IAM, mobile device.
1472 Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology
Authors: P. Kowalska, P. Gabka, K. Kamieniarz, M. Kamieniarz, W. Stryla, P. Guzik, T. Krauze
Abstract:
We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We have carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and have proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of differences among the corresponding mean values of task completion time. We have also found a strong correlation between task completion time and the age of the subjects, and we have performed test-retest reliability checks in a sample of 84 children, giving high values of the Pearson coefficients for the dominant and non-dominant hand in the range 0.74
Keywords: Biomedical database processing, Computer software, Hand dexterity, Heart rate and blood pressure variability.
1471 Grading Fourteen Zones of Isfahan in Terms of the Impact of Globalization on the Urban Fabric of the City, Using the TOPSIS Model
Authors: A. Zahedi Yeganeh, A. Khademolhosseini, R. Mokhtari Malekabadi
Abstract:
Undoubtedly, one of the most far-reaching and controversial topics considered in the past few decades has been globalization. Globalization lies at the essence of modern culture. It is a complex and rapidly expanding network of links and mutual interdependence that is an aspect of modern life, though some argue that this link has existed since the beginning of human history. If we consider globalization as a dynamic social process in which the geographical constraints governing political, economic, social and cultural relationships have been undermined, it might not be possible to simply describe its impact on the urban fabric. But since this phenomenon involves increased communication among societies (while preserving the main cultural-regional characteristics) and an increased possibility of influencing other societies, the need for more studies is felt. The main objective of this study is to grade the zones based on selected globalization factors affecting the urban fabric, applying the TOPSIS model. The research method is descriptive-analytical and survey-based. For data analysis, the TOPSIS model and SPSS software were used, and the results from GIS software for the fourteen zones are shown on the map. The results show that the process of being influenced by globalization was not similar across the urban fabric of the fourteen zones of Isfahan, and there have been large differences in this respect between city zones; the most affected areas are zones 5, 6 and 9 of the municipality, and the least impact has been on zones 2, 3 and 4.
Keywords: Grading, Globalization, Urban fabric, 14 zones of Isfahan, TOPSIS model.
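As an illustration of the TOPSIS ranking step referred to in this abstract, a minimal Python sketch follows. The zone scores, criteria, and weights are hypothetical placeholders, not data from the study.

```python
import numpy as np

def topsis(decision_matrix, weights, benefit):
    """Rank alternatives (e.g. city zones) with the TOPSIS method.

    decision_matrix : (n_alternatives, n_criteria) raw scores
    weights         : criterion weights, summing to 1
    benefit         : True for benefit criteria, False for cost criteria
    """
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)

    # 1. Vector-normalise each criterion column, then apply the weights.
    V = w * X / np.linalg.norm(X, axis=0)

    # 2. Ideal and anti-ideal solutions per criterion.
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

    # 3. Euclidean distances to both reference points.
    d_plus  = np.linalg.norm(V - ideal, axis=1)
    d_minus = np.linalg.norm(V - anti,  axis=1)

    # 4. Closeness coefficient in [0, 1]; higher = more affected zone here.
    return d_minus / (d_plus + d_minus)

# Hypothetical scores for three zones on three globalization indicators.
scores = [[7, 120, 3.2],
          [4,  80, 2.1],
          [9, 150, 4.0]]
closeness = topsis(scores, weights=[0.5, 0.3, 0.2], benefit=[True, True, True])
print(np.argsort(closeness)[::-1])  # zone indices, most affected first
```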
1470 Comparison of different Channel Modeling Techniques used in the BPLC Systems
Authors: Justinian Anatory, Nelson Theethayi
Abstract:
The paper compares different channel models used for modeling Broadband Power-Line Communication (BPLC) systems. The models compared are Zimmermann and Dostert, Philipps, Anatory et al., and the Anatory et al. generalized Transmission Line (TL) model. The validity of each model was compared in the time domain with the ATP-EMTP software, which uses a transmission line approach. It is found that for a power-line network with a minimum number of branches, all the models give similar signal/pulse time responses compared with the ATP-EMTP software; however, the Zimmermann and Dostert model indicates the same amplitude but a different time delay. It is observed that when the number of branches is increased, only the generalized TL theory approach gives results comparable with the ATP-EMTP results. The Multi-Carrier Spread Spectrum (MC-SS) system was also applied to check the implication of such behavior on the modulation schemes. It is observed that using Philipps on the underground cable can predict a performance up to 25 dB better than the other channel models, which can misread the actual performance of the system. Also, modified Zimmermann and Dostert under multipath can predict a performance about 5 dB better than that predicted by generalized TL theory. It is therefore suggested that, for realistic BPLC system design and analysis, the model based on generalized TL theory be used.
Keywords: Broadband power line channel models, load impedance, branched network.
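For readers unfamiliar with the multipath channel form usually attributed to Zimmermann and Dostert, a rough Python sketch is given below. The attenuation parameters, path gains, and path lengths are illustrative assumptions only, not values taken from the compared models.

```python
import numpy as np

def multipath_plc_channel(f, paths, a0=0.0, a1=7.8e-10, k=1.0, v_p=1.5e8):
    """Frequency response H(f) of a Zimmermann-Dostert style multipath model.

    paths   : list of (g_i, d_i) pairs -> weighting factor and path length [m]
    a0,a1,k : attenuation parameters; v_p : propagation speed [m/s]
    (all numeric values here are illustrative assumptions, not measured data)
    """
    f = np.asarray(f, dtype=float)
    H = np.zeros_like(f, dtype=complex)
    for g, d in paths:
        attenuation = np.exp(-(a0 + a1 * f**k) * d)    # cable losses
        delay = np.exp(-1j * 2 * np.pi * f * d / v_p)  # propagation delay
        H += g * attenuation * delay
    return H

freqs = np.linspace(1e6, 30e6, 500)                     # 1-30 MHz band
paths = [(0.64, 200.0), (0.38, 222.4), (-0.15, 244.8)]  # hypothetical branches
H = multipath_plc_channel(freqs, paths)
print(20 * np.log10(np.abs(H[::100])))                  # magnitude in dB at a few points
```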
1469 Profile Controlled Gold Nanostructures Fabricated by Nanosphere Lithography for Localized Surface Plasmon Resonance
Authors: Xiaodong Zhou, Nan Zhang
Abstract:
Localized surface plasmon resonance (LSPR) is the coherent oscillation of conduction electrons confined in noble metallic nanoparticles excited by electromagnetic radiation, and nanosphere lithography (NSL) is one of the cost-effective methods to fabricate metal nanostructures for LSPR. NSL can be categorized into two major groups: dispersed NSL and closely packed NSL. In recent years, gold nanocrescents and gold nanoholes with vertical sidewalls fabricated by dispersed NSL, and silver nanotriangles and gold nanocaps on silica nanospheres fabricated by closely packed NSL, have been reported for LSPR biosensing. This paper introduces several novel gold nanostructures fabricated by NSL for LSPR applications, including 3D nanostructures obtained by evaporating gold obliquely onto dispersed nanospheres, nanoholes with slanted sidewalls, and patchy nanoparticles on closely packed nanospheres, all of which offer satisfactory sensitivity for LSPR sensing. Since the LSPR spectrum is very sensitive to the shape of the metal nanostructures, formulas are derived and software is developed for calculating the profiles of the metal nanostructures obtainable by NSL, for different nanosphere masks under different fabrication conditions. The simulated profiles coincide well with the profiles of the fabricated gold nanostructures observed under the scanning electron microscope (SEM) and atomic force microscope (AFM), which proves that the software is a useful tool for the process design of different LSPR nanostructures.
Keywords: Nanosphere lithography, localized surface plasmon resonance, biosensor, simulation.
1468 Basic Research for Electroretinogram Moving the Center of the Multifocal Hexagonal Stimulus Array
Authors: Naoto Suzuki
Abstract:
Many ophthalmologists can examine declines in visual sensitivity at arbitrary points on the retina using a precise perimetry device with a fundus camera function. However, the retinal layer causing the decline in visual sensitivity cannot be identified by this method. We studied an electroretinogram (ERG) function that can move the center of the multifocal hexagonal stimulus array in order to investigate cryptogenic diseases, such as macular dystrophy, acute zonal occult outer retinopathy, and multiple evanescent white dot syndrome. An electroretinographic optical system, specifically a perimetric optical system, was added to an experimental device carrying the same optical system as a fundus camera. We also added an infrared camera, a cold mirror, a halogen lamp, and a monitor. The software was generated to show the multifocal hexagonal stimulus array on the monitor using C++Builder XE8 and to move the center of the array up and down as well as back and forth. We used a multifunction I/O device and its design platform LabVIEW for data retrieval. The plate electrodes were used to measure electrodermal activities around the eyes. We used a multifocal hexagonal stimulus array with 37 elements in the software. The center of the multifocal hexagonal stimulus array could be adjusted to the same position as the examination target of the precise perimetry. We successfully added the moving ERG function to the experimental ophthalmologic device.
Keywords: Moving ERG, precise perimetry, retinal layers, visual sensitivity.
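A minimal sketch of how a 37-element hexagonal stimulus array with a movable centre can be generated (a radius-3 axial hex grid gives 1 + 6 + 12 + 18 = 37 elements). The pixel spacing and screen coordinates are illustrative assumptions, not the parameters of the C++Builder implementation described above.

```python
import math

def hex_stimulus_centers(cx, cy, spacing=40.0, rings=3):
    """Return pixel centres of a multifocal hexagonal stimulus array.

    A radius-3 axial hex grid yields 1 + 6 + 12 + 18 = 37 elements, matching
    the 37-element array in the abstract.  (cx, cy) is the movable array
    centre; spacing is an illustrative element pitch in pixels.
    """
    centers = []
    for q in range(-rings, rings + 1):
        for r in range(-rings, rings + 1):
            if abs(q + r) <= rings:  # keep only cells inside the hexagon
                x = cx + spacing * (math.sqrt(3) * q + math.sqrt(3) / 2 * r)
                y = cy + spacing * 1.5 * r
                centers.append((x, y))
    return centers

# Move the array centre up/down or back/forth by changing (cx, cy).
print(len(hex_stimulus_centers(400, 300)))   # -> 37
print(len(hex_stimulus_centers(400, 260)))   # same array, shifted upward
```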
1467 A Framework for Enhancing Mobile Development Software for Rangsit University, Thailand
Authors: Thossaporn Thossansin
Abstract:
This paper presents the development of a mobile application for students at the Faculty of Information Technology, Rangsit University (RSU), Thailand. RSU is upgrading its enrollment process by improving its information systems. Students can download the RSU APP easily in order to access essential RSU information. The reason for having a mobile application is to help students access the system regardless of time and place. The objectives of this paper are: 1. To develop an application on the iOS platform for students at the Faculty of Information Technology, Rangsit University, Thailand. 2. To obtain the students' perception of the new mobile app. The target group is students from the freshman year to the senior year of the Faculty of Information Technology, Rangsit University. The new mobile application, called RSU APP, was developed by the Department of Information Technology, Rangsit University. It contains useful features and various functionalities, particularly those that support students. The core contents of the app consist of RSU announcements, calendar, events, activities, and e-books. The mobile app is developed on the iOS platform. User satisfaction is analyzed from interview data from 81 interviewees as well as from a Google Forms questionnaire involving 122 respondents. The results show that users are satisfied with the application, giving it the highest overall satisfaction level at 4.67 (SD 0.52). The score for the question of whether users can learn and use the application quickly is high, at 4.82 (SD 0.71). On the other hand, the lowest satisfaction rating concerns the app's forms and lists, with a satisfaction level of 4.01 (SD 0.45).
Keywords: Mobile application, development of mobile application, framework of mobile development, software development for mobile devices.
1466 Diversity for Safety and Security of Autonomous Vehicles against Accidental and Deliberate Faults
Authors: Anil Ranjitbhai Patel, Clement John Shaji, Peter Liggesmeyer
Abstract:
Safety and security of Autonomous Vehicles (AVs) is a growing concern: first, due to the increased number of safety-critical functions taken over by automotive embedded systems; second, due to the increased exposure of the software-intensive systems to potential attackers; third, due to dynamic interaction in an uncertain and unknown environment at runtime, which results in changed functional and non-functional properties of the system. Frequently occurring environmental uncertainties, random component failures, and compromised security of the AVs might result in hazardous events, sometimes even in an accident, if left undetected. Beyond these technical issues, we argue that the safety and security of AVs against accidental and deliberate faults are poorly understood and rarely implemented. One possible way to overcome this is through the well-known diversity approach. As an effective approach to increase safety and security, diversity has been widely used in the aviation, railway, and aerospace industries. Thus, this paper proposes a fault-tolerance-by-diversity model that takes into consideration the mitigation of accidental and deliberate faults through the application of structural and variant redundancy. The model can be used to design AVs with various types of diversity in hardware- and software-based multi-version systems. The paper evaluates the presented approach by employing an example from adaptive cruise control, followed by a discussion of the case study with initial findings.
Keywords: Autonomous vehicles, diversity, fault-tolerance, adaptive cruise control, safety, security.
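As a toy illustration of how variant redundancy can mask a single accidental or deliberate fault, the sketch below votes over diverse redundant estimates; the tolerance, the readings, and the safe-state reaction are hypothetical, not the paper's model.

```python
def diverse_vote(readings, tolerance=0.5):
    """Majority-style voter over diverse redundant variants (e.g. distance
    estimates from camera, radar and lidar pipelines in an ACC function).

    Readings within `tolerance` of each other are treated as agreeing; the
    value backed by the largest agreeing group wins, so a single faulty or
    compromised variant is masked.  Thresholds are illustrative only.
    """
    best_value, best_support = None, 0
    for candidate in readings:
        support = sum(1 for r in readings if abs(r - candidate) <= tolerance)
        if support > best_support:
            best_value, best_support = candidate, support
    if best_support < (len(readings) // 2) + 1:
        raise RuntimeError("no majority - enter safe state")
    return best_value

# Hypothetical ACC distance estimates [m]; the 3rd variant is faulty/attacked.
print(diverse_vote([24.8, 25.1, 3.0]))   # -> 24.8 (fault masked)
```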
1465 A Software-Supported Methodology for Designing General-Purpose Interconnection Networks for Reconfigurable Architectures
Authors: Kostas Siozios, Dimitrios Soudris, Antonios Thanailakis
Abstract:
Modern applications realized on FPGAs exhibit high connectivity demands. Throughout this paper we study the routing constraints of Virtex devices and we propose a systematic methodology for designing a novel general-purpose interconnection network targeting reconfigurable architectures. This network consists of multiple segment wires and SB patterns, appropriately selected and assigned across the device. The goal of our proposed methodology is to maximize the hardware utilization of fabricated routing resources. The derived interconnection scheme is integrated on a Virtex-style FPGA. This device is characterized both by its high performance and by its low energy requirements. Due to this, the design criterion that guides our architecture selections was the minimal Energy×Delay Product (EDP). The methodology is fully supported by three new software tools, which belong to the MEANDER Design Framework. Using a typical set of MCNC benchmarks, an extensive comparison study in terms of several critical parameters proves the effectiveness of the derived interconnection network. More specifically, we achieve an average Energy×Delay Product reduction of 63%, a performance increase of 26%, a reduction in leakage power of 21%, and a reduction in total energy consumption of 11%, at the expense of a 20% increase in channel width.
Keywords: Design Methodology, FPGA, Interconnection, Low-Energy, High-Performance, CAD tool.
1464 A Software Framework for Predicting Oil-Palm Yield from Climate Data
Authors: Mohd. Noor Md. Sap, A. Majid Awan
Abstract:
Intelligent systems based on machine learning techniques, such as classification and clustering, are gaining widespread popularity in real-world applications. This paper presents work on developing a software system for predicting crop yield, for example oil-palm yield, from climate and plantation data. At the core of our system is a method for unsupervised partitioning of data for finding spatio-temporal patterns in climate data using kernel methods, which offer strength in dealing with complex data. This work draws inspiration from the notion that a non-linear transformation of the data into some high-dimensional feature space increases the possibility of linear separability of the patterns in the transformed space and therefore simplifies the exploration of the associated structure in the data. Kernel methods implicitly perform a non-linear mapping of the input data into a high-dimensional feature space by replacing the inner products with an appropriate positive definite function. In this paper we present a robust weighted kernel k-means algorithm incorporating spatial constraints for clustering the data. The proposed algorithm can effectively handle noise, outliers and auto-correlation in spatial data, supporting effective and efficient data analysis by exploring patterns and structures in the data, and thus can be used for predicting oil-palm yield by analyzing the various factors affecting the yield.
Keywords: Pattern analysis, clustering, kernel methods, spatial data, crop yield.
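A compact sketch of weighted kernel k-means is shown below; the spatial-constraint penalty of the proposed algorithm is omitted for brevity, and the RBF kernel, weights, and synthetic climate-like data are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    """Gaussian (RBF) kernel matrix for the rows of X."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def weighted_kernel_kmeans(K, weights, k, n_iter=20, seed=0):
    """Weighted kernel k-means: assign each point to the cluster whose
    weighted centroid is closest in the implicit feature space.
    K: precomputed kernel matrix; weights: per-point weights."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(0, k, n)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            if not mask.any():
                continue  # empty cluster stays at +inf
            w = weights[mask]
            sw = w.sum()
            # ||phi(x_i) - m_c||^2 up to the constant K_ii term (same for all c)
            dist[:, c] = (-2.0 * (K[:, mask] * w).sum(1) / sw
                          + (w[:, None] * w[None, :] * K[np.ix_(mask, mask)]).sum() / sw**2)
        new_labels = dist.argmin(1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

# Hypothetical grid of climate observations with two spatial regimes.
X = np.vstack([np.random.default_rng(1).normal(0, 1, (30, 3)),
               np.random.default_rng(2).normal(4, 1, (30, 3))])
labels = weighted_kernel_kmeans(rbf_kernel(X), np.ones(60), k=2)
print(np.bincount(labels))
```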
1463 Development of a Software about Calculating the Production Parameters in Knitted Garment Plants
Authors: Ender Bulgun, Arzu Vuruskan
Abstract:
Apparel product development is an important stage in the life cycle of a product. Shortening this stage helps to reduce the costs of a garment. The aim of this study is to examine the production parameters in knitwear apparel companies by defining the unit costs, and to develop software that calculates the unit costs of garments and makes cost estimates. In this study, with the help of a questionnaire, different companies' systems of unit cost estimating and cost calculating were analyzed. Within the scope of the questionnaire, the importance of the cost estimating process for apparel companies and the expectations from a new cost estimating program were investigated. According to the results of the questionnaire, the majority of the companies that participated use manual cost calculating methods or simple Microsoft Excel spreadsheets to make cost estimates. Furthermore, it was discovered that many companies have difficulties in archiving cost data for future use; as a solution to that problem, prior to making a cost estimate, the sub-units of garment costs, namely the fabric, accessory and labor costs, should be analyzed and added to the database of the programme beforehand. Another specification of the cost estimating unit prepared in this study is that the programme was designed to consist of two main units, one of which handles the product specification and the other the cost calculation. The programme is prepared as a web-based application so that the supplier, the manufacturer and the customer can communicate through the same platform.
Keywords: Apparel, cost estimating, design archive.
1462 FEM and Experimental Modal Analysis of Computer Mount
Authors: Vishwajit M. Ghatge, David Looper
Abstract:
Over the last few decades, oilfield service rolling equipment has significantly increased in weight, primarily because of emissions regulations, which require larger/heavier engines, larger cooling systems, and, in some cases, emissions after-treatment systems. Larger engines cause more vibration and shock loads, leading to failure of electronics and control systems. If the vibrating frequency of the engine matches the system frequency, high resonance is observed in structural parts and mounts. One such existing automated control equipment system, comprising wire rope mounts used for mounting computers, was designed approximately 12 years ago. This includes the use of an industrial-grade computer to control the system operation. The original computer had a smaller, lighter enclosure. After a few years, a newer computer version was introduced, which was 10 lbm heavier. Some failures of internal computer parts have been documented for cases in which the old mounts were used. Because of the added weight, there is a possibility of the two brackets impacting each other under off-road conditions, which causes a high shock input to the computer parts. This added failure mode requires validating the existing mount design to suit the new heavy-weight computer. This paper discusses the modal finite element method (FEM) analysis and experimental modal analysis conducted to study the effects of vibration on the wire rope mounts and the computer. The existing mount was modeled in ANSYS software, and the resultant mode shapes and frequencies were obtained. The experimental modal analysis was conducted, and actual frequency responses were observed and recorded. Results clearly revealed that at the resonance frequency the brackets were colliding and potentially causing damage to computer parts. To solve this issue, spring mounts of different stiffness were modeled in ANSYS software, and the resonant frequency was determined. Increasing the stiffness of the system increased the resonant frequency, moving it away from the frequency window at which the engine showed heavy vibrations or resonance. After multiple iterations in ANSYS software, the stiffness of the spring mount was finalized, which was again experimentally validated.
Keywords: Experimental Modal Analysis, FEM Modal Analysis, Frequency, Modal Analysis, Resonance, Vibration.
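The stiffness argument in this abstract can be illustrated with the single-degree-of-freedom relation f_n = (1/2π)√(k/m); the mass, stiffness values, and engine excitation window below are hypothetical, not values from the ANSYS study.

```python
import math

def natural_frequency_hz(stiffness_n_per_m, mass_kg):
    """Undamped single-DOF natural frequency f_n = (1/(2*pi)) * sqrt(k/m)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

computer_mass = 14.0        # kg, hypothetical mounted mass
engine_band = (5.0, 15.0)   # Hz, hypothetical engine excitation window

for k in (40e3, 120e3, 400e3):   # candidate spring-mount stiffnesses [N/m]
    fn = natural_frequency_hz(k, computer_mass)
    inside = engine_band[0] <= fn <= engine_band[1]
    print(f"k = {k:8.0f} N/m -> f_n = {fn:5.1f} Hz"
          + ("  (resonance risk)" if inside else ""))
```

Raising k eventually pushes f_n above the assumed excitation window, which is the direction of the design change described in the abstract.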
1461 Application of a Systemic Soft Domain-Driven Design Framework
Authors: Mohammed Salahat, Steve Wade, Izhar Ul-Haq
Abstract:
This paper proposes a “soft systems” approach to domain-driven design of computer-based information systems. We propose a systemic framework combining techniques from Soft Systems Methodology (SSM), the Unified Modelling Language (UML), and an implementation pattern known as “Naked Objects”. We have used this framework in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within the proposed framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to generate a ubiquitous language (soft language) which can be used as the basis for developing an object-oriented domain model. The domain model is further developed using techniques based on the UML and is implemented in software following the “Naked Objects” implementation pattern. We argue that there are advantages in combining and using techniques from different methodologies in this way. The proposed systemic framework is overviewed and justified as a multimethodology using Mingers' multimethodology ideas. This multimethodology approach is being evaluated through a series of action research projects based on real-world case studies. A peer-tutoring case study is presented here as a sample of the framework evaluation process.
Keywords: SSM, UML, Domain-Driven Design, Soft Domain-Driven Design, Naked Objects, Soft Language.
1460 Sensitivity of the SHARC Model to Variations of Manning Coefficient and Effect of “n” on the Sediment Materials Entry into the Eastern Water Intake - A Case in the Dez Diversion Weir in Iran
Authors: M.R.Mansoujian, A.Rohani, N.Hedayat , M.Qamari, M. Osroosh
Abstract:
Permanent rivers are the main sources of renewable water supply for the croplands under irrigation and drainage schemes. They are also the major source of the sediment loads transported into the storage reservoirs of hydro-electric dams, diversion weirs and regulating dams. The sedimentation process results from soil erosion, which is related to poor watershed management and human intervention in the hydraulic regime of the rivers. These could change the hydraulic behavior and, as such, lead to riverbed and river bank scouring, the consequences of which are sediment load transport into the dams and therefore reduced flow discharge at the water intakes. The present paper investigates the sedimentation process by varying the Manning coefficient "n" using the SHARC software along a watercourse in the Dez River. Results indicated that the optimum "n" within that river range is 0.0315, at which the minimum quantity of sediment load is transported into the Eastern intake. Comparison of the model results with those obtained from the SSIIM software within the same river reach showed a very close proximity between them. This suggests the relative accuracy with which the model can simulate the hydraulic flow characteristics and therefore its suitability as a powerful analytical tool for project feasibility studies and project implementation.
Keywords: Sediment transport, Manning coefficient, Eastern Intake, SHARC, Dez River.
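To illustrate why the flow is sensitive to the Manning coefficient, here is a sketch of Manning's equation with the roughness n varied around the reported optimum of 0.0315; the reach geometry is a hypothetical placeholder, not SHARC input data.

```python
def manning_discharge(n, area_m2, hydraulic_radius_m, slope):
    """Manning's equation (SI units): Q = (1/n) * A * R^(2/3) * S^(1/2)."""
    return (1.0 / n) * area_m2 * hydraulic_radius_m ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical river-reach geometry; only the roughness n is varied.
area, radius, slope = 450.0, 3.2, 2.0e-4
for n in (0.025, 0.0315, 0.040):
    q = manning_discharge(n, area, radius, slope)
    print(f"n = {n:6.4f} -> Q = {q:7.1f} m^3/s")
```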
1459 Design of a 5-Joint Mechanical Arm with User-Friendly Control Program
Authors: Amon Tunwannarux, Supanunt Tunwannarux
Abstract:
This paper describes the design concepts and implementation of a 5-joint mechanical arm for a rescue robot named CEO Mission II. The multi-joint arm is a five-degree-of-freedom mechanical arm with a four-bar linkage, which can be stretched to 125 cm long. It is controlled by a teleoperator via a user-friendly control and monitoring GUI program. Using the inverse kinematics principle, we developed a method to control the servo angles of all arm joints to reach the desired tip position. By clicking the desired tip position or dragging the tip of the mechanical arm on the computer screen to the desired target point, the robot computes and moves its multi-joint arm to the pose as seen on the GUI screen. The angles of each joint are calculated and sent to all joint servos simultaneously in order to move the mechanical arm to the desired pose at once. The operator can also use a joystick to control the movement of this mechanical arm and the locomotion of the robot. Many sensors are installed at the tip of this mechanical arm for surveillance from a high vantage point and for obtaining the vital signs of victims more easily and quickly in urban search and rescue tasks. It works very effectively and is easy to control. This mechanical arm and its software were developed as part of the CEO Mission II rescue robot that won the First Runner-Up award and the Best Technique award at the Thailand Rescue Robot Championship 2006. It is a low-cost, simple, but functioning 5-joint mechanical arm which is built from scratch and controlled via wireless LAN 802.11b/g. This 5-joint mechanical arm hardware concept and its software can also serve as basic mechatronics for many real applications.
Keywords: Multi-joint, mechanical arm, inverse kinematics, rescue robot, GUI control program.
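A minimal closed-form inverse kinematics sketch for a simplified 2-link planar arm conveys the principle used for the 5-joint arm; the link lengths are assumptions chosen only to match the quoted 125 cm reach, not the arm's actual geometry.

```python
import math

def two_link_ik(x, y, l1, l2, elbow_up=True):
    """Closed-form inverse kinematics of a 2-link planar arm.

    Returns shoulder/elbow angles (rad) that place the tip at (x, y);
    raises ValueError if the target is out of reach.
    """
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow) * (1 if elbow_up else -1)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

# Hypothetical link lengths summing to the 125 cm reach quoted in the abstract.
theta1, theta2 = two_link_ik(80.0, 40.0, l1=65.0, l2=60.0)
print(math.degrees(theta1), math.degrees(theta2))
```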
1458 The Effect of Tool Path Strategy on Surface and Dimension in High Speed Milling
Authors: A. Razavykia, A. Esmaeilzadeh, S. Iranmanesh
Abstract:
Many orthopedic implants, such as proximal humerus components, require low surface roughness and almost immediate/short lead time surgery. Thus, a rapid response from the manufacturer is very crucial. The tool path strategy of the milling process has a direct influence on the surface roughness and lead time of a medical implant. High-speed milling, as a promising process, improves the machined surface quality, but conventional or super-abrasive grinding is still required, which imposes drawbacks such as additional cost and time. Currently, many CAD/CAM software packages offer different tool path strategies for milling free-form surfaces. Nevertheless, users must identify how to choose the strategies according to the cutting tool geometry, the geometric complexity, and their effects on the machined surface. This study investigates the effect of different tool path strategies for milling a proximal humerus head during the finishing operation on stainless steel 316L. Experiments have been performed using a MAHO MH700 S vertical milling machine and four machining strategies, namely spiral outward, spiral inward, radial, and zig-zag. In all cases, the obtained surfaces were analyzed in terms of roughness and dimensional accuracy and compared with those obtained by simulation. The findings provide evidence that surface roughness, dimensional accuracy, and machining time are affected by the considered tool path strategy.
Keywords: CAD/CAM software, milling, orthopedic implants, tool path strategy.
1457 Analysis of a Coupled Hydro-Sedimentological Numerical Model for the Tombolo of GIENS
Authors: Yves Lacroix, Van Van Than, Didier Leandri, Pierre Liardet
Abstract:
The western Tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We are trying to use computer simulation in order to propose solutions to stop this erosion. Our aim was first to determine the main factors behind this erosion and to successfully apply a coupled hydro-sedimentological numerical model based on observations and measurements that have been performed on the site for decades. We have gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments concerning the site. These have been divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference in order to numerically compare the present situation to what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging different sources into a coherent database, identifying the main erosion factors, and calibrating the coupled software model against the selected reference period. Our results provide a calibration of the numerical model with good fitting coefficients. They also show that the winter south-western storm events, combined with low-pressure weather conditions, constitute a major factor of erosion, mainly due to wave impact in the northern part of the Almanarre beach. The combined impact of current and wind is shown to be negligible.
Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model.
1456 Orbit Determination Modeling with Graphical Demonstration
Authors: Assem M. F. Sallam, Ah. El-S. Makled
Abstract:
This paper presents the implementation, verification, and graphical demonstration of a software application that can be used swiftly across different preliminary orbit determination methods. A passive orbit determination method is used in this study to determine the location of a satellite or a flying body. It is named passive orbit determination because it depends on observation without the use of any aids (radio or laser) installed on the satellite. In order to understand how these methods work and how accurate their output is when compared with available verification data, the built models help in identifying the different inputs used with each method. Output from the different orbit determination methods (Gibbs, Lambert, and Gauss) is compared and verified against the data obtained from the Satellite Tool Kit (STK) application. A modified model including all of the orbit determination methods with the same input is introduced to investigate the different models' outputs (orbital parameters) for the same input (azimuth, elevation, and time). The simulation software is implemented using MATLAB. A Graphical User Interface (GUI) application named OrDet is produced using the GUI of MATLAB. It includes all the available inputs and outputs the current Classical Orbital Elements (COE) of the satellite under observation. The produced COE are then propagated for a complete revolution and plotted in a 3-D view. The modified model, which uses an adapter to allow the same input parameters, passes these parameters to the preliminary orbit determination methods under study. Results from all orbit determination methods yield exactly the same COE output, which shows the equality of concept in determining the satellite's location, albeit with different numerical methods.
Keywords: Orbit determination, STK, MATLAB-GUI, satellite tracking.
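Of the three preliminary methods named above, Gibbs' method is the most compact to sketch; the position vectors below are illustrative values, not observations from the paper, and the result would still have to be converted to classical orbital elements.

```python
import numpy as np

MU_EARTH = 398600.4418  # km^3/s^2

def gibbs_velocity(r1, r2, r3, mu=MU_EARTH):
    """Gibbs' method: velocity at the middle observation from three coplanar
    position vectors (km), one of the preliminary methods named above."""
    r1, r2, r3 = (np.asarray(v, dtype=float) for v in (r1, r2, r3))
    m1, m2, m3 = np.linalg.norm(r1), np.linalg.norm(r2), np.linalg.norm(r3)
    c12, c23, c31 = np.cross(r1, r2), np.cross(r2, r3), np.cross(r3, r1)
    n = m1 * c23 + m2 * c31 + m3 * c12
    d = c12 + c23 + c31
    s = r1 * (m2 - m3) + r2 * (m3 - m1) + r3 * (m1 - m2)
    return (np.sqrt(mu / (np.linalg.norm(n) * np.linalg.norm(d)))
            * (np.cross(d, r2) / m2 + s))

# Illustrative position vectors (km) of one object at three observation times.
r1 = [-294.32, 4265.1, 5986.7]
r2 = [-1365.5, 3637.6, 6346.8]
r3 = [-2940.3, 2473.7, 6555.8]
v2 = gibbs_velocity(r1, r2, r3)
print(v2)   # km/s at the second observation; feeds the COE computation
```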
1455 Automated User Story Driven Approach for Web-Based Functional Testing
Authors: Mahawish Masud, Muhammad Iqbal, M. U. Khan, Farooque Azam
Abstract:
Manual writing of test cases from functional requirements is a time-consuming task. Such test cases are not only difficult to write but are also challenging to maintain. Test cases can be drawn from the functional requirements that are expressed in natural language. However, manual test case generation is inefficient and subject to errors. In this paper, we present a systematic procedure that can automatically derive test cases from user stories. The user stories are specified in a restricted natural language using a well-defined template. We also present a detailed methodology for writing our test-ready user stories. Our tool “Test-o-Matic” automatically generates the test cases by processing the restricted user stories. The generated test cases are executed using the open source Selenium IDE. We evaluate our approach on a case study, which is an open source web-based application. The effectiveness of our approach is evaluated by seeding faults in the open source case study using known mutation operators. Results show that test case generation from restricted user stories is a viable approach for automated testing of web applications.
Keywords: Automated testing, natural language, user story modeling, software engineering, software testing, test case specification, transformation and automation, user story, web application testing.
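A rough sketch of the idea of deriving a test case from a restricted user story template; the template, field names, and generated fields are hypothetical stand-ins, not the actual Test-o-Matic template or output.

```python
import re

# Hypothetical restricted template: "As a <role>, I want to <action> the
# <target> so that <outcome>." -- a simplified stand-in for a well-defined
# user-story template of the kind described in the abstract.
STORY_PATTERN = re.compile(
    r"As an? (?P<role>.+?), I want to (?P<action>.+?) the (?P<target>.+?) "
    r"so that (?P<outcome>.+)\.", re.IGNORECASE)

def story_to_test_case(story: str) -> dict:
    """Derive a skeletal test-case specification from a restricted user story."""
    match = STORY_PATTERN.match(story.strip())
    if not match:
        raise ValueError("story does not follow the restricted template")
    parts = match.groupdict()
    return {
        "title": f"Verify {parts['role']} can {parts['action']} the {parts['target']}",
        "precondition": f"{parts['role']} is logged in",
        "steps": [f"{parts['action']} the {parts['target']}"],
        "expected": parts["outcome"],
    }

story = ("As a registered user, I want to search the product catalogue "
         "so that matching items are listed.")
print(story_to_test_case(story))
```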
1454 Analysis on Modeling and Simulink of DC Motor and its Driving System Used for Wheeled Mobile Robot
Authors: Wai Phyo Aung
Abstract:
Wheeled Mobile Robots (WMRs) are built around their wheel-drive machines, the motors. Depending on the desired design of the WMR, technicians make use of DC motors for motion control. In this paper, the author analyzes how to choose a DC motor that is matched to its application, especially for a WMR. The specification of a DC motor that can be used with the desired WMR is determined by using a MATLAB Simulink model. Therefore, this paper mainly focuses on the software application of MATLAB and control technology. As the driving system of the DC motor, a Peripheral Interface Controller (PIC) based control system is designed, including the assembly software and the H-bridge control circuit. This driving system is used to drive two DC gear motors, which control the motion of the WMR. In this analysis, the author mainly focuses on a drive system that drives two DC gear motors controlling the wheeled mobile robot with the differential drive technique. For the design analysis of the motor driving system, a PIC16F84A is used, and five sensor data inputs are tested with five ON/OFF switches. The outputs of the PIC are the commands that drive the two DC gear motors, i.e. the inputs of the H-bridge circuit. In this paper, the control techniques of the PIC microcontroller and the H-bridge circuit and the mechanism assignments of the WMR are combined and analyzed, mainly focusing on the "Modeling and Simulink of DC Motor using MATLAB".
Keywords: Control system design, DC motors, differential drive, H-bridge control circuit, MATLAB Simulink model, Peripheral Interface Controller (PIC), wheeled mobile robots.
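A minimal sketch of the classical DC motor equations that a Simulink model of this kind typically implements; the electrical and mechanical parameters are generic illustration values, not those of the gear motors used with the PIC16F84A driver.

```python
import numpy as np

def simulate_dc_motor(v_in, t_end=3.0, dt=1e-4,
                      R=1.0, L=0.5, Kt=0.01, Ke=0.01, J=0.01, b=0.1):
    """Forward-Euler simulation of the classical DC motor equations
        L di/dt = V - R*i - Ke*w,   J dw/dt = Kt*i - b*w
    Parameter values are generic illustration values only."""
    steps = int(t_end / dt)
    i = w = 0.0
    speeds = np.empty(steps)
    for k in range(steps):
        di = (v_in - R * i - Ke * w) / L
        dw = (Kt * i - b * w) / J
        i += di * dt
        w += dw * dt
        speeds[k] = w
    return speeds

speed = simulate_dc_motor(12.0)   # response to a 12 V step input
print(f"speed approaches ~{speed[-1]:.3f} rad/s")
```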
1453 A Qualitative Study of Health-Related Beliefs and Practices among Vegetarians
Authors: Lorena Antonovici, Maria Nicoleta Turliuc
Abstract:
The process of becoming a vegetarian involves changes in several life aspects, including health. Despite its relevance, however, little research has been carried out to analyze vegetarians' self-perceived health, and even less empirical attention has been given to the Romanian population. This study aimed to assess health-related beliefs and practices among vegetarian adults in a Romanian sample. We undertook 20 semi-structured interviews (10 males, 10 females) based on a snowball sample with a mean age of 31 years. The interview guide was divided into three sections: causes of adopting the diet, general aspects (beliefs, practices, tensions, and conflicts) and consequences of adopting the diet (significant changes, positive aspects and difficulties, physical and mental health). Additional anamnestic data were collected by means of a questionnaire. Data analyses were performed using the Tropes text analysis software (v. 8.2) and SPSS software (v. 24.0). Findings showed that most of the participants considered a vegetarian diet a natural and healthy choice, as opposed to meat-eating, which they saw as unhealthy and whose consumption should be moderated among omnivores. A higher proportion of participants (65%) had an average body mass index (BMI), and several women even reported ailments that no longer occurred after following a vegetarian diet. Moreover, participants reported better moods and mental health status, given their self-contentment with the dietary choice. Relatives were perceived as more skeptical about their practices than others, and women especially held this view. This study provides valuable insight into health-related beliefs and practices and how a vegetarian diet might interact with them.
Keywords: Health-related beliefs, health, practices, vegetarians.
1452 The Design Optimization for Sound Absorption Material of Multi-Layer Structure
Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Kyu Park
Abstract:
Sound-absorbing material is used as an automotive interior material, and its sound absorption coefficient should be predicted in order to design it. However, the sound absorption coefficient is difficult to predict because the material comprises several layers, so its targets are usually achieved through many experimental tunings, which cost considerable time and money. In this paper, we propose a process to estimate the sound absorption coefficient of a multi-layer structure. In order to estimate the coefficient, the physical properties of each material are used. These properties are values predicted by the Foam-X software from the sound absorption coefficient data measured with an impedance tube. Since there are many physical properties and the measurement equipment is expensive, the values predicted by software are used. Through the measurement of the sound absorption coefficient of each material, its physical properties are calculated inversely. The properties of each material are then used to calculate the sound absorption coefficient of the multi-layer material. Since the absorption coefficient of the multi-layer structure can be calculated, optimization design is possible through simulation. We then compare and analyze the calculated sound absorption coefficient with the data measured by a scaled reverberation chamber and impedance tubes for a prototype. If this method is used when developing automotive interior materials with a multi-layer structure, the development effort can be reduced because the material can be optimized by simulation, saving cost and time.
Keywords: Optimization design, multi-layer nonwoven, sound absorption coefficient, scaled reverberation chamber, impedance tubes.
1451 Obfuscation Studio Executive
Authors: Siarhei Petryk, Vyacheslav Yarmolik
Abstract:
A new software protection product called “Obfuscation Studio” is presented in the paper. Several obfuscating modules that are already implemented are described. Some theoretical data are presented that show the potency and effectiveness of the described obfuscation methods. “Obfuscation Studio” is being implemented for protecting programs written for the .NET platform, but the described methods can also be of interest for other applications.
Keywords: Coupling, obfuscation, predicates, renaming.
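As a toy illustration of one module type listed in the keywords (renaming), the sketch below replaces meaningful identifiers with opaque names in plain source text; a real .NET obfuscator such as the one described would rewrite assembly metadata instead.

```python
import itertools
import re

def rename_identifiers(source: str, identifiers: list) -> str:
    """Toy renaming obfuscation: replace meaningful identifiers with opaque
    names.  Works on plain source text only, as a conceptual illustration."""
    opaque = (f"_{i:04x}" for i in itertools.count(0xA0))
    mapping = {name: next(opaque) for name in identifiers}
    for original, obfuscated in mapping.items():
        source = re.sub(rf"\b{re.escape(original)}\b", obfuscated, source)
    return source

code = """
def compute_discount(customer_total, is_member):
    discount_rate = 0.1 if is_member else 0.0
    return customer_total * (1 - discount_rate)
"""
print(rename_identifiers(
    code, ["compute_discount", "customer_total", "is_member", "discount_rate"]))
```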
1450 Study Forecast Indoor Acoustics. A Case Study: the Auditorium Theatre-Hotel “Casa Tra Noi”
Authors: D. Germanò, D. Plutino, G. Cannistraro
Abstract:
The theatre-auditorium under investigation appears to be acoustically inadequate, owing to the highly reflective characteristics of the materials used in it (marble, painted wood, smooth plaster, etc.), its architectural and structural features, and its highly multifunctional intended use (auditorium, theatre, cinema, musicals, conference room). This emerges from the analysis of the existing state carried out with the acoustic simulation software Ramsete and is supported by data obtained through an on-site campaign of acoustic measurements of the existing state performed with a Svantek SVAN 957 sound level meter. After completing the 3D model according to the specifications required by the prediction software, three simulations were carried out: an acoustic simulation of the existing state and acoustic simulations of two design solutions. The acoustic improvement found in the first design solution, compared with the existing state, consists of lowering the reverberation time towards its most desirable value, while the clarity indicators, the barycentric (centre) time, the lateral efficiency, the bass ratio (BR) and the speech intelligibility improve significantly. The improvement found in the second design solution, compared with the first design solution, consists mostly of a more uniform distribution of Leq and of lowering the reverberation time towards its optimum values. The clarity indicators and the lateral efficiency improve further, but at the expense of a slightly worse BR value; the remaining indices vary only slightly.
Keywords: Indoor, acoustics, acoustic simulation.
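The reverberation-time comparison between the existing state and a treated design solution can be illustrated with Sabine's formula; the hall volume, surface areas, and absorption coefficients below are hypothetical, not the Ramsete model data.

```python
def sabine_rt60(volume_m3, surfaces):
    """Sabine reverberation time T60 = 0.161 * V / A, where A is the total
    equivalent absorption area (sum of surface area x absorption coefficient)."""
    absorption_area = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption_area

# Hypothetical hall: existing reflective finishes vs. a treated design solution.
volume = 2500.0                                     # m^3, assumed
existing = [(600, 0.02), (400, 0.05), (350, 0.10)]  # marble, plaster, wood
treated  = [(600, 0.02), (400, 0.45), (350, 0.30)]  # absorptive panels added
print(f"T60 existing: {sabine_rt60(volume, existing):.2f} s")
print(f"T60 treated : {sabine_rt60(volume, treated):.2f} s")
```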
1449 Wireless Sensor Networks for Swiftlet Farms Monitoring
Authors: Al-Khalid Othman, Wan A. Wan Zainal Abidin, Kee M. Lee, Hushairi Zen, Tengku. M. A. Zulcaffle, Kuryati Kipli
Abstract:
This paper provides an in-depth study of a Wireless Sensor Network (WSN) application to monitor and control the swiftlet habitat. A system design is developed that includes the hardware design of the nodes, the Graphical User Interface (GUI) software, the sensor network, and the interconnectivity for remote data access and management. A system architecture is proposed to address the requirements for habitat monitoring. Such an application-driven design identifies important areas of further work in data sampling, communications and networking. For this monitoring system, a sensor node (MTS400), IRIS and MICAz radio transceivers, and a USB-interfaced gateway base station from Crossbow (Xbow) Technology WSN are employed. The GUI of this monitoring system is written using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) along with the Xbow Technology drivers provided by National Instruments. As a result, this monitoring system is capable of collecting data and presenting it in both tables and waveform charts for further analysis. The system is also able to send notification messages by email, provided Internet connectivity is available, whenever changes in the habitat occur at remote sites (swiftlet farms). Other functions that have been implemented in this system are a database for record and management purposes and remote access through the Internet using the LogMeIn software. Finally, this research concludes that a WSN for monitoring swiftlet habitat can be effectively used to monitor and manage the swiftlet farming industry in Sarawak.
Keywords: Swiftlet, WSN, habitat monitoring, networking.
1448 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases the number of crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. It is within the research scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc., and to report whether it contains any crime-related elements. There are many software and hardware tools developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data present in the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the process depends on the examiner's experience, the overall result may vary between cases and relevant elements may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed, which aims to automatically classify the evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
Keywords: Block matching, digital evidence, hash list.
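A minimal sketch of hash-based block matching against a list of known hashes, in the spirit of the proposed method but not its actual implementation; the file name, block size, and hash-list loader are hypothetical placeholders.

```python
import hashlib

def block_hashes(path, block_size=4096):
    """Hash every fixed-size block of a digital-evidence image file."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def match_against_hash_list(image_path, known_hash_list, block_size=4096):
    """Report which blocks of the evidence image match a pre-built hash list
    of known crime-related content (the hash list itself is assumed given)."""
    known = set(known_hash_list)
    return [idx for idx, digest in enumerate(block_hashes(image_path, block_size))
            if digest in known]

# Hypothetical usage: "evidence.dd" and load_hash_list() are placeholders.
# hits = match_against_hash_list("evidence.dd", load_hash_list("known.txt"))
# print(f"{len(hits)} suspicious blocks at byte offsets:", [i * 4096 for i in hits])
```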
1447 Estimation of Crustal Thickness within the Sokoto Basin North-Western Nigeria Using Bouguer Gravity Anomaly Data
Authors: T. T. Olugbenga, A. I. Augie
Abstract:
This research proposes an interpretation of the Bouguer gravity anomaly data of some parts of the Sokoto basin for the estimation of crustal thickness. The study area is bounded between latitudes 11°00′0″N and 13°00′0″N and longitudes 4°00′0″E and 6°00′0″E, covering Koko, Jega, B/Kebbi, Argungu, Lema, Bodinga, Tamgaza, Gunmi, Daki Takwas, Dange, Sokoto, Ilella, T/Mafara, Anka, Maru, Gusau, K/Namoda, and Sabon Birni within Sokoto, Kebbi and Zamfara states, respectively. The established map of the study area was digitized in X, Y and Z format using the Excel software package, and the digitized data were processed using Surfer version 13 software. The Moho and Conrad depths, based on a relationship between the Bouguer gravity anomaly and crustal thickness, were estimated as 35 to 37 km and 19 to 21 km, respectively. The crustal region has been categorized into a crustal thinning zone, that is, the region with high gravity anomaly values due to its greater geothermal energy, and a crustal thickening zone, the region with low anomaly values due to its lower geothermal energy. Birnin Kebbi, Jega, and Sokoto were identified as the region of hydrocarbon potential, with an estimated thickness of 35 km within the crustal region; this is referred to as crustal thickening, the result of geothermal energy that is low but sufficient to decompose organic matter within the region to form hydrocarbons.
Keywords: Bouguer gravity anomaly, crustal thickness, geothermal energy, hydrocarbons, Moho and Conrad Depths.
1446 Evaluation of Seismic Behavior of Steel Shear Wall with Opening with Hardener and Beam with Reduced Cross Section under Cycle Loading with Finite Element Analysis Method
Authors: Masoud Mahdavi
Abstract:
During an earthquake, the structure is subjected to seismic loads that cause tension in the members of the building. The use of energy dissipation elements in the structure reduces the percentage of seismic forces on the main members of the building (especially the columns). The steel plate shear wall, as one of the most widely used types of energy dissipation element, has evolved considerably, and regular drilling of its inner plate is one of the common cases. In the present study, using the finite element method, the steel plate shear wall is designed as a single story (with dimensions of 447 × 246.6 cm) in Abaqus software and in three different configurations to which a cyclic load has been applied. The steel shear wall has a horizontal element (beam) with a reduced beam section (RBS). The opening in the interior plate of the models is created in such a way that its area increases progressively, which makes the effect of increasing the opening area on the seismic performance of the steel shear wall completely clear. In the end, it was found that with an increasing opening level in the steel shear wall (with a reduced cross-section beam), the total displacement and plastic strain indicators increased, the structural capacity and total energy indicators decreased, and the von Mises stress index did not change much.
Keywords: Steel plate shear wall with opening, cyclic loading, reduced cross-section beam, finite element method, Abaqus Software.
1445 Evaluating Emission Reduction Due to a Proposed Light Rail Service: A Micro-Level Analysis
Authors: Saeid Eshghi, Neeraj Saxena, Abdulmajeed Alsultan
Abstract:
Carbon dioxide (CO2), alongside other gas emissions in the atmosphere, causes a greenhouse effect, resulting in an increase in the average temperature of the planet. Transportation vehicles are among the main contributors to CO2 emissions, and stationary vehicles with idling engines produce more emissions than moving ones. Intersections with traffic lights that force vehicles to remain stationary for a period of time therefore produce more CO2 pollution than other parts of the road. This paper focuses on analyzing the CO2 produced by the traffic flow at the Anzac Parade Road - Barker Street intersection in Sydney, Australia, before and after the implementation of light rail transport (LRT). The data were gathered during the construction phase of the LRT by counting the number of vehicles on each path of the intersection for 15 minutes during the evening rush hour over 1 week (6-7 pm, July 04-31, 2018) and then multiplying by 4 to calculate the hourly flow of vehicles. For analyzing the data, the microscopic simulation software VISSIM was used. In the analysis, the traffic flow was processed in three stages: before implementation of the light rail, during the construction phase, and after implementation. Finally, the traffic results were input into another software package, EnViVer, to calculate the amount of CO2 produced during 1 h. The results showed that, after the implementation of the light rail, CO2 will drop by a minimum of 13%. This finding provides evidence that light rail is a sustainable mode of transport.
Keywords: Carbon dioxide, emission modeling, light rail, microscopic model, traffic flow.
1444 Identification of Outliers in Flood Frequency Analysis: Comparison of Original and Multiple Grubbs-Beck Test
Authors: Ayesha S. Rahman, Khaled Haddad, Ataur Rahman
Abstract:
At-site flood frequency analysis is used to estimate flood quantiles when the at-site record length is reasonably long. In Australia, the FLIKE software has been introduced for at-site flood frequency analysis. The advantage of FLIKE is that, for a given application, the user can compare a number of the most commonly adopted probability distributions and parameter estimation methods relatively quickly using a Windows interface. The new version of FLIKE incorporates the multiple Grubbs and Beck test, which can identify multiple potentially influential low flows. This paper presents a case study considering six catchments in eastern Australia which compares two outlier identification tests (the original Grubbs and Beck test and the multiple Grubbs and Beck test) and two commonly applied probability distributions (Generalized Extreme Value (GEV) and Log Pearson type 3 (LP3)) using the FLIKE software. It has been found that the multiple Grubbs and Beck test, when used with the LP3 distribution, provides more accurate flood quantile estimates than the LP3 distribution used with the original Grubbs and Beck test. Between these two methods, the differences in flood quantile estimates have been found to be up to 61% for the six study catchments. It has also been found that the GEV distribution (with L moments) and the LP3 distribution with the multiple Grubbs and Beck test provide quite similar results in most of the cases; however, a difference of up to 38% has been noted in the flood quantiles for an annual exceedance probability (AEP) of 1 in 100 for one catchment. This finding needs to be confirmed with a greater number of stations across other Australian states.
Keywords: Floods, FLIKE, probability distributions, flood frequency, outlier.
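For reference, a sketch of the original single low-outlier Grubbs test applied to log flows; the flood series, significance level, and critical-value formula shown are illustrative of the classic one-sided test, not of FLIKE's multiple Grubbs-Beck implementation.

```python
import numpy as np
from scipy import stats

def grubbs_low_outlier(flows, alpha=0.10):
    """One-sided Grubbs test for a single low outlier, applied (as in flood
    frequency practice) to log-transformed annual maximum flows.
    The multiple Grubbs-Beck test sweeps this idea over the k smallest values."""
    x = np.log10(np.asarray(flows, dtype=float))
    n = x.size
    g = (x.mean() - x.min()) / x.std(ddof=1)            # test statistic
    t = stats.t.ppf(1.0 - alpha / n, n - 2)             # one-sided critical t
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
    return g > g_crit, x.min()

# Hypothetical annual maximum flood series (m^3/s) with one suspiciously low year.
flows = [410, 385, 520, 298, 455, 372, 610, 12, 480, 399, 505, 441, 360, 588]
is_outlier, low_log = grubbs_low_outlier(flows)
print("low outlier detected:", is_outlier)
```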