Search results for: hardware in loop (HIL)
310 Optimization of the Mechanical Performance of Fused Filament Fabrication Parts
Authors: Iván Rivet, Narges Dialami, Miguel Cervera, Michele Chiumenti
Abstract:
Process parameters in Additive Manufacturing (AM) play a critical role in the mechanical performance of the final component. To find the input configuration that guarantees the optimal performance of the printed part, the process-performance relationship must be established. Fused Filament Fabrication (FFF) is the selected demonstrative AM technology due to its great popularity in industrial manufacturing. A material model that accounts for the different printing patterns present in an FFF part is used. A voxelized mesh is built from the manufacturing toolpaths described in the G-Code file. Adaptive Mesh Refinement (AMR) based on the octree strategy is used to reduce the complexity of the mesh while maintaining its accuracy. High-fidelity and cost-efficient Finite Element (FE) simulations are performed, and the influence of key process parameters on the mechanical performance of the component is analyzed. A robust optimization process based on appropriate failure criteria is developed to find the printing direction that leads to the optimal mechanical performance of the component. The Tsai-Wu failure criterion is implemented due to the orthotropic and heterogeneous constitutive nature of FFF components and because of the differences between the strengths in tension and compression. The optimization loop implements a modified version of an Anomaly Detection (AD) algorithm and uses the computed metrics to obtain the optimal printing direction. The developed methodology is verified with a case study on an industrial demonstrator.
Keywords: additive manufacturing, optimization, printing direction, mechanical performance, voxelization
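For reference, the plane-stress orthotropic form of the Tsai-Wu criterion in standard notation (not reproduced from the paper):

```latex
% Tsai-Wu failure criterion, plane-stress orthotropic form (standard notation)
F_1\sigma_1 + F_2\sigma_2 + F_{11}\sigma_1^2 + F_{22}\sigma_2^2
  + F_{66}\tau_{12}^2 + 2F_{12}\sigma_1\sigma_2 \;\ge\; 1 \quad\text{(failure)},
\qquad
F_1 = \tfrac{1}{X_t}-\tfrac{1}{X_c},\;
F_{11} = \tfrac{1}{X_t X_c},\;
F_2 = \tfrac{1}{Y_t}-\tfrac{1}{Y_c},\;
F_{22} = \tfrac{1}{Y_t Y_c},\;
F_{66} = \tfrac{1}{S^2}
```

Here X and Y are the strengths along and across the raster direction, subscripts t/c distinguish tension from compression (the asymmetry the abstract mentions), S is the in-plane shear strength, and the interaction term F₁₂ is often approximated as −½(F₁₁F₂₂)^{1/2}.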
Procedia PDF Downloads 64
309 Surface Roughness Prediction Using Numerical Scheme and Adaptive Control
Authors: Michael K.O. Ayomoh, Khaled A. Abou-El-Hossein, Sameh F.M. Ghobashy
Abstract:
This paper proposes a numerical modelling scheme for surface roughness prediction. The approach is premised on a 3D difference analysis method enhanced with a feedback control loop in which a set of adaptive weights is generated. The surface roughness values utilized in this paper were adapted from [1], whose experiments were carried out on S55C high carbon steel; a comparison was further carried out between the proposed technique and those utilized in [1]. The experimental design has three cutting parameters, namely depth of cut, feed rate and cutting speed, with a twenty-seven-point experimental sample space. The simulation trials conducted using Matlab software fall into two sub-classes: prediction of the surface roughness readings for the non-boundary cutting combinations (NBCC) with the aid of the known surface roughness readings of the boundary cutting combinations (BCC), and a subsequent simulation that used the predicted outputs from the NBCC to recover the surface roughness readings for the BCC. The simulation trial for the NBCC attained a state of total stability in the 7th iteration, i.e. a point where the actual and desired roughness readings are equal, such that the error is minimized to zero by a set of dynamic weights generated in every following simulation trial. A comparative study among the three methods showed that the proposed difference analysis technique with adaptive weights from feedback control produced a much more accurate output than the abductive and regression analysis techniques presented in [1].
Keywords: difference analysis, surface roughness, mesh analysis, feedback control, adaptive weight, boundary element
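The paper does not give the update rule, so the following is a hypothetical illustration of feedback-generated adaptive weights driving the prediction error at known points to zero; all names and the learning-rate form are ours:

```python
import numpy as np

def adaptive_prediction(boundary_values, initial_guess, lr=0.5, tol=1e-9, max_iter=100):
    """Iteratively adjust weights so predictions match known boundary readings.

    Hypothetical sketch: the real scheme applies a 3D difference analysis over
    the (depth of cut, feed rate, cutting speed) grid.
    """
    weights = np.ones_like(initial_guess)
    pred = initial_guess.copy()
    for it in range(max_iter):
        error = boundary_values - weights * pred
        if np.max(np.abs(error)) < tol:                  # "state of total stability"
            break
        weights += lr * error / np.maximum(pred, 1e-12)  # feedback correction
    return weights * pred, it + 1
```

With lr = 1 the correction converges in one step; smaller gains mimic a gradual convergence (the paper reports stability in the 7th iteration) at the cost of more passes.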
Procedia PDF Downloads 622
308 Determinants of the Users' Intention of Social-Local-Mobile Applications
Authors: Chia-Chen Chen, Mu-Yen Chen
Abstract:
In recent years, the vigorous growth of hardware and software technologies for smart mobile devices, coupled with the rapidly increasing influence of social networks, has made mobile commerce the mainstream commercial operation mode of the future. SoLoMo has become one of the most popular commercial models. Its name refers to three key service types: through smart mobile devices (Mobile) and omnipresent network services, users link to social web site platforms (Social) to exchange information, while position and situational awareness technologies deliver services suited to the user's location (Local). Through anytime, anywhere use of different personal mobile devices, this provides a seamlessly integrated service concept and opens up countless future opportunities. This study explores users' intention to use SoLoMo mobile applications, proposing a research model that integrates TAM, ISSM, IDT and network externality, collecting data with questionnaires, and analyzing the results to verify the hypotheses. Results show that perceived ease-of-use (PEOU), perceived usefulness (PU), and network externality have a significant impact on the intention to use SoLoMo mobile applications, while information quality, relative advantage and observability affect perceived usefulness and thereby further affect use intention.
Keywords: SoLoMo (social, local, and mobile), technology acceptance model, innovation diffusion theory, network externality
Procedia PDF Downloads 529
307 Effect of Reynolds Number and Concentration of Biopolymer (Gum Arabic) on Drag Reduction of Turbulent Flow in Circular Pipe
Authors: Kamaljit Singh Sokhal, Gangacharyulu Dasoraju, Vijaya Kumar Bulasara
Abstract:
Biopolymers are popular in many areas, such as petrochemicals, the food industry and agriculture, due to favorable properties like environmental friendliness, availability, and cost. In this study, the biopolymer gum Arabic was used to find its effect on the pressure drop at various concentrations (100 ppm – 300 ppm) and various Reynolds numbers (10000 – 45000). A rheological study was also done at the same concentrations to find the effect of the shear rate on the shear viscosity. Experiments were performed to find the effect of injecting gum Arabic directly near the boundary layer and to investigate its effect on the maximum possible drag reduction. Experiments were performed on a test section with an i.d. of 19.50 mm and a length of 3045 mm. The polymer solution was injected from the top of the test section by a peristaltic pump. The concentration of the polymer solution and the Reynolds number were used as parameters to obtain the maximum possible drag reduction. Water was circulated through a centrifugal pump with a maximum speed of 3000 rpm, and the flow rate was measured with a rotameter. Results were validated against Virk's maximum drag reduction asymptote. A maximum drag reduction of 62.15% was observed at the highest concentration of gum Arabic, 300 ppm. The solution was circulated in the closed loop to find the effect of polymer degradation over a number of cycles on the drag reduction percentage. It was observed that injecting the polymer solution into the boundary layer gave better results than premixed solutions.
Keywords: drag reduction, shear viscosity, gum arabic, injection point
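Two standard relations behind this kind of analysis, not the authors' code: the drag-reduction percentage from pressure drops at a fixed flow rate, and Virk's maximum drag reduction asymptote for the Fanning friction factor:

```python
import numpy as np
from scipy.optimize import brentq

def drag_reduction_pct(dp_solvent, dp_polymer):
    """%DR from pressure drops measured at the same flow rate."""
    return 100.0 * (dp_solvent - dp_polymer) / dp_solvent

def virk_friction_factor(reynolds):
    """Fanning friction factor on Virk's maximum drag reduction asymptote:
    1/sqrt(f) = 19.0 * log10(Re * sqrt(f)) - 32.4, solved numerically for f."""
    g = lambda f: 1.0 / np.sqrt(f) - (19.0 * np.log10(reynolds * np.sqrt(f)) - 32.4)
    return brentq(g, 1e-5, 0.1)

print(drag_reduction_pct(1000.0, 378.5))   # ~62.15 %, the reported maximum
print(virk_friction_factor(45000))         # asymptotic lower bound on f at Re = 45000
```

Measured friction factors falling on or below the Virk line would indicate an unphysical result, which is how the asymptote serves as a validation check.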
Procedia PDF Downloads 139
306 Approach for Updating a Digital Factory Model by Photogrammetry
Authors: R. Hellmuth, F. Wehner
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring is becoming more important for maintaining the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Short-term rescheduling can no longer be handled by on-site inspections and manual measurements; the tight time schedules require up-to-date planning models. Due to the high adaptation rate of factories described above, a methodology for rescheduling factories on the basis of a modern digital factory twin is conceived and designed for practical application in factory restructuring projects. The focus is on rebuild processes, with the aim of keeping the planning basis (the digital factory model) for conversions within a factory up to date. This requires a methodology that reduces the deficits of existing approaches. The paper shows how a digital factory model can be kept up to date during ongoing factory operation. A method based on photogrammetry is presented, focusing on a simple and cost-effective solution to track the many changes that occur in a factory building during operation. The method is preceded by a hardware and software comparison to identify the most economical and fastest variant.
Keywords: digital factory model, photogrammetry, factory planning, restructuring
Procedia PDF Downloads 117
305 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations
Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar
Abstract:
Fast simulations are critical in reducing time to market in CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumed by Networks-on-Chip, and researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we choose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce the cache misses, we apply software optimization techniques such as removal of unused functions, loop interchange, and replacing the post-increment operator with the pre-increment operator for non-primitive data types; these techniques reduced the cache misses by 18.52%, 5.34% and 3.91%, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
Keywords: cache behaviour, network-on-chip, performance profiling, vectorization
Procedia PDF Downloads 200
304 Neural Network Supervisory Proportional-Integral-Derivative Control of the Pressurized Water Reactor Core Power Load Following Operation
Authors: Derjew Ayele Ejigu, Houde Song, Xiaojing Liu
Abstract:
This work presents a particle swarm optimization trained neural network (PSO-NN) supervisory proportional-integral-derivative (PID) control method to monitor the pressurized water reactor (PWR) core power for safe operation. The proposed control approach is implemented on the transfer function of the PWR core, which is computed from the state-space model. The PWR core state-space model is derived from the neutronics, thermal-hydraulics, and reactivity models using perturbation around the equilibrium value. The proposed control approach computes the control rod speed to maneuver the core power to track the reference in a closed-loop scheme. The particle swarm optimization (PSO) algorithm is used to train the neural network (NN) and to tune the PID simultaneously. The controller performance is examined using the integral absolute error, integral time absolute error, integral square error, and integral time square error functions, and the stability of the system is analyzed using the Bode diagram. The simulation results indicate that the controller controls and tracks the load power effectively and smoothly compared to the PSO-PID control technique. This study will benefit the design of supervisory controllers for control applications in nuclear engineering research.
Keywords: machine learning, neural network, pressurized water reactor, supervisory controller
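A compact sketch of PSO tuning PID gains against an integral-square-error cost; the first-order plant, bounds and PSO constants below are illustrative stand-ins for the paper's PWR core transfer function:

```python
import numpy as np

def ise_cost(gains, tau=5.0, dt=0.01, t_end=20.0, setpoint=1.0):
    """Integral of squared error for a PID on a toy plant dy/dt = (-y + u)/tau."""
    kp, ki, kd = gains
    y = integ = prev_err = 0.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        err = setpoint - y
        integ += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integ + kd * deriv
        y += dt * (-y + u) / tau
        cost += err * err * dt
        prev_err = err
    return cost

def pso_tune(n_particles=20, iters=50, bounds=(0.0, 20.0), w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, 3))         # (kp, ki, kd) per particle
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([ise_cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([ise_cost(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, pbest_cost.min()

gains, best = pso_tune()
print("kp, ki, kd =", gains, "ISE =", best)
```

In the paper's scheme the same swarm simultaneously trains an NN supervisor, so the particle would carry NN weights alongside the three PID gains.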
Procedia PDF Downloads 157
303 Prediction of Pounding between Two SDOF Systems by Using Link Element Based On Mathematic Relations and Suggestion of New Equation for Impact Damping Ratio
Authors: Seyed M. Khatami, H. Naderpour, R. Vahdani, R. C. Barros
Abstract:
Many previous studies have been carried out to calculate the impact force and the dissipated energy between two neighboring buildings that collide with each other during seismic excitation. Numerical studies are an important part of impact analysis, and several researchers have tried to simulate the impact using different formulas. Estimation of the impact force and the dissipated energy depends significantly on several impact parameters: the mass of the bodies, the stiffness of the spring, the coefficient of restitution, the damping ratio of the dashpot and the impact velocity are the known and unknown parameters used to simulate the impact and measure the energy dissipated during collision. Collision is usually represented by a force-displacement hysteresis curve, whose enclosed area gives the energy dissipated during impact. In this paper, the effect of using different types of impact models on the calculated impact force is investigated. To increase the accuracy of the impact model and to optimize the simulation results, a new damping equation, called "n-m", is assumed and validated to obtain the best estimates of impact force and dissipated energy, demonstrating the accuracy of the suggested equation of motion in comparison with other formulas. Based on this mathematical relation, an initial value is selected for the mentioned coefficients and the kinetic energy loss is calculated. After each simulation, the kinetic energy loss and the energy dissipation are compared with each other; if they are equal, the selected parameters are correct, and if not, the parameter constants are modified and a new analysis is performed. Finally, two unknown parameters are suggested to estimate the impact force and calculate the dissipated energy.
Keywords: impact force, dissipated energy, kinetic energy loss, damping relation
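The consistency check described (dashpot dissipation versus kinetic-energy loss) can be illustrated with a generic Kelvin-Voigt-type impact element; the force law and all parameter values below are illustrative and are not the paper's "n-m" relation:

```python
def impact_energy(k=1e8, c=1e4, n=1.5, m1=1e5, m2=1e5, v_rel=1.0, dt=1e-6):
    """Integrate one contact event of a nonlinear spring-dashpot element
    F = k*d**n + c*v (generic form) and return the energy dissipated in the
    dashpot, i.e. the area enclosed by the force-displacement hysteresis loop.
    The small tensile force near separation is a known artifact of this model.
    """
    m_eff = m1 * m2 / (m1 + m2)     # effective mass of the colliding pair
    d, v, energy = 0.0, v_rel, 0.0
    while True:
        force = k * d**n + c * v
        energy += c * v * v * dt    # dashpot dissipation: integral of F_d * v dt
        v += -force / m_eff * dt
        d += v * dt
        if d <= 0.0:                # bodies separate
            break
    e = abs(v) / v_rel              # apparent coefficient of restitution
    ke_loss = 0.5 * m_eff * v_rel**2 * (1.0 - e**2)
    return energy, ke_loss, e

energy, ke_loss, e = impact_energy()
print(f"dissipated={energy:.1f} J, kinetic-energy loss={ke_loss:.1f} J, e={e:.3f}")
```

Up to integration error the two energies agree, which is exactly the equality the paper's iterative parameter-tuning loop enforces.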
Procedia PDF Downloads 553
302 Solid Waste Pollution and the Importance of Environmental Planning in Managing and Preserving the Public Environment in Benghazi City and Its Surrounding Areas
Authors: Abdelsalam Omran Gebril
Abstract:
Pollution and solid waste are the most important environmental problems plaguing the city of Benghazi as well as other cities and towns in Libya. These problems are caused by the lack of environmental planning and sound environmental management. Environmental planning is very important at present for the development of projects that preserve the environment, therefore, the planning process should be prioritized over the management process. Pollution caused by poor planning and environmental management exists not only in Benghazi but also in all other Libyan cities. This study was conducted through various field visits to several neighborhoods and areas within Benghazi as well as its neighboring regions. Follow-ups in these areas were conducted from March 2013 to October 2013 and documented by photographs. The existing methods of waste collection and means of transportation were investigated. Interviews were conducted with relevant authorities, including the Environment Public Authority in Benghazi and the Public Service Company of Benghazi. The objective of this study is to determine the causes of solid waste pollution in Benghazi City and its surrounding areas. Results show that solid waste pollution in Benghazi and its surrounding areas is the result of poor planning and environmental management, population growth, and the lack of hardware and equipment for the collection and transport of waste from the city to the landfill site. One of the most important recommendations in this study is the development of a complete and comprehensive plan that includes environmental planning and environmental management to reduce solid waste pollution.
Keywords: solid waste, pollution, environmental planning, management, Benghazi, Libya
Procedia PDF Downloads 316
301 Bottleneck Modeling in Information Technology Service Management
Authors: Abhinay Puvvala, Veerendra Kumar Rai
Abstract:
A bottleneck situation arises when the outflow is less than the inflow in a pipe-like setup. A more practical interpretation of bottlenecks emphasizes the realization of Service Level Objectives (SLOs) at given workloads. Our approach detects two key aspects of bottlenecks: when and where. To identify 'when', we continuously poll certain key metrics such as resource utilization, processing time, request backlog and throughput at the system level. As the workload is gradually increased in discrete steps, a bottleneck situation arises when the slope of the expected sojourn time at a workload is greater than 'K' times the slope of the expected sojourn time at the previous workload step. 'K' defines the threshold condition and is computed from the system's service level objectives. The second aspect of our approach is to identify the location of the bottleneck. In multi-tier systems with a complex network of layers, locating the bottleneck that affects overall system performance is a challenging problem. We stage the system by increasing the workload incrementally to draw a correlation between load increase and system performance up to the point where the Service Level Objectives are violated. During the staging process, multiple metrics are monitored at the hardware and application levels, and correlations are drawn between these metrics and overall system performance. These correlations, along with the Service Level Objectives, are used to arrive at threshold conditions for each metric. Subsequently, the same method used to identify when a bottleneck occurs is applied to the metrics data with these threshold conditions to locate bottlenecks.
Keywords: bottleneck, workload, service level objectives (SLOs), throughput, system performance
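The 'when' criterion can be sketched directly from the slope condition above; the data values and helper name are ours:

```python
def detect_bottleneck(workloads, sojourn_times, k):
    """Flag the workload step where the slope of expected sojourn time exceeds
    K times the slope at the previous step; k derives from the system's SLOs."""
    slopes = [
        (sojourn_times[i] - sojourn_times[i - 1]) / (workloads[i] - workloads[i - 1])
        for i in range(1, len(workloads))
    ]
    for i in range(1, len(slopes)):
        if slopes[i - 1] > 0 and slopes[i] > k * slopes[i - 1]:
            return workloads[i + 1]   # workload at which the bottleneck appears
    return None

# staged load test: sojourn time grows gently, then blows up past ~500 req/s
loads = [100, 200, 300, 400, 500, 600, 700]
sojourn = [10.0, 10.5, 11.0, 11.6, 12.2, 16.0, 30.0]
print(detect_bottleneck(loads, sojourn, k=3.0))   # -> 600
```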
Procedia PDF Downloads 238
300 Corrosion Risk Assessment/Risk Based Inspection (RBI)
Authors: Lutfi Abosrra, Alseddeq Alabaoub, Nuri Elhaloudi
Abstract:
Corrosion processes in the Oil & Gas industry can lead to failures that are usually costly to repair, costly in terms of loss of contaminated product and environmental damage, and possibly costly in terms of human safety. This article describes the results of the corrosion review and criticality assessment done at Mellitah Gas (SRU unit) for pressure equipment and piping systems. The information gathered through the review was intended for developing a qualitative RBI study. The corrosion criticality assessment was carried out by applying company procedures and industry recommended practices such as API 571, API 580/581 and ASME PCC-3, which provide guidelines for establishing corrosion integrity assessment. The corrosion review is intimately related to the probability of failure (POF). During the corrosion study, the process units were reviewed by following the applicable process flow diagrams (PFDs) in the presence of Mellitah's personnel from process engineering, inspection, and corrosion/materials and reliability engineering. The expected corrosion damage mechanisms (internal and external) were identified, and the corrosion rate was estimated for every piece of equipment and corrosion loop in the process units. A combination of consequence and likelihood of failure was used to determine the corrosion risk. A qualitative consequence of failure (COF) was assigned to each individual item on three levels (High, Medium, and Low), based on the characteristics of the fluid in terms of its flammability, toxicity, and pollution. A qualitative probability of failure (POF) was applied to evaluate the internal and external degradation mechanisms, using a high-level point-based scale (0 to 10) for risk prioritization into Low, Medium, and High.
Keywords: corrosion, criticality assessment, RBI, POF, COF
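A qualitative POF × COF combination of the kind described can be sketched as a lookup; the band edges and matrix entries below are illustrative, not Mellitah's actual procedure:

```python
def cof_level(flammability, toxicity, pollution):
    """Qualitative consequence of failure: worst of the three fluid hazards."""
    order = {"Low": 0, "Medium": 1, "High": 2}
    return max((flammability, toxicity, pollution), key=order.get)

def pof_level(score):
    """Map the 0-10 degradation score to a Low/Medium/High likelihood band."""
    return "Low" if score <= 3 else "Medium" if score <= 6 else "High"

RISK_MATRIX = {  # (POF, COF) -> corrosion risk
    ("Low", "Low"): "Low",     ("Low", "Medium"): "Low",       ("Low", "High"): "Medium",
    ("Medium", "Low"): "Low",  ("Medium", "Medium"): "Medium", ("Medium", "High"): "High",
    ("High", "Low"): "Medium", ("High", "Medium"): "High",     ("High", "High"): "High",
}

pof = pof_level(score=7)
cof = cof_level("High", "Low", "Medium")
print(pof, cof, "->", RISK_MATRIX[(pof, cof)])   # High High -> High
```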
Procedia PDF Downloads 82
299 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C
Authors: Keaghan Brown
Abstract:
The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automates mutation introduction into the HIV-1 Integrase protein structure, calculates the gain and loss of polar interactions, and calculates the change in protein fold energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open source set of scripts designed to introduce and analyse the effects of mutations on the static protein structure as well as on the multi-conformational states obtained from molecular dynamic simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.
Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase
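A skeleton of an AMIA-style prioritization loop; every lower-case helper below is a stub we invented for illustration (the published pipeline wires in real structural tools and an MD engine instead):

```python
from dataclasses import dataclass

@dataclass
class VariantReport:
    mutation: str
    ddg_fold: float        # change in fold energy (kcal/mol)
    polar_gained: int
    polar_lost: int

def introduce_mutation(structure, mutation):      # stub: returns a mutant model
    return (structure, mutation)

def polar_interaction_delta(wild_type, mutant):   # stub: (gained, lost) contacts
    return 1, 2

def fold_energy(model):                           # stub: fold free energy
    return 0.0 if isinstance(model, str) else 1.4

def run_md_analysis(mutant):                      # stub: multi-conformational states
    pass

def prioritise(structure, mutations):
    reports = []
    for m in mutations:
        mutant = introduce_mutation(structure, m)
        gained, lost = polar_interaction_delta(structure, mutant)
        ddg = fold_energy(mutant) - fold_energy(structure)
        run_md_analysis(mutant)
        reports.append(VariantReport(m, ddg, gained, lost))
    # destabilised folds and large polar-contact losses float to the top
    return sorted(reports, key=lambda r: (-r.ddg_fold, -r.polar_lost))

print(prioritise("integrase.pdb", ["G140S", "Q148H"])[0].mutation)
```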
Procedia PDF Downloads 78
298 Design and Implementation of a Wireless System by Using Microcontrollers: Application for a Drive Acquisition System with Multiple Sensors
Authors: H. Fekhar
Abstract:
The design and implementation of an acquisition system using a radio frequency (RF) ASK module and PIC microcontrollers is proposed in this work. The paper covers both hardware and software design. The design is divided into two units, namely the sender MCU and the receiver. The system was designed to measure the temperatures of two furnaces and the pressure of a pneumatic process. The wireless transmitter unit uses the 433.95 MHz band, directly interfaced to a PIC18F4620 microcontroller. The sender unit consists of the temperature and pressure sensors, conditioning circuits, a keypad, a GLCD display and the RF module. A signal conditioner converts the output of the sensors into an electric quantity suitable for operation of the display and recording system. The measurement circuits are connected directly to a 10-bit multiplexed A/D converter, and a graphic liquid crystal display (GLCD) is used. The receiver RF module, connected to a second microcontroller, receives the signal via the RF receiver, decodes the address/data and reproduces the original data. The strategy adopted for establishing communication between the sender MCU and the receiver uses a specific "Header, Address and Data" protocol. The communication protocol dealing with transmission and reception has been successfully implemented. Some experimental results are provided to demonstrate the effectiveness of the proposed wireless system. This embedded system tracks the temperature and pressure signals reasonably well, with a small error.
Keywords: microcontrollers, sensors, graphic liquid crystal display, protocol, temperature, pressure
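A minimal sketch of "Header, Address and Data" framing in Python (the header byte value, length field and checksum are our assumptions; the paper itself targets PIC firmware):

```python
HEADER = 0xAA   # assumed start-of-frame byte; the paper does not give values

def encode_frame(address, data_bytes):
    """Build a Header/Address/Data frame with a 1-byte additive checksum."""
    frame = bytes([HEADER, address, len(data_bytes)]) + bytes(data_bytes)
    checksum = sum(frame) & 0xFF
    return frame + bytes([checksum])

def decode_frame(frame, my_address):
    """Return the payload, or None if corrupted or addressed elsewhere."""
    if frame[0] != HEADER or (sum(frame[:-1]) & 0xFF) != frame[-1]:
        return None
    if frame[1] != my_address:
        return None
    length = frame[2]
    return frame[3:3 + length]

# two temperatures + pressure packed as raw 10-bit ADC readings (2 bytes each)
tx = encode_frame(0x01, (0x2A, 0x03, 0x1F, 0x02, 0x3B, 0x01))
print(decode_frame(tx, my_address=0x01))
```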
Procedia PDF Downloads 461
297 Jurisdictional Issues between Competition Law and Data Protection Law in Protection of Privacy of Online Consumers
Authors: Pankhudi Khandelwal
Abstract:
The revenue models of digital giants such as Facebook and Google use targeted advertising, which requires huge amounts of consumer data. While data protection law deals with the protection of personal data, this data is acquired by the companies on the basis of consent, performance of a contract, or legitimate interests. This paper analyses the role that competition law can play in closing these loopholes to protect the data and privacy of online consumers. Digital markets have certain distinctive features, such as network effects and feedback loops, which give incumbents in these markets a first-mover advantage. This creates a situation where the winner takes all, creating entry barriers and concentration in the market. It has also been observed that this dominant position is then used by undertakings for leveraging into other markets. This can harm consumers in the form of reduced privacy, less choice, and stifled innovation, as seen in the Facebook Cambridge Analytica, Google Shopping, and Google Android cases. The article therefore aims to provide a legal framework in which data protection law and competition law can come together to provide balanced regulation of digital markets. The issue has become more relevant in light of the Facebook decision by the German competition authority, which held that Facebook had abused its dominant position by not complying with data protection rules, constituting an exploitative practice. The paper looks into the jurisdictional boundaries within which the data protection and competition authorities can work, and suggests ex ante regulation through data protection law and ex post regulation through competition law. It further suggests a change in the consumer welfare standard so that harm to privacy is considered an indicator of low quality.
Keywords: data protection, dominance, ex ante regulation, ex post regulation
Procedia PDF Downloads 184
296 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels
Authors: Tal Remez, Or Litany, Alex Bronstein
Abstract:
The pursuit of smaller pixel sizes at ever-increasing resolution in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction-limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has recently been embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, which must produce a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty, but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data fitting term with a sparse synthesis prior. We also show an efficient hardware-friendly real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
Keywords: binary pixels, maximum likelihood, neural networks, sparse coding
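In generic notation, an ML data-fitting term with a sparse synthesis prior leads to an objective of the following shape; the paper's exact binary/Poisson likelihood model and dictionary are not reproduced here:

```latex
\hat{\boldsymbol{\alpha}} \;=\; \arg\min_{\boldsymbol{\alpha}}
  \; -\log p\!\left(\mathbf{b} \,\middle|\, \mathbf{D}\boldsymbol{\alpha}\right)
  \;+\; \lambda \,\lVert \boldsymbol{\alpha} \rVert_{1},
\qquad
\hat{\mathbf{x}} \;=\; \mathbf{D}\hat{\boldsymbol{\alpha}}
```

Here b are the binary pixel measurements, D is a synthesis dictionary, and λ trades the likelihood fit against sparsity; the "hardware-friendly approximation" amounts to unrolling the iterative solver of this problem into a fixed feed-forward computation.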
Procedia PDF Downloads 204
295 Iron Response Element-mRNA Binding to Iron Response Protein: Metal Ion Sensing
Authors: Mateen A. Khan, Elizabeth J. Theil, Dixie J. Goss
Abstract:
Cellular iron homeostasis is accomplished by the coordinated regulated expression of iron uptake, storage, and export. Iron regulates the translation of ferritin and mitochondrial aconitase iron responsive element (IRE) mRNAs through their interaction with iron regulatory proteins (IRPs), and iron increases the biosynthesis of the proteins encoded by IRE-mRNAs. The noncoding IRE-mRNA structure, approximately 30 nt, folds into a stem loop to control the synthesis of proteins involved in iron trafficking, cell cycling, and nervous system function. Fluorescence anisotropy measurements showed the presence of one binding site on IRP1 for the ferritin and mitochondrial aconitase IRE-mRNAs. Scatchard analysis gave binding affinities (Ka) for ferritin and mitochondrial aconitase IRE-mRNA of 68.7 x 10⁶ M⁻¹ and 9.2 x 10⁶ M⁻¹, respectively. In order to understand the relative importance of equilibrium and stability, we further report the contribution of electrostatic interactions to the overall binding of the two IRE-mRNAs to IRP1. The fluorescence quenching of the IRP1 protein was measured at different ionic strengths. The binding affinity of IRE-mRNA for IRP1 decreases with increasing ionic strength, but the number of binding sites is independent of ionic strength. Such results indicate a differential contribution of electrostatics to the interaction of IRE-mRNA with IRP1, possibly related to helix bending or stem interactions and an overall conformational change. The selective destabilization of ferritin and mitochondrial aconitase RNA/protein complexes reported here explains in part the quantitative differences in signal response to iron in vivo and indicates possible new regulatory interactions.
Keywords: IRE-mRNA, IRP1, binding, ionic strength
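The Scatchard analysis mentioned follows the standard linearization, with ν the average number of RNA molecules bound per IRP1 and [L] the free RNA concentration:

```latex
\frac{\nu}{[L]} \;=\; n K_a \;-\; \nu K_a
```

A plot of ν/[L] against ν is a line of slope −Kₐ with abscissa intercept n; the reported values correspond to Kₐ = 68.7 × 10⁶ M⁻¹ (ferritin IRE) and 9.2 × 10⁶ M⁻¹ (aconitase IRE) with n ≈ 1, matching the single binding site seen by anisotropy.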
Procedia PDF Downloads 130
294 Drivers of Deforestation in the Colombian Amazon: An Empirical Causal Loop Diagram of Food Security and Land-Use Change
Authors: Jesica López, Deniz Koca, Asaf Tzachor
Abstract:
The historic 2016 peace accord between the Colombian government and the Revolutionary Armed Forces of Colombia (FARC) had no strong mechanism for managing changes to land use and the environment. Since the end of the 60-year conflict in Colombia, large areas of forest in the Amazon region have been rapidly converted to agricultural uses, most recently cattle ranching. This suggests that the peace agreement presents a threat to the conservation of the country's rainforest. We analyze the effects of cattle ranching as a driver and accelerator of deforestation from a systemic perspective, focusing on two key leverage points: the legal and illegal activities involved in cattle ranching practices. We map the inherent dynamic complexity of deforestation, including factors such as land policy instruments, the national strategy to tackle deforestation, the land-use nexus with Amazonian food systems, and the loss of biodiversity. Our results show that deforestation inside Colombian Protected Areas (PAs) in the Amazon region and the surrounding buffer areas has accelerated with the onset of peace. Using a systems analysis approach, we contextualize the competition for land between cattle ranching and the need to protect tropical forests and their biodiversity. We elaborate recommendations for future land-use management decision making and suggest the inclusion of an Amazonian food system, interconnecting and visualizing the synergies between the Sustainable Development Goals on climate action (SDG 13) and life on land (SDG 15).
Keywords: tropical rainforest, deforestation, sustainable land use, food security, Colombian Amazon
Procedia PDF Downloads 97
293 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools
Abstract:
Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in the digital environment, has become an important research topic. Investigating digital evidence such as computers, cell phones, hard disks, DVDs, etc., and reporting whether it contains any crime-related elements, is within the research scope of digital forensics. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data in the digital evidence that matches specified criteria and presenting it to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner; because the outcome depends on the examiner's experience, results may differ between cases and some evidence may be overlooked. In this study, a hash-based matching and digital evidence evaluation method is proposed, with the aim of automatically classifying evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
Keywords: block matching, digital evidence, hash list, evaluation of digital evidence
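The core of hash-based block matching is straightforward; a minimal sketch, where the block size and hash choice are ours rather than the paper's:

```python
import hashlib

BLOCK_SIZE = 4096   # illustrative block length

def block_hashes(path, block_size=BLOCK_SIZE):
    """SHA-256 hash of every fixed-size block in a raw evidence image."""
    hashes = []
    with open(path, "rb") as img:
        while block := img.read(block_size):
            hashes.append(hashlib.sha256(block).hexdigest())
    return hashes

def match_evidence(image_path, known_hash_list):
    """Return byte offsets of blocks whose hashes appear in a known
    (crime-related) hash list."""
    known = set(known_hash_list)
    return [i * BLOCK_SIZE
            for i, h in enumerate(block_hashes(image_path))
            if h in known]
```

Because matching reduces to set membership on precomputed digests, a 1 TB image can be triaged at disk-read speed instead of depending on manual review.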
Procedia PDF Downloads 255
292 Production of New Hadron States in Effective Field Theory
Authors: Qi Wu, Dian-Yong Chen, Feng-Kun Guo, Gang Li
Abstract:
In the past decade, a growing number of new hadron states have been observed; these are dubbed XYZ states in the heavy quarkonium mass regions. In this work, we present our study of the production of some new hadron states. In particular, we investigate the processes Υ(5S,6S)→ Zb(10610)/Zb(10650)π, Bc→ Zc(3900)/Zc(4020)π and Λb→ Pc(4312)/Pc(4440)/Pc(4457)K. (1) For the production of Zb(10610)/Zb(10650) from Υ(5S,6S) decay, two types of bottom-meson loops were discussed within a nonrelativistic effective field theory. We found that the loop contributions in which all intermediate states are S-wave ground state bottom mesons are negligible, while the loops with one bottom meson being the broad B₀* or B₁' resonance could provide the dominant contributions to Υ(5S)→ Zb⁽'⁾π. (2) For the production of Zc(3900)/Zc(4020) from Bc decay, the branching ratios of Bc⁺→ Zc(3900)⁺π⁰ and Bc⁺→ Zc(4020)⁺π⁰ are estimated to be of order 10⁻⁴ and 10⁻⁷, respectively, in an effective Lagrangian approach. The large production rate of Zc(3900) could provide an important source of the Zc(3900) production from the semi-exclusive decay of b-flavored hadrons reported by the D0 Collaboration, which can be tested by exclusive measurements at LHCb. (3) For the production of Pc(4312), Pc(4440) and Pc(4457) from Λb decay, the ratios of the branching fractions of Λb→ Pc K were predicted in a molecular scenario using an effective Lagrangian approach, and depend only weakly on our model parameter. We also find that the ratios of the products of the branching fractions of Λb→ Pc K and Pc→ J/ψ p can be well interpreted in the molecular scenario. Moreover, the estimated branching fractions of Λb→ Pc K are of order 10⁻⁶, which could be tested by further measurements by the LHCb Collaboration.
Keywords: effective Lagrangian approach, hadron loops, molecular states, new hadron states
Procedia PDF Downloads 133
291 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too heavy for price reduction. The SAE J2716 SENT (single edge nibble transmission) protocol transmits direct digital waveforms instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, a configuration register, a tick period generator, a CRC (cyclic redundancy code) generator/checker, and TX/RX (transmission/reception) buffers. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator derives tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffers store transmitted/received data. The designed SENT interface can send or receive digital data at 25–65 kbps with a 3 µs tick. Synthesized in a 0.18 µm fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
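As an aid to the CRC generator/checker description, here is a bitwise CRC-4 over the data nibbles using the polynomial x⁴+x³+x²+1 and seed 0b0101 commonly quoted for J2716; validate against the standard's lookup-table implementation and test vectors for the revision in use before relying on it:

```python
def sent_crc4(data_nibbles, seed=0b0101, poly=0b1101):
    """Bitwise CRC-4 long division over SENT data nibbles.

    `poly` holds the low 4 bits of the full polynomial 0b11101
    (x^4 + x^3 + x^2 + 1); the x^4 term is implied by the shift-out.
    """
    crc = seed
    for nib in list(data_nibbles) + [0]:   # trailing zero nibble (newer revisions)
        for i in (3, 2, 1, 0):             # MSB first within each nibble
            top = (crc >> 3) & 1
            crc = ((crc << 1) & 0xF) | ((nib >> i) & 1)
            if top:
                crc ^= poly                # x^4 == x^3 + x^2 + 1 (mod polynomial)
    return crc

frame_data = [0x3, 0xA, 0x5, 0xC, 0x1, 0x7]   # six data nibbles (status excluded)
print(hex(sent_crc4(frame_data)))
```

Processing one nibble through this register is algebraically the same as the per-nibble table recurrence (crc' = table[crc] XOR nibble) that the standard's reference implementation uses, which is why a small lookup table suffices in the FPGA.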
Procedia PDF Downloads 305
290 A Numerical Study on Semi-Active Control of a Bridge Deck under Seismic Excitation
Authors: A. Yanik, U. Aldemir
Abstract:
This study investigates the benefits of implementing semi-active devices, relative to passive viscous damping, in the context of seismically isolated bridge structures. Since the intrinsically nonlinear nature of semi-active devices prevents the direct evaluation of Laplace transforms, frequency response functions are compiled from the computed time history response to sinusoidal and pulse-like seismic excitation. A simple semi-active control policy is evaluated alongside passive linear viscous damping and an optimal non-causal semi-active control strategy. The control strategy requires optimization, during which the Euler-Lagrange equations are solved numerically. The optimal closed-loop performance is evaluated for an idealized controllable dashpot. A simplified single-degree-of-freedom model of an isolated bridge is used as the numerical example, and two bridge cases are investigated: the bridge deck without the isolation bearing and the bridge deck with the isolation bearing. To compare the performance of the passive and semi-active control cases, frequency-dependent acceleration, velocity and displacement response transmissibility ratios Ta(w), Tv(w), and Td(w) are defined. To fully investigate the behavior of the structure subjected to the sinusoidal and pulse-type excitations, different damping levels are considered. Numerical results showed that, under external excitation, the bridge deck with semi-active control exhibited better structural performance than the passive bridge deck case.
Keywords: bridge structures, passive control, seismic, semi-active control, viscous damping
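Read in the usual way, the transmissibility ratios are the magnitudes of the response-to-ground-motion frequency responses (our notation, with x the deck response and x_g the ground excitation):

```latex
T_a(\omega) = \left|\frac{\ddot{x}(\omega)}{\ddot{x}_g(\omega)}\right|, \qquad
T_v(\omega) = \left|\frac{\dot{x}(\omega)}{\dot{x}_g(\omega)}\right|, \qquad
T_d(\omega) = \left|\frac{x(\omega)}{x_g(\omega)}\right|
```

For the nonlinear semi-active cases these are compiled pointwise from time-history simulations at each excitation frequency, since no closed-form transfer function exists.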
Procedia PDF Downloads 242
289 Analysis and Comparison of Prototypes of an Ergometric Step in a Multidisciplinary Design Process
Authors: M. B. Ricardo De Oliveira, A. Borghi-Silva, L. Di Thommazo, D. Braatz
Abstract:
Prototypes can be understood as representations of a product concept. Furthermore, prototyping is an important stage in product development and results in better team communication, decision making, testing and problem solving through feedback. Although recent studies suggest several prototyping methods for designers to choose from, these methods present different advantages, such as cost and time reduction, performance and fidelity, which should be taken into account during a product development project. In this multidisciplinary study, involving the areas of physiotherapy, engineering and computer science (hardware and software), we compared four prototypes developed for an ergometric step: a virtual prototype, a 3D printed prototype, a bricolage prototype and a prototype manufactured by a third-party company. These prototypes were evaluated in a comparative-qualitative approach for their contribution to the maturation of the product concept, the different prototyping methods used, and the advantages and disadvantages of each based on the product's design specifications (performance, safety, materials, cost, maintenance, usability, ergonomics and portability). Our results indicated that despite their overall advantages, all prototypes have limitations, making it crucial to have different methods of testing and interacting with the product. Additionally, the virtual and 3D printed prototypes were essential at early stages of the project due to their low cost and high-fidelity representation of the product, while the prototype manufactured by a third-party company and the bricolage prototype introduced functional tests in real scenarios, allowing more detailed evaluations. This study also resulted in a patent for an ergometric step.
Keywords: product design, product development, prototypes, step
Procedia PDF Downloads 117
288 Digital Forensic Exploration Framework for Email and Instant Messaging Applications
Authors: T. Manesh, Abdalla A. Alameen, M. Mohemmed Sha, A. Mohamed Mustaq Ahmed
Abstract:
Email and instant messaging applications are the foremost and most extensively used electronic communication methods in this era of information explosion. These applications are generally used for the exchange of information through several frontend applications from various service providers. Almost all such communications are now secured using SSL or TLS over HTTP. At the same time, cyber criminals and terrorists have started exchanging information using these methods. Since communication is encrypted end-to-end, tracing significant forensic details and the actual content of messages remain severe, largely unaddressed challenges for available forensic tools. These challenges seriously hamper the procurement of substantial evidence against such criminals from their working environments. This paper presents a vibrant forensic exploration and architectural framework which not only decrypts any communication or network session but also reconstructs the actual message contents of email as well as instant messaging applications. The framework can be effectively used on proxy servers and individual computers, and it aims to perform forensic reconstruction followed by analysis of webmail and ICQ messaging applications. This forensic framework is versatile: it is equipped with high-speed packet capturing hardware and a well-designed packet manipulating algorithm. It regenerates message contents over regular as well as SSL-encrypted SMTP, POP3 and IMAP protocols, and supports the forensic presentation procedure for the prosecution of cyber criminals by producing solid evidence of their actual communication in accordance with the laws of specific countries.
Keywords: forensics, network sessions, packet reconstruction, packet reordering
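Message regeneration of the kind described rests on reordering captured packets; a toy sketch of that step (ignoring sequence-number wrap-around and overlapping segments, which a real reconstructor must handle):

```python
def reassemble_tcp_stream(segments):
    """Reorder captured TCP segments by sequence number and drop retransmits.

    `segments` is an iterable of (seq_number, payload_bytes) tuples,
    e.g. parsed from a pcap capture.
    """
    stream, seen = bytearray(), set()
    for seq, payload in sorted(segments, key=lambda s: s[0]):
        if seq in seen or not payload:
            continue                  # duplicate (retransmission) or bare ACK
        seen.add(seq)
        stream += payload
    return bytes(stream)

parts = [(2000, b" world"), (1000, b"hello"), (2000, b" world")]
print(reassemble_tcp_stream(parts))   # b'hello world'
```

Only after the byte stream is restored in order can the SMTP/POP3/IMAP dialogue be parsed back into individual messages.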
Procedia PDF Downloads 344
287 Design of an Acoustic Imaging Sensor Array for Mobile Robots
Authors: Dibyendu Roy, V. Ramu Reddy, Parijat Deshpande, Ranjan Dasgupta
Abstract:
Imaging of underwater objects is primarily conducted by acoustic imagery due to the severe attenuation of electromagnetic waves in water. Underwater acoustic imagery has a varied range of significant applications, such as side-scan sonar and mine-hunting sonar. It also finds utility in other domains, such as imaging of body tissues via ultrasonography and non-destructive testing of objects. In this paper, we explore the feasibility of using active acoustic imagery in air and simulate phased array beamforming techniques available in the literature for various array designs, to arrive at a suitable acoustic sensor array design for a portable mobile robot that can detect the presence or absence of anomalous objects in a room. Multi-path reflection effects, especially in enclosed rooms, and environmental noise factors are currently not simulated and will be dealt with during the experimental phase. The related hardware is designed under the same feasibility criterion: the developed system must be deployable on a portable mobile robot. There is a trade-off between image resolution and range on the one hand and the array size, number of elements and imaging frequency on the other, which must be simulated iteratively to achieve the desired acoustic sensor array design. The designed acoustic imaging array system is to be mounted on a portable mobile robot and targeted for use in surveillance missions for intruder alerts and for imaging objects in dark and smoky scenarios where conventional optics-based systems do not function well.
Keywords: acoustic sensor array, acoustic imagery, anomaly detection, phased array beamforming
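As an illustration of the phased-array simulation mentioned, a frequency-domain delay-and-sum beamformer for a uniform linear array might look as follows; the geometry and sign conventions are ours, not the authors' code:

```python
import numpy as np

def delay_and_sum(signals, fs, d, c, theta_deg):
    """Steer a uniform linear array toward angle theta (broadside = 0 deg).

    signals: (num_elements, num_samples) time series from the transducers
    fs: sample rate [Hz], d: element spacing [m], c: speed of sound [m/s]
    Fractional delays are applied in the frequency domain.
    """
    num_el, n = signals.shape
    theta = np.radians(theta_deg)
    delays = np.arange(num_el) * d * np.sin(theta) / c   # seconds per element
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spectra = np.fft.rfft(signals, axis=1)
    phased = spectra * np.exp(2j * np.pi * freqs * delays[:, None])
    return np.fft.irfft(phased.sum(axis=0), n) / num_el

# scanning theta over -90..90 deg and recording output power gives the beam map
```

Sweeping the steering angle and plotting output power exposes the resolution/range trade-off the abstract describes: wider apertures and higher frequencies narrow the main lobe, at the cost of size and attenuation.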
Procedia PDF Downloads 409
286 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects
Authors: Ma Yuzhe, Burra Venkata Durga Kumar
Abstract:
The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware keeps rising to handle more complex processing and operation. The operating system, however, which provides the soul of the computer, stagnated for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. The Disk Operating System was too simple to support much innovation, so it was not a good choice either, and MacOS is a special operating system for Apple computers that cannot be widely used on personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of earlier operating systems and is built around a modular kernel, which makes its core architecture relatively powerful. Linux supports all Internet protocols, so it has very good network functionality. Linux supports multiple users, with each user's files isolated from the others, and it can multitask, running different programs independently at the same time. Linux is a completely open source operating system, so users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers. The Linux system is constantly upgraded and improved, and many different versions have been issued that are suitable for community use and commercial use. The Linux system has good security because it relies on a file partition system. However, due to constantly emerging vulnerabilities and hazards, the security of the operating system in use also needs more attention. This article focuses on the analysis and discussion of Linux security issues.
Keywords: Linux, operating system, system management, security
Procedia PDF Downloads 111
285 A 3D Bioprinting System for Engineering Cell-Embedded Hydrogels by Digital Light Processing
Authors: Jimmy Jiun-Ming Su, Yuan-Min Lin
Abstract:
Bioprinting has been applied to produce 3D cellular constructs for tissue engineering. Microextrusion printing is the most commonly used method; however, printing low-viscosity bioink is a challenge for it. Herein, we developed a new 3D printing system to fabricate cell-laden hydrogels via a DLP-based projector. The bioprinter is assembled from affordable equipment, including a stepper motor, a screw, an LED-based DLP projector, and open source computer hardware and software. The system can use low-viscosity, photo-polymerized bioink to fabricate 3D tissue mimics in a layer-by-layer manner. In this study, we used gelatin methacrylate (GelMA) as the bioink for stem cell encapsulation. To reinforce the printed construct, surface-modified hydroxyapatite was added to the bioink. We demonstrated that silanization of hydroxyapatite improves the crosslinking at the interface between hydroxyapatite and GelMA. The results showed that incorporating silanized hydroxyapatite into the bioink enhanced the mechanical properties of the printed hydrogel; in addition, the hydrogel had low cytotoxicity and promoted the differentiation of embedded human bone marrow stem cells (hBMSCs) and retinal pigment epithelium (RPE) cells. Moreover, this bioprinting system can generate microchannels inside the engineered tissues to facilitate the diffusion of nutrients. We believe this 3D bioprinting system has the potential to fabricate various tissues for clinical applications and regenerative medicine in the future.
Keywords: bioprinting, cell encapsulation, digital light processing, GelMA hydrogel
Procedia PDF Downloads 182
284 Setting Uncertainty Conditions Using Singular Values for Repetitive Control in State Feedback
Authors: Muhammad A. Alsubaie, Mubarak K. H. Alhajri, Tarek S. Altowaim
Abstract:
A repetitive controller designed to accommodate periodic disturbances via state feedback is discussed. Periodic disturbances can be represented by a time delay model in a positive feedback loop acting on the system output. Direct use of the small gain theorem solves the periodic disturbance problem by 1) isolating the delay model, 2) finding the overall system representation around the delay model, and 3) designing a feedback controller that ensures overall system stability and tracking error convergence. This paper addresses uncertainty conditions for the repetitive controller designed in state feedback, in either past error feedforward or current error feedback form, using singular values. The uncertainty investigation is based on the overall system representation and its associated stability condition; depending on the scheme used, an upper/lower limit on the weighting parameter is set. This creates a region that must not be exceeded when selecting the weighting parameter, which in turn assures performance improvement against system uncertainty. The repetitive control problem can be described in lifted form, which allows the use of singular value principles in setting the range for the weighting parameter selection. The simulation results obtained show tracking error convergence against dynamic system perturbation when the weighting parameter is chosen within the range obtained. Simulation results also show the advantage of using the weighting parameter compared to the case where it is omitted.
Keywords: model mismatch, repetitive control, singular values, state feedback
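The underlying robust-stability test is the small gain condition; in standard notation (ours), with M the nominal lifted system seen by the uncertainty Δ:

```latex
\bar{\sigma}\!\left( M(e^{j\omega}) \right)\,
\bar{\sigma}\!\left( \Delta(e^{j\omega}) \right) \;<\; 1
\qquad \forall\, \omega \in [0, 2\pi)
```

In the lifted repetitive-control setting, bounding the maximum singular value of M as a function of the weighting parameter is what yields the admissible upper/lower limit on that parameter.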
Procedia PDF Downloads 156
283 Optimization of Reliability Test Plans: Increase Wafer Fabrication Equipment Uptime
Authors: Swajeeth Panchangam, Arun Rajendran, Swarnim Gupta, Ahmed Zeouita
Abstract:
Semiconductor processing chambers tend to operate in controlled but aggressive conditions (chemistry, plasma, high temperature, etc.). Owing to this, the design of this equipment requires developing robust and reliable hardware and software. Any equipment downtime due to reliability issues can have cost implications both for customers, in terms of tool downtime (reduced throughput), and for equipment manufacturers, in terms of high warranty costs and a customer trust deficit. A thorough reliability assessment of critical parts and a plan for preventive maintenance/replacement schedules need to be in place before tool shipment. This helps to save significant warranty costs and tool downtime in the field. However, designing a proper reliability test plan that accurately demonstrates reliability targets with the proper sample size and test duration is quite challenging. This is mainly because components can fail in different failure modes that fit different Weibull beta value distributions. Without an a priori Weibull beta for the failure mode under consideration, resources are always over- or under-utilized, which eventually ends in false-positive or false-negative estimates. This paper proposes a methodology to design a reliability test plan with optimal sample size, test duration, or both, independent of the a priori Weibull beta. This methodology can be used in demonstration tests and can be extended to accelerated life tests to further decrease sample size and test duration.
Keywords: reliability, stochastics, preventive maintenance
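To see why the a priori Weibull beta matters, consider the standard zero-failure (success-run) demonstration plan; this is not the authors' optimized method, but it shows the beta dependence directly:

```python
import math

def zero_failure_sample_size(reliability, confidence, t_test, t_demo, beta):
    """Units needed in a zero-failure Weibull demonstration test:

        n = ln(1 - C) / ( (t_test / t_demo)**beta * ln(R) )

    beta is the Weibull shape; testing longer than the demonstration time
    (t_test > t_demo) trades sample size for test duration.
    """
    ratio = (t_test / t_demo) ** beta
    return math.ceil(math.log(1.0 - confidence) / (ratio * math.log(reliability)))

# demonstrate R = 0.90 at 90% confidence
print(zero_failure_sample_size(0.90, 0.90, t_test=1.0, t_demo=1.0, beta=2.0))  # 22
print(zero_failure_sample_size(0.90, 0.90, t_test=2.0, t_demo=1.0, beta=2.0))  # 6
```

A wrong beta guess shifts the required n substantially, which is exactly the over/under-utilization of resources the abstract warns about.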
Procedia PDF Downloads 17
282 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and improved the stability of the system significantly in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
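The real DAQ Debugger is embedded in the C++ iFDAQ processes; the minimal Python sketch below only mirrors the signal-handling idea (POSIX signals, hypothetical file name):

```python
import datetime
import faulthandler
import signal
import traceback

def install_debug_reporter(report_path="daq_debug_report.log"):
    """Sketch of a signal-based debug reporter.

    Fatal signals dump stack traces to a report file, and SIGUSR1 produces an
    on-demand report without stopping or slowing the monitored process.
    """
    log = open(report_path, "a")
    faulthandler.enable(file=log)          # reports on SIGSEGV, SIGFPE, SIGABRT, ...

    def report(signum, frame):
        log.write(f"\n--- report {datetime.datetime.now()} (signal {signum}) ---\n")
        traceback.print_stack(frame, file=log)
        log.flush()

    signal.signal(signal.SIGUSR1, report)  # no impact on normal process flow
```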
Procedia PDF Downloads 284
281 Color Image Compression/Encryption/Contour Extraction using 3L-DWT and SSPCE Method
Authors: Ali A. Ukasha, Majdi F. Elbireki, Mohammad F. Abdullah
Abstract:
Data security is needed in data transmission, storage, and communication. This paper is divided into two parts. The work deals with color images, which are decomposed into red, green and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling process. All channels are then encrypted separately using a key image of the same size as the original, generated using private keys and modulo operations. X-OR and modulo operations are performed between the encrypted channel images to change the image pixel values. The contours extracted from the recovered color images can be obtained with an acceptable level of distortion using the single step parallel contour extraction (SSPCE) method. Experiments have demonstrated that the proposed algorithm can fully encrypt 2D color images and completely reconstruct them without any distortion. It is also shown that the analyzed algorithm has extremely high security against attacks such as salt-and-pepper noise and JPEG compression. This proves that color images can be protected at a higher security level. The presented method has an easy hardware implementation and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
Keywords: SSPCE method, image compression, salt-and-pepper attacks, bit-plane decomposition, Arnold transform, color image, wavelet transform, lossless image encryption
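The Arnold transform step on the red channel amounts to a pixel permutation; a sketch of the forward and inverse maps (the row/column convention is ours):

```python
import numpy as np

def arnold_scramble(channel, iterations=1):
    """Arnold cat map on a square channel: (x, y) -> (x + y, x + 2y) mod N."""
    n = channel.shape[0]
    assert channel.shape[0] == channel.shape[1], "Arnold map needs a square image"
    out = channel
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                scrambled[(x + y) % n, (x + 2 * y) % n] = out[x, y]
        out = scrambled
    return out

def arnold_unscramble(channel, iterations=1):
    """Inverse map: (x, y) -> (2x - y, y - x) mod N."""
    n = channel.shape[0]
    out = channel
    for _ in range(iterations):
        restored = np.empty_like(out)
        for x in range(n):
            for y in range(n):
                restored[(2 * x - y) % n, (y - x) % n] = out[x, y]
        out = restored
    return out

red = np.arange(16, dtype=np.uint8).reshape(4, 4)
assert np.array_equal(arnold_unscramble(arnold_scramble(red, 3), 3), red)
```

Because the map is a bijection on the pixel grid, applying the inverse the same number of times restores the channel exactly, which is what makes the overall scheme lossless.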
Procedia PDF Downloads 520