Search results for: time consuming
17263 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-By-Wire ECU Development
Authors: Ananchai Ukaew, Choopong Chauypen
Abstract:
Design concepts of real-time embedded systems can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phase to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire, or DBW, algorithm for an electronic control unit, or ECU, was presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU function concepts can be implemented in the vehicle system to improve electric vehicle, or EV, conversion drivability. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, a testing system was employed to support the evaluation of conceptual DBW ECU functions. For the current setup, the system consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network, or CAN, protocol. The vehicle models and CAN bus interface were both implemented as real-time applications, where ECU and CAN protocol functionality were verified against the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters for conceptual system or software algorithm development.
Keywords: drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system
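A minimal sketch of the kind of CAN message exchange such an in-the-loop bench performs, assuming the python-can package and its virtual bus backend; the arbitration ID, signal scaling and channel name are invented for illustration and are not from the study:

```python
import can

# Two endpoints on the same virtual channel: the DBW ECU side and the
# real-time vehicle-model side of the in-the-loop bench.
ecu_bus = can.interface.Bus(bustype="virtual", channel="dbw_demo")
model_bus = can.interface.Bus(bustype="virtual", channel="dbw_demo")

pedal_percent = 42.5                                  # hypothetical pedal position
raw = int(pedal_percent / 100 * 0xFFFF)               # assumed 16-bit scaling
msg = can.Message(arbitration_id=0x0C1,               # hypothetical DBW frame ID
                  data=[raw >> 8, raw & 0xFF],
                  is_extended_id=False)
ecu_bus.send(msg)

rx = model_bus.recv(timeout=1.0)                      # vehicle model reads the frame
if rx is not None:
    decoded = ((rx.data[0] << 8) | rx.data[1]) / 0xFFFF * 100
    print(f"pedal position received over CAN: {decoded:.1f} %")

ecu_bus.shutdown()
model_bus.shutdown()
```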
Procedia PDF Downloads 348
17262 Polyacrylates in Poly (Lactic Acid) Matrix, New Biobased Polymer Material
Authors: Irena Vuković-Kwiatkowska, Halina Kaczmarek
Abstract:
Poly (lactic acid) is a well-known polymer, often called a green material because of its origin (renewable resources) and biodegradability. This biopolymer is frequently used in the packaging industry. Poor resistance to gas permeation is the main disadvantage of poly (lactic acid). The permeability of gases and vapor through films used for packages and bottles generally should be very low to prolong product shelf-life. We propose an innovative method of modifying the gas barrier of PLA using electromagnetic radiation in the ultraviolet range. Poly (lactic acid) (PLA) and multifunctional acrylate monomers were mixed in different compositions. The final films were obtained by a photochemical reaction (photocrosslinking). We tested the permeability of these films to water vapor and carbon dioxide. Their resistance to UV radiation was also studied. The samples were conditioned in activated sludge and in natural soil to test their biodegradability. This innovative method of PLA modification expands its range of use and can reduce the future costs of waste management that result from consuming materials such as PET and HDPE. Implementation of our material for packaging will contribute to protecting the environment from the harmful effects of materials made from PET or other plastics that are extremely difficult to biodegrade.
Keywords: interpenetrating polymer network, packaging films, photocrosslinking, polyacrylates dipentaerythritol pentaacrylate DPEPA, poly (lactic acid), polymer biodegradation
Procedia PDF Downloads 477
17261 Mathematical Modeling and Analysis of Forced Vibrations in Micro-Scale Microstretch Thermoelastic Simply Supported Beam
Authors: Geeta Partap, Nitika Chugh
Abstract:
The present paper deals with the flexural vibrations of homogeneous, isotropic, generalized micropolar microstretch thermoelastic thin Euler-Bernoulli beam resonators due to an exponential time-varying load. Both axial ends of the beam are assumed to be simply supported. The governing equations have been solved analytically by applying the Laplace transform technique twice, with respect to the time and space variables respectively. The inversion of the Laplace transform in the time domain has been performed using the calculus of residues to obtain the deflection. The analytical results have been analyzed numerically with the help of MATLAB software for a magnesium-like material. Graphical representations and interpretations are discussed for the deflection of the beam under the simply supported boundary condition and for distinct considered values of time and space. The obtained results are easy to implement in engineering analysis and the design of resonators (sensors), modulators, and actuators.
Keywords: microstretch, deflection, exponential load, Laplace transforms, residue theorem, simply supported
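For orientation, the classical (uncoupled) Euler-Bernoulli flexural relation underlying such beam models is recalled below; the paper's microstretch thermoelastic formulation adds coupling terms not shown here:

```latex
% Classical Euler-Bernoulli flexural equation (w = transverse deflection,
% E = Young's modulus, I = second moment of area, rho = density,
% A = cross-sectional area, q = transverse load):
\[
  EI\,\frac{\partial^{4} w(x,t)}{\partial x^{4}}
  + \rho A\,\frac{\partial^{2} w(x,t)}{\partial t^{2}} = q(x,t),
  \qquad
  w = \frac{\partial^{2} w}{\partial x^{2}} = 0 \ \text{at } x = 0, L
\]
% (simply supported ends), with q(x,t) an exponentially time-varying load here.
```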
Procedia PDF Downloads 307
17260 Synthesis of Zeolites from Bauxite and Kaolin: Effect of Synthesis Parameters on Competing Phases
Authors: Bright Kwakye-Awuah, Elizabeth Von-Kiti, Isaac Nkrumah, Baah Sefa-Ntiri, Craig D. Williams
Abstract:
Bauxite and kaolin from the Ghana Bauxite Company mine site were used to synthesize zeolites. Bauxite served as the alumina source and kaolin as the silica source. Synthesis variations included varying the aging time at constant crystallization time and varying the crystallization time at constant aging time. Characterization techniques such as X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDX) and Fourier transform infrared spectroscopy (FTIR) were employed to characterize the raw samples as well as the synthesized samples. The results obtained showed that the transformations that occurred and the phase of the resulting products were governed by the aging time, crystallization time, alkaline concentration and Si/Al ratio of the system. Zeolites A, X, Y, analcime, sodalite, and ZK-14 were some of the phases achieved. Zeolite LTA was achieved with short crystallization times of 3, 5, 18 and 24 hours and a maximum aging of 24 hours. Zeolite LSX was synthesized with 24 hr aging followed by 24 hr hydrothermal treatment, whilst zeolite Y crystallized after 48 hr of aging and 24 hr crystallization. Prolonged crystallization time produced a mixed-phase product. Prolonged aging times, on the other hand, did not yield any zeolite, as the sample remained amorphous. Increasing the alkaline content of the reaction mixture above 5 M introduced a sodalite phase in the final product. The properties of the final products were comparable to zeolites synthesized from pure chemical reagents.
Keywords: bauxite, kaolin, aging, crystallization, zeolites
Procedia PDF Downloads 218
17259 Educating Children with the Child-Friendly Smartphone Operation System
Authors: Wildan Maulana Wildan, Siti Annisa Rahmayani Icha
Abstract:
Nowadays, advances in information technology are needed by everyone to ease their work, but it is also worth introducing these technological advances into the world of children. Before technology grew so rapidly, children were busy with various traditional games and had a high level of socialization. Since its arrival, almost all children spend most of their time playing with gadgets, which can affect their education and change their character and personality. However, children cannot be separated from technology either, because technology broadens children's knowledge and insight. Since neither the world nor children can be separated from advances in technology, a child-friendly smartphone operating system should be developed. Such an operating system is able to filter content that is not suitable for children; it also includes reminders for study time, prayer time and play time, as well as interactive content that supports the development of children's education and character. Children need technology, and there are several ways to introduce it to them. We must look at the characteristics of children in different environments. In this way, advances in technology can benefit children and their parents, and educators do not have to worry about technological progress. We should take advantage of advances in technology as well as possible.
Keywords: information technology, smartphone operating system, education, character
Procedia PDF Downloads 511
17258 Mixing Time: Influence on the Compressive Strength
Authors: J. Alvarez Muñoz, Dominguez Lepe J. A.
Abstract:
A suitable concrete mixing time allows a homogeneous mass to form, a quality that leads to greater compressive strength and durability. Although there are recommendations such as the ASTM C94 standard, which specifies the time and the minimum and maximum number of drum revolutions for good-quality ready-mixed concrete, the specific behavior of concrete mixed on site under variable mixing times is unknown. This study evaluated the behavior of a structural mix design of f'c = 250 kg/cm², prepared on site with limestone aggregate in a warm sub-humid climate and subjected to different mixing times. Based on the ASTM C94 recommendation for ready-mixed concrete, different mixing times were set at 70, 90, 100, 110, 120 and 140 total revolutions. A field study in which 14 construction sites using structural concrete made on site were observed allowed the reference mixture to be set at 24 revolutions. A hand-fed concrete mixer with a drum speed of 28 RPM was used to produce the concrete; the corrected w/c ratio was 0.36, with a slump of 5-6 cm, for all mixtures. The compressive strength tests were performed at 3, 7, 14, and 28 days. The most notable results show strength increases of 8 to 17 percent between the 24- and 70-revolution mixtures and of 3 to 8 percent between the 70- and 90-revolution mixtures. Increasing the number of revolutions to 110, 120 and 140 reduced the compressive strength by 0.5 to 8 percent. Regarding consistency, the mixtures had a slump of 5 cm at 24, 70 and 90 revolutions and less than 5 cm from 100 revolutions onward. Clearly, mixtures made with more than 100 revolutions show a decrease not only in compressive strength but also in workability.
Keywords: compressive strength, concrete, mixing time, workability
Procedia PDF Downloads 395
17257 R Software for Parameter Estimation of Spatio-Temporal Model
Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan
Abstract:
In this paper, we propose an application package to estimate the parameters of spatiotemporal models based on multivariate time series analysis using the open-source R software. We build packages mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary per location. We use the Ordinary Least Squares (OLS) method and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from the volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. The R software is very user-friendly; it makes calculation easier, and data processing is accurate and fast. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. Therefore, the R program under Windows can be developed further for both theoretical studies and applications.
Keywords: GSTAR Model, MAPE, OLS method, oil production, R software
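A minimal sketch of the estimation idea, assuming a GSTAR(1;1)-type model fitted by OLS per location and scored with MAPE on synthetic data; the spatial weight matrix and series are placeholders, and the sketch is in Python rather than the R package described:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 120, 3                                  # time points, locations
W = np.array([[0.0, 0.5, 0.5],                 # assumed row-standardised weights
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])
Z = 100 + rng.normal(size=(T, N)).cumsum(axis=0)   # placeholder spatio-temporal data

# GSTAR(1;1): Z_t = diag(phi0) Z_{t-1} + diag(phi1) W Z_{t-1} + e_t
Y = Z[1:]                                      # responses, shape (T-1, N)
X1, X2 = Z[:-1], Z[:-1] @ W.T                  # own lag and spatially lagged lag

phi0, phi1 = np.zeros(N), np.zeros(N)
for i in range(N):                             # OLS separately for each location
    A = np.column_stack([X1[:, i], X2[:, i]])
    phi0[i], phi1[i] = np.linalg.lstsq(A, Y[:, i], rcond=None)[0]

fitted = X1 * phi0 + X2 * phi1
mape = np.mean(np.abs((Y - fitted) / Y)) * 100  # mean absolute percentage error
print("phi0:", phi0.round(3), "phi1:", phi1.round(3), "MAPE(%):", round(mape, 2))
```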
Procedia PDF Downloads 240
17256 Modeling a Closed Loop Supply Chain with Continuous Price Decrease and Dynamic Deterministic Demand
Authors: H. R. Kamali, A. Sadegheih, M. A. Vahdat-Zad, H. Khademi-Zare
Abstract:
In this paper, a single-product, multi-echelon, multi-period closed loop supply chain is surveyed, including a variety of costs, time conditions, and capacities, to plan and determine the values and timing of component procurement, production, distribution, recycling and disposal, especially for high-tech products that undergo decreasing production cost and sale price over time. For this purpose, the mathematical model of the problem, which is a kind of mixed-integer linear program, is presented, and it is finally proved that the problem belongs to the category of NP-hard problems.
Keywords: closed loop supply chain, continuous price decrease, NP-hard, planning
Procedia PDF Downloads 362
17255 Quality Evaluation of Backfill Grout in Tunnel Boring Machine Tail Void Using Impact-Echo (IE): Short-Time Fourier Transform (STFT) Numerical Analysis
Authors: Ju-Young Choi, Ki-Il Song, Kyoung-Yul Kim
Abstract:
During Tunnel Boring Machine (TBM) tunnel excavation, backfill grout should be injected after the installation of the segment lining to ensure the stability of the tunnel and to minimize ground deformation. If grouting is not sufficient to fill the gap between the segments and the rock mass, hydraulic pressures occur in the void, which can negatively influence the stability of the tunnel. Recently, the tendency to replace the drill-and-blast (NATM) method with the TBM tunnelling method has been increasing. However, there are only a few studies on the evaluation of backfill grout. This study evaluates the TBM tunnel backfill state using the Impact-Echo (IE) method. The three layers, segment-grout-rock mass, are simulated in FLAC 2D, an FDM-based software. The signals obtained from the numerical analysis and the IE test are analyzed by the Short-Time Fourier Transform (STFT) in the time domain, frequency domain, and time-frequency domain. The results of this study can be used to evaluate the quality of backfill grouting in the tail void.
Keywords: tunnel boring machine, backfill grout, impact-echo method, time-frequency domain analysis, finite difference method
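A minimal sketch of the STFT step on a synthetic decaying echo, assuming SciPy; the sampling rate and echo frequency are arbitrary choices, not values from the study:

```python
import numpy as np
from scipy.signal import stft

fs = 1_000_000                                    # sampling rate, Hz (assumed)
t = np.arange(0, 0.002, 1 / fs)                   # 2 ms record
echo = np.exp(-3000 * t) * np.sin(2 * np.pi * 20_000 * t)   # decaying 20 kHz echo
signal = echo + 0.05 * np.random.default_rng(1).normal(size=t.size)

f, tau, Zxx = stft(signal, fs=fs, nperseg=256, noverlap=192)
peak = np.unravel_index(np.abs(Zxx).argmax(), Zxx.shape)
print(f"dominant component ≈ {f[peak[0]] / 1000:.1f} kHz "
      f"around t ≈ {tau[peak[1]] * 1e3:.2f} ms")
```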
Procedia PDF Downloads 264
17254 A Machine Learning Pipeline for Real-Time Activity Detection on Low Computational Power Devices for Metaverse Applications
Authors: Amit Kumar, Amanpreet Chander, Ashish Sahani
Abstract:
This paper presents our recent work on real-time human activity detection based on the MediaPipe pipeline and machine learning algorithms. The proposed system can detect human activities, including running, jumping, squatting, bending to the left or right, and standing still. This is a robust solution for developing yoga, dance, metaverse, and fitness applications that check for pose correctness without any additional monitoring, such as a personal trainer. The MediaPipe solution offers an open-source, cross-platform framework that utilizes a two-step detector-tracker ML pipeline for live detection of key landmarks on the body, which can be used for motion data collection. The prediction of real-time poses uses a variety of machine learning techniques and different types of analysis. Without relying primarily on powerful desktop environments for inference, our method achieves real-time performance on the majority of contemporary mobile phones, desktops/laptops, in Python, or even on the web. Experimental results show that our method outperforms the existing method in terms of accuracy and real-time capability, achieving an accuracy of 99.92% on the testing datasets.
Keywords: human activity detection, MediaPipe, machine learning, metaverse applications
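A minimal sketch of landmark extraction with the legacy MediaPipe "solutions" interface feeding a generic classifier; API details may differ between MediaPipe versions, and the classifier and labels are placeholders, not the authors' pipeline:

```python
import cv2
import mediapipe as mp
import numpy as np

mp_pose = mp.solutions.pose

def frame_to_features(frame_bgr, pose):
    """Flatten (x, y, z, visibility) of the 33 pose landmarks, or return None."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = pose.process(rgb)
    if result.pose_landmarks is None:
        return None
    return np.array([[lm.x, lm.y, lm.z, lm.visibility]
                     for lm in result.pose_landmarks.landmark]).flatten()

with mp_pose.Pose(static_image_mode=False) as pose:
    cap = cv2.VideoCapture(0)                 # webcam frame as a stand-in input
    ok, frame = cap.read()
    if ok:
        features = frame_to_features(frame, pose)
        if features is not None:
            # a pre-trained classifier (assumed) would map this 132-value vector
            # to an activity label such as "squatting" or "standing still"
            print("feature vector length:", features.size)
    cap.release()
```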
Procedia PDF Downloads 177
17253 Visualization of Energy Waves via Airy Functions in Time-Domain
Authors: E. Sener, O. Isik, E. Eroglu, U. Sahin
Abstract:
The main idea is to solve the system of Maxwell’s equations in accordance with the causality principle to obtain the energy quantities via Airy functions in a hollow rectangular waveguide. We used the evolutionary approach to electromagnetics, which is an analytical time-domain method. The boundary-value problem for the system of Maxwell’s equations is reformulated in transverse and longitudinal coordinates. A self-adjoint operator is obtained, and the complete set of eigenvectors of the operator forms an orthonormal basis of the solution space. Hence, the sought electromagnetic field can be presented in terms of this basis. Within this representation, the scalar coefficients are governed by the Klein-Gordon equation. Ultimately, in this study, the time-domain waveguide problem is solved analytically in accordance with the causality principle. Moreover, graphical results are presented for the case where the energy and the surplus of energy for the time-domain waveguide modes are represented via Airy functions.
Keywords: Airy functions, Klein-Gordon equation, Maxwell’s equations, surplus of energy, wave boundary operators
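A minimal sketch of evaluating the Airy functions through which such modal amplitudes are expressed, assuming SciPy; the sample arguments are arbitrary, and this is not the waveguide solution itself:

```python
import numpy as np
from scipy.special import airy

# Airy functions solve y'' - x*y = 0 and appear in the modal amplitude expressions.
x = np.linspace(-10, 2, 7)
Ai, Aip, Bi, Bip = airy(x)        # Ai(x), Ai'(x), Bi(x), Bi'(x)
for xi, a, b in zip(x, Ai, Bi):
    print(f"x = {xi:6.2f}   Ai = {a: .5f}   Bi = {b: .5f}")
```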
Procedia PDF Downloads 369
17252 Economic Development Process: A Compartmental Analysis of a Model with Two Delays
Authors: Amadou Banda Ndione, Charles Awono Onana
Abstract:
In this paper, the compartmental approach is applied to build a macroeconomic model characterized by countries. We consider a total of N countries that are subdivided into three compartments according to their economic status: D(t) denotes the compartment of developing countries at time t, E(t) stands for the compartment of emerging countries at time t, while A(t) represents advanced countries at time t. The model describes the process of economic development and includes the notion of openness through collaborations between countries. Two delays appear in this model to describe the average time necessary for collaborations between countries to become efficient for their development process. Our model represents the different stages of development. It further gives the conditions under which a country can change its economic status and demonstrates the short-term positive effect of openness on economic growth. In addition, we investigate bifurcation by considering the delay as a bifurcation parameter and examine the onset and termination of Hopf bifurcations from a positive equilibrium. Numerical simulations are provided in order to illustrate the theoretical part and to support the discussion.
Keywords: compartmental systems, delayed dynamical system, economic development, fiscal policy, Hopf bifurcation
Procedia PDF Downloads 136
17251 Information Technology Pattern for Traceability to Increase the Exporting Efficiency of Thailand’s Orchid
Authors: Pimploi Tirastittam, Phutthiwat Waiyawuththanapoom, Manop Tirastittam
Abstract:
A traceability system is one of the tools that can ensure consumer confidence in a product, as it can trace the product back to its origin and reduce the operating cost of a recall. Nowadays, there are many technologies that can be applied to a traceability system and increase its efficiency, such as QR codes, barcodes, GS1 and GTIN. As a result, this research aims to study and design the information technology pattern that suits the traceability of Thailand’s orchid, because the orchid is a popular Thai export product to Japan, the USA, China, the Netherlands and Italy. This study will enhance the value of Thailand’s orchid and help prevent the unexpected occurrence of defective or damaged products. The traceability pattern received an IOC test from 12 experts from 4 fields of study: traceability, information technology, information and communication technology, and orchid export. The results of the in-depth interviews and questionnaire showed that the technology most compatible with the traceability system is the QR code. The mean score was 4.25 and the standard deviation 0.5, as the QR code is a new and user-friendly technology. The traceability system should run from the farm to the consumer in the consuming country, as it will enhance the quality level of the product and increase its value as well. The other outcome of this research is the supply chain model of Thailand’s orchid, along with the system architecture and working system diagram.
Keywords: exporting, information technology pattern, orchid, traceability
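A minimal sketch of encoding a trace record as a QR code, assuming the Python "qrcode" package; the GTIN, field names and lot identifiers are invented for illustration:

```python
import json
import qrcode

trace_record = {                              # all field values are invented
    "gtin": "08850000000001",                 # hypothetical GTIN of the orchid lot
    "farm_id": "FARM-042",
    "harvest_date": "2016-03-14",
    "packhouse": "PH-BKK-07",
    "export_lot": "EXP-2016-0311",
}
img = qrcode.make(json.dumps(trace_record))   # label printed at the farm
img.save("orchid_trace.png")                  # scanned at each later stage
print("QR code written to orchid_trace.png")
```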
Procedia PDF Downloads 224
17250 Relationships between Screen Time, Internet Addiction and Other Lifestyle Behaviors with Obesity among Secondary School Students in the Turkish Republic of Northern Cyprus
Authors: Ozen Asut, Gulifeiya Abuduxike, Imge Begendi, Mustafa O. Canatan, Merve Colak, Gizem Ozturk, Lara Tasan, Ahmed Waraiet, Songul A. Vaizoglu, Sanda Cali
Abstract:
Obesity among children and adolescents is one of the critical public health problems worldwide. Internet addiction is one of the sedentary behaviors that cause obesity due to excessive screen time and reduced physical activity. We aimed to examine the relationships of screen time, internet addiction and other lifestyle behaviors with obesity among high school students at the Near East College in Nicosia, Northern Cyprus. A cross-sectional study was conducted among 469 secondary school students, mean age 11.95 (SD, 0.81) years. A self-administered questionnaire was applied to assess screen time and lifestyle behaviors. The Turkish-adapted version of the short form of the Internet Addiction Test was used to assess internet addiction problems. Height and weight were measured to calculate BMI, which was classified based on the BMI percentiles for sex and age. Descriptive analysis, the Chi-Square test, and multivariate regression analysis were performed. Of all participants, 17.2% were overweight or obese, and 18.1% had internet addiction, while 40.7% reported screen time of more than two hours. After adjusting the analysis for age and sex, eating snacks while watching television (OR, 3.04; 95% CI, 1.28-7.21), self-perceived body weight (OR, 24.9; 95% CI, 9.64-64.25) and having a play station in the room (OR, 4.6; 95% CI, 1.85-11.42) were significantly associated with obesity. Screen time (OR, 4.68; 95% CI, 2.61-8.38; p=0.000) and having a computer in the bedroom (OR, 1.7; 95% CI, 1.01-2.87; p=0.046) were significantly associated with internet addiction, whereas parents' complaints regarding lengthy technology use (OR, 0.23; 95% CI, 0.11-0.46; p=0.000) were found to be a protective factor against internet addiction. Prolonged screen time, internet addiction, sedentary lifestyles, and reduced physical and social activities are interrelated, multi-dimensional factors that lead to obesity among children and adolescents. A family- and school-based integrated approach should be implemented to tackle obesity problems.
Keywords: adolescents, internet addiction, lifestyle, Northern Cyprus, obesity, screen time
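A minimal sketch of the type of analysis behind such adjusted odds ratios, assuming a logistic regression on invented data with statsmodels; variable names and effect sizes are placeholders, not the study's data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 400
df = pd.DataFrame({
    "screen_time_gt2h": rng.integers(0, 2, n),   # 1 = more than 2 h of screen time
    "age": rng.normal(12, 0.8, n),
})
logit_p = -2.0 + 1.1 * df["screen_time_gt2h"] + 0.05 * (df["age"] - 12)
df["obese"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["screen_time_gt2h", "age"]])
fit = sm.Logit(df["obese"], X).fit(disp=False)
or_est = np.exp(fit.params["screen_time_gt2h"])            # adjusted odds ratio
ci_low, ci_high = np.exp(fit.conf_int().loc["screen_time_gt2h"])
print(f"OR for >2 h screen time: {or_est:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```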
Procedia PDF Downloads 142
17249 Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator
Authors: Wedad Albalawi
Abstract:
Mathematical inequalities have been at the core of mathematical study and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The inequalities of Hardy, Littlewood and Pólya were the first significant compilation of this kind. Their work presents fundamental ideas, results and techniques, and it has had much influence on research in various branches of analysis. Since 1934, various inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated in terms of operators; in 1989, weighted Hardy inequalities were obtained for integration operators. Weighted estimates were then obtained for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. They were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have then been extended and improved by various integral operators. These inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Some inequalities involving Copson and Hardy inequalities on time scales have appeared, yielding new special versions of them. A time scale is defined as an arbitrary closed subset of the real numbers. The time-scale versions of these inequalities have received a lot of attention and have become a major field in both pure and applied mathematics. There are many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on double integrals to obtain new time-scale inequalities of Copson type driven by the Steklov operator. They will be applied in the solution of the Cauchy problem for the wave equation. The proof is carried out by introducing restrictions on the operator in several cases. In addition, the inequalities are obtained by using concepts from the time-scale setting, such as time-scale calculus, Fubini's theorem and Hölder's inequality.
Keywords: time scales, inequality of Hardy, inequality of Copson, Steklov operator
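For reference, the classical continuous Hardy inequality that the time-scale and Steklov-operator versions generalize (p > 1, f ≥ 0 measurable on (0, ∞)):

```latex
% Classical continuous Hardy inequality; the constant (p/(p-1))^p is sharp.
\[
  \int_0^{\infty} \left( \frac{1}{x}\int_0^{x} f(t)\,dt \right)^{p} dx
  \;\le\;
  \left( \frac{p}{p-1} \right)^{p} \int_0^{\infty} f(x)^{p}\,dx .
\]
```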
Procedia PDF Downloads 76
17248 Multi-Labeled Aromatic Medicinal Plant Image Classification Using Deep Learning
Authors: Tsega Asresa, Getahun Tigistu, Melaku Bayih
Abstract:
Computer vision is a subfield of artificial intelligence that allows computers and systems to extract meaning from digital images and video. It is used in a wide range of fields, including self-driving cars, video surveillance, medical diagnosis, manufacturing, law, agriculture, quality control, health care, facial recognition, and military applications. Aromatic medicinal plants are botanical raw materials used in cosmetics, medicines, health foods, essential oils, decoration, cleaning, and other natural health products for therapeutic and aromatic culinary purposes. These plants and their products not only serve as a valuable source of income for farmers and entrepreneurs but are also exported in exchange for valuable foreign currency. In Ethiopia, there is a lack of technologies for the classification and identification of aromatic medicinal plant parts and the disease types cured by aromatic medicinal plants. Farmers, industry personnel, academicians, and pharmacists find it difficult to identify plant parts and the disease types cured by plants before ingredient extraction in the laboratory. Manual plant identification is a time-consuming, labor-intensive, and lengthy process. Only a few studies have been conducted in the area to address these challenges. One way to overcome these problems is to develop a deep learning model for the efficient identification of aromatic medicinal plant parts with their corresponding disease types. The objective of the proposed study is to identify aromatic medicinal plant parts and classify their disease types using computer vision technology. Therefore, this research develops a model for the classification of aromatic medicinal plant parts and their disease types by exploring computer vision technology. Morphological characteristics are still the most important tools for the identification of plants. Leaves are the most widely used parts of plants, besides roots, flowers, fruits, and latex. For this study, the researchers used RGB leaf images with a size of 128×128×3. Five cutting-edge models were trained: a convolutional neural network, Inception V3, Residual Neural Network, Mobile Network, and Visual Geometry Group. Those models were chosen after a comprehensive review of the best-performing models. An 80/20 percentage split is used to evaluate the models, and classification metrics are used to compare them. The pre-trained Inception V3 model performs best, with training and validation accuracies of 99.8% and 98.7%, respectively.
Keywords: aromatic medicinal plant, computer vision, convolutional neural network, deep learning, plant classification, residual neural network
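A minimal sketch of the transfer-learning setup, assuming an ImageNet-pretrained Inception V3 base in Keras with a small classification head; the class count and hyperparameters are assumptions, not the study's configuration:

```python
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_CLASSES = 10                       # assumed number of plant-part/disease classes
IMG_SIZE = (128, 128)                  # RGB leaf images of 128x128x3 as in the study

base = InceptionV3(include_top=False, weights="imagenet",
                   input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False                 # freeze the pretrained feature extractor

model = models.Sequential([
    base,
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=20)   # 80/20 split assumed
```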
Procedia PDF Downloads 185
17247 A Comprehensive Finite Element Model for Incremental Launching of Bridges: Optimizing Construction and Design
Authors: Mohammad Bagher Anvari, Arman Shojaei
Abstract:
Incremental launching, a widely adopted bridge erection technique, offers numerous advantages for bridge designers. However, accurately simulating and modeling the dynamic behavior of the bridge during each step of the launching process proves to be tedious and time-consuming. The perpetual variation of internal forces within the deck during construction stages adds complexity, exacerbated further by considerations of other load cases, such as support settlements and temperature effects. As a result, there is an urgent need for a reliable, simple, economical, and fast algorithmic solution to model bridge construction stages effectively. This paper presents a novel Finite Element (FE) model that focuses on studying the static behavior of bridges during the launching process. Additionally, a simple method is introduced to normalize all quantities in the problem. The new FE model overcomes the limitations of previous models, enabling the simulation of all stages of launching, which conventional models fail to achieve due to underlying assumptions. By leveraging the results obtained from the new FE model, this study proposes solutions to improve the accuracy of conventional models, particularly for the initial stages of bridge construction that have been neglected in previous research. The research highlights the critical role played by the first span of the bridge during the initial stages, a factor often overlooked in existing studies. Furthermore, a new and simplified model, termed the "semi-infinite beam" model, is developed to address this oversight. By utilizing this model alongside a simple optimization approach, optimal values for launching nose specifications are derived. The practical applications of this study extend to optimizing the nose-deck system of incrementally launched bridges, providing valuable insights for practical usage. In conclusion, this paper introduces a comprehensive Finite Element model for studying the static behavior of bridges during incremental launching. The proposed model addresses limitations found in previous approaches and offers practical solutions to enhance accuracy. The study emphasizes the importance of considering the initial stages and introduces the "semi-infinite beam" model. Through the developed model and optimization approach, optimal specifications for launching nose configurations are determined. This research holds significant practical implications and contributes to the optimization of incrementally launched bridges, benefiting both the construction industry and bridge designers.
Keywords: incremental launching, bridge construction, finite element model, optimization
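A back-of-envelope sketch, from simple statics, of why a light launching nose reduces the hogging moment at the last support during cantilevering; all numbers are invented, and the paper's FE model is far richer than this:

```python
q_deck = 200.0   # kN/m, self-weight of the concrete deck (assumed)
q_nose = 30.0    # kN/m, self-weight of the steel launching nose (assumed)
span   = 50.0    # m, overhang reached just before landing on the next pier
l_nose = 30.0    # m, nose length

# Case 1: the deck alone cantilevers the whole overhang
m_no_nose = q_deck * span**2 / 2

# Case 2: the front l_nose metres are the light nose, the rest is deck
l_deck = span - l_nose
m_with_nose = (q_deck * l_deck**2 / 2                        # deck segment
               + q_nose * l_nose * (l_deck + l_nose / 2))    # nose weight x lever arm

print(f"support moment without nose: {m_no_nose:,.0f} kN*m")
print(f"support moment with nose:    {m_with_nose:,.0f} kN*m")
```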
Procedia PDF Downloads 99
17246 A Quantitative Study of the Evolution of Open Source Software Communities
Authors: M. R. Martinez-Torres, S. L. Toral, M. Olmedilla
Abstract:
Typically, virtual communities exhibit the well-known phenomenon of participation inequality, which means that only a small percentage of users is responsible for the majority of contributions. However, the sustainability of the community requires that the group of active users be continuously nurtured with new users who gain expertise through a participation process. This paper analyzes the time evolution of Open Source Software (OSS) communities, considering users that join/abandon the community over time and several topological properties of the network when modeled as a social network. More specifically, the paper analyzes the role of those users rejoining the community and their influence on the global characteristics of the network.
Keywords: open source communities, social network analysis, time series, virtual communities
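A minimal sketch of this kind of longitudinal analysis, assuming NetworkX and an invented list of (author, replied-to, month) interactions:

```python
import networkx as nx
from collections import defaultdict

interactions = [("ana", "bo", "2015-01"), ("carl", "ana", "2015-01"),
                ("dee", "ana", "2015-02"), ("bo", "dee", "2015-02"),
                ("eve", "carl", "2015-02"), ("ana", "eve", "2015-03"),
                ("frank", "ana", "2015-03"), ("dee", "frank", "2015-03")]

by_month = defaultdict(list)
for src, dst, month in interactions:
    by_month[month].append((src, dst))

for month in sorted(by_month):
    g = nx.DiGraph(by_month[month])            # who-replies-to-whom in that window
    print(month,
          "users:", g.number_of_nodes(),
          "ties:", g.number_of_edges(),
          "density:", round(nx.density(g), 3))
```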
Procedia PDF Downloads 521
17245 Agile Real-Time Field Programmable Gate Array-Based Image Processing System for Drone Imagery in Digital Agriculture
Authors: Sabiha Shahid Antora, Young Ki Chang
Abstract:
Along with various farm management technologies, imagery is an important tool that facilitates crop assessment, monitoring, and management. As a consequence, drone imaging technology is playing a vital role in capturing the state of the entire field for yield mapping, crop scouting, weed detection, and so on. Although it is essential to inspect cultivable lands in real time to make rapid decisions regarding variable field inputs to combat stresses and diseases, drone imagery is still evolving in this area of interest. Cost margins and the post-processing complexity of the image stream are the main challenges of the imaging technology. Therefore, this proposed project involves a cost-effective field programmable gate array (FPGA) based image processing device that would process the image stream in real time as well as provide the processed output to support on-the-spot decisions in the crop field. As a result, the real-time FPGA-based image processing system would reduce operating costs while minimizing a few intermediate steps to deliver scalable field decisions.
Keywords: real-time, FPGA, drone imagery, image processing, crop monitoring
Procedia PDF Downloads 110
17244 Getting to Know the Enemy: Utilization of Phone Record Analysis Simulations to Uncover a Target’s Personal Life Attributes
Authors: David S. Byrne
Abstract:
The purpose of this paper is to understand how phone record analysis can enable identification of subjects in communication with the target of a terrorist plot. This study also sought to understand the advantages of implementing simulations to develop the skills of future intelligence analysts and so enhance national security. Through the examination of phone reports, which in essence consist of the call traffic of incoming and outgoing numbers (and not of listening to calls or reading the content of text messages), patterns can be uncovered that point toward members of a criminal group and planned activities. Through temporal and frequency analysis, conclusions were drawn that offer insights into the identity of participants and the potential scheme being undertaken. The challenge lies in the accurate identification of the users of the phones in contact with the target. Often investigators rely on proprietary databases and open sources to accomplish this task; however, it is difficult to ascertain the accuracy of the information found. Thus, this paper poses two research questions: first, how effective are freely available web sources of information at determining the actual identity of callers? Secondly, does the identity of the callers enable an understanding of the lifestyle and habits of the target? The methodology for this research consisted of the analysis of the call detail records of the author’s personal phone activity spanning a period of a year, combined with the hypothetical premise that the owner of said phone was the leader of a terrorist cell. The goal was to reveal the identity of his accomplices and understand how his personal attributes could further paint a picture of the target’s intentions. The results of the study were interesting: nearly 80% of the calls were identified, with over a 75% accuracy rating, via data mining of open sources. The suspected terrorist’s inner circle was recognized, including relatives and potential collaborators as well as financial institutions [money laundering], restaurants [meetings], a sporting goods store [purchase of supplies], and airlines and hotels [travel itinerary]. The outcome of this research showed the benefits of cellphone analysis without more intrusive and time-consuming methodologies, though it may be instrumental for potential surveillance, interviews, and developing probable cause for wiretaps. Furthermore, this research highlights the importance of building the skills of future intelligence analysts through phone record analysis via simulations; hands-on learning in this case study emphasizes the development of the competencies necessary to improve investigations overall.
Keywords: hands-on learning, intelligence analysis, intelligence education, phone record analysis, simulations
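A minimal sketch of the frequency and temporal analysis on a toy call-detail-record table, assuming pandas; the numbers, timestamps and column names are invented:

```python
import pandas as pd

cdr = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-01-05 08:12", "2023-01-05 21:40",
                                 "2023-01-06 08:05", "2023-01-08 13:30",
                                 "2023-01-09 08:15"]),
    "other_party": ["555-0101", "555-0198", "555-0101", "555-0123", "555-0101"],
    "direction": ["out", "in", "out", "out", "in"],
    "duration_s": [65, 310, 40, 12, 95],
})

# Frequency analysis: who does the target talk to most, and for how long in total?
freq = (cdr.groupby("other_party")
           .agg(calls=("other_party", "size"), total_s=("duration_s", "sum"))
           .sort_values("calls", ascending=False))

# Temporal analysis: at which hours of the day does contact happen?
by_hour = cdr.assign(hour=cdr["timestamp"].dt.hour).groupby("hour").size()

print(freq)
print(by_hour)
```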
Procedia PDF Downloads 14
17243 The Relationship between Rhythmic Complexity and Listening Engagement as a Proxy for Perceptual Interest
Authors: Noah R. Fram
Abstract:
Although it has been confirmed by multiple studies, the inverted-U relationship between stimulus complexity and preference (liking) remains contentious. Research aimed at substantiating the model is largely reliant upon anecdotal self-assessments of subjects and basic measures of complexity, leaving potential confounds unresolved. This study attempts to address the topic by assessing listening time as a behavioral correlate of liking (with the assumption that engagement prolongs listening time) and by looking for latent factors underlying several measures of rhythmic complexity. Participants listened to groups of rhythms, stopping each one when they started to lose interest, and were asked to rate each rhythm in each group in terms of interest, complexity, and preference. Subjects were not informed that the time spent listening to each rhythm was the primary measure of interest. The hypothesis that listening time demonstrates the same inverted-U relationship with complexity as verbal reports of liking was confirmed using a variety of metrics for rhythmic complexity, including meter-dependent measures of syncopation and meter-independent measures of entropy.
Keywords: complexity, entropy, rhythm, syncopation
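A minimal sketch of one meter-independent measure of the kind mentioned above: the Shannon entropy of a rhythm's inter-onset-interval distribution. The example rhythms are invented, and the study's exact metrics may differ:

```python
from collections import Counter
import math

def ioi_entropy(onsets):
    """Shannon entropy (bits) of the inter-onset intervals of an onset-time list."""
    iois = [b - a for a, b in zip(onsets, onsets[1:])]
    counts = Counter(iois)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

regular = [0, 4, 8, 12, 16, 20, 24, 28]          # isochronous pulse -> 0 bits
syncopated = [0, 3, 4, 7, 10, 12, 15, 19, 24]    # varied intervals -> higher entropy
print("regular rhythm entropy:   ", round(ioi_entropy(regular), 3))
print("syncopated rhythm entropy:", round(ioi_entropy(syncopated), 3))
```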
Procedia PDF Downloads 172
17242 Particle Swarm Optimisation of a Terminal Synergetic Controller for a DC-DC Converter
Authors: H. Abderrezek, M. N. Harmas
Abstract:
DC-DC converters are widely used as reliable power sources for many industrial and military applications, computers and electronic devices. Several control methods have been developed for DC-DC converter control, mostly with asymptotic convergence. Synergetic control (SC) is a proven robust control approach and is used here in a so-called terminal scheme to achieve finite-time convergence. Lyapunov synthesis is adopted to assure controlled system stability. Furthermore, a particle swarm optimization (PSO) algorithm based on an integral time absolute error (ITAE) criterion is used to optimize the controller parameters. Simulation of terminal synergetic control of a DC-DC converter is carried out for different operating conditions, and the results are compared to classic synergetic control performance, which demonstrates the effectiveness and feasibility of the proposed control method.
Keywords: DC-DC converter, PSO, finite time, terminal, synergetic control
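A minimal sketch of the optimization idea only: a particle swarm searching two controller gains that minimize an ITAE cost on a toy second-order plant simulated by forward Euler. The plant, bounds and PSO settings are assumptions, not the converter model or the terminal synergetic law from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
dt, t_end = 1e-3, 2.0
t = np.arange(0.0, t_end, dt)

def itae_cost(gains):
    """Euler-simulate x'' = -k1*x - k2*x' + k1*ref and return the integral of t*|error|."""
    k1, k2 = gains
    x, v, ref, cost = 0.0, 0.0, 1.0, 0.0
    for ti in t:
        a = -k1 * x - k2 * v + k1 * ref
        v += a * dt
        x += v * dt
        cost += ti * abs(ref - x) * dt
    return cost

# Minimal PSO: 15 particles searching (k1, k2) within [0.1, 50]
n, dim, iters = 15, 2, 40
pos = rng.uniform(0.1, 50, (n, dim))
vel = np.zeros((n, dim))
pbest, pbest_cost = pos.copy(), np.array([itae_cost(p) for p in pos])
gbest = pbest[pbest_cost.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0.1, 50)
    cost = np.array([itae_cost(p) for p in pos])
    improved = cost < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
    gbest = pbest[pbest_cost.argmin()].copy()

print("best gains (k1, k2):", gbest.round(2), " ITAE:", round(pbest_cost.min(), 4))
```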
Procedia PDF Downloads 500
17241 Optical Variability of Faint Quasars
Authors: Kassa Endalamaw Rewnu
Abstract:
The variability properties of a quasar sample, spectroscopically complete to magnitude J = 22.0, are investigated on a time baseline of 2 years using three different photometric bands (U, J and F). The original sample was obtained using a combination of different selection criteria: colors, slitless spectroscopy and variability, based on a time baseline of 1 yr. The main goals of this work are two-fold: first, to derive the percentage of variable quasars on a relatively short time baseline; secondly, to search for new quasar candidates missed by the other selection criteria; and, thus, to estimate the completeness of the spectroscopic sample. In order to achieve these goals, we have extracted all the candidate variable objects from a sample of about 1800 stellar or quasi-stellar objects with limiting magnitude J = 22.50 over an area of about 0.50 deg². We find that > 65% of all the objects selected as possible variables are either confirmed quasars or quasar candidates on the basis of their colors. This percentage increases even further if we exclude from our lists of variable candidates a number of objects equal to that expected on the basis of 'contamination' induced by our photometric errors. The percentage of variable quasars in the spectroscopic sample is also high, reaching about 50%. On the basis of these results, we can estimate that the incompleteness of the original spectroscopic sample is < 12%. We conclude that variability analysis of data with small photometric errors can be successfully used as an efficient and independent (or at least auxiliary) selection method in quasar surveys, even when the time baseline is relatively short. Finally, when corrected for the different intrinsic time lags corresponding to a fixed observed time baseline, our data do not show a statistically significant correlation between variability and either absolute luminosity or redshift.
Keywords: nuclear activity, galaxies, active quasars, variability
Procedia PDF Downloads 78
17240 Stability Enhancement of a Large-Scale Power System Using Power System Stabilizer Based on Adaptive Neuro Fuzzy Inference System
Authors: Agung Budi Muljono, I Made Ginarsa, I Made Ari Nrartha
Abstract:
A large-scale power system (LSPS) consists of two or more sub-systems connected by interconnecting transmission lines. The loading pattern on an LSPS always changes from time to time and varies depending on consumer needs. Serious instability problems appear in an LSPS due to load fluctuations at all of the buses. An adaptive neuro-fuzzy inference system (ANFIS)-based power system stabilizer (PSS) is presented to address the stability problem and to enhance the stability of an LSPS. ANFIS control is chosen because it is computationally more effective than Mamdani fuzzy control. Simulation results show that the presented PSS is able to maintain stability by decreasing the peak overshoot to −2.56 × 10⁻⁵ pu for the rotor speed deviation Δω₂₋₃. The presented PSS also achieves a settling time of 3.78 s for the local mode oscillation. Furthermore, the presented PSS improves the peak overshoot and settling time of Δω₃₋₉ to −0.868 × 10⁻⁵ pu and 3.50 s for the inter-area oscillation.
Keywords: ANFIS, large-scale, power system, PSS, stability enhancement
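A minimal sketch of how the two reported indices, peak overshoot and settling time, can be read off a speed-deviation trace; the damped oscillation below is synthetic, not the LSPS simulation:

```python
import numpy as np

t = np.linspace(0, 10, 5001)
dw = 3e-5 * np.exp(-1.2 * t) * np.sin(2 * np.pi * 0.8 * t)   # toy rotor-speed deviation (pu)

peak_overshoot = dw.min()                        # largest negative excursion, pu
band = 0.02 * np.abs(dw).max()                   # 2% settling band (one common choice)
outside = np.where(np.abs(dw) > band)[0]
settling_time = t[outside[-1]] if outside.size else 0.0

print(f"peak overshoot ≈ {peak_overshoot:.3e} pu, settling time ≈ {settling_time:.2f} s")
```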
Procedia PDF Downloads 305
17239 Time and Cost Efficiency Analysis of Quick Die Change System on Metal Stamping Industry
Authors: Rudi Kurniawan Arief
Abstract:
Manufacturing cost and setup time are hot topics for improvement in the metal stamping industry, because material and component prices keep rising while customers require component prices to be cut down year by year. Single Minute Exchange of Die (SMED) is one of many methods to reduce waste in the stamping industry. The Japanese Quick Die Change (QDC) die system is one of the SMED systems that can reduce both setup time and manufacturing cost. However, this system is rarely used in stamping industries. This paper analyzes how much the QDC die system can reduce setup time and manufacturing cost. The research is conducted by direct observation, simulation and comparison of the QDC die system with a conventional die system. In this research, we found that the QDC die system can save up to 35% of manufacturing cost and reduce setup time by 70%. The simulation proved that the QDC die system is effective for cost reduction but must be applied in several parallel production processes.
Keywords: press die, metal stamping, QDC system, single minute exchange die, manufacturing cost saving, SMED
Procedia PDF Downloads 168
17238 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map
Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo
Abstract:
Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.
Keywords: RDM, multi-source data, big data, U-City
Procedia PDF Downloads 432
17237 Design, Construction, Validation And Use Of A Novel Portable Fire Effluent Sampling Analyser
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
Current large-scale fire tests focus on flammability and heat release measurements. Smoke toxicity isn't considered despite it being a leading cause of death and injury in unwanted fires. A key reason could be that the practical difficulties associated with quantifying individual toxic components present in a fire effluent often require specialist equipment and expertise. Fire effluent contains a mixture of unreactive and reactive gases, water, organic vapours and particulate matter, which interact with each other. This interferes with the operation of the analytical instrumentation and must be removed without changing the concentration of the target analyte. To mitigate the need for expensive equipment and time-consuming analysis, a portable gas analysis system was designed, constructed and tested for use in large-scale fire tests as a simpler and more robust alternative to online FTIR measurements. The novel equipment aimed to be easily portable and able to run on battery or mains electricity; be able to be calibrated at the test site; be capable of quantifying CO, CO2, O2, HCN, HBr, HCl, NOx and SO2 accurately and reliably; be capable of independent data logging; be capable of automated switchover of 7 bubblers; be able to withstand fire effluents; be simple to operate; allow individual bubbler times to be pre-set; and be capable of being controlled remotely. To test the analyser's functionality, it was used alongside the ISO/TS 19700 Steady State Tube Furnace (SSTF). A series of tests were conducted to assess the validity of the box analyser measurements and the data logging abilities of the apparatus. PMMA and PA 6.6 were used to assess the validity of the box analyser measurements. The data obtained from the bench-scale assessments showed excellent agreement. Following this, the portable analyser was used to monitor gas concentrations during large-scale testing using the ISO 9705 room corner test. The analyser was set up, calibrated and set to record smoke toxicity measurements in the doorway of the test room. The analyser operated without manual interference and successfully recorded data for 12 of the 12 tests conducted in the ISO room tests. At the end of each test, the analyser created a data file (formatted as .csv) containing the measured gas concentrations throughout the test, which does not require specialist knowledge to interpret. This validated the portable analyser's ability to monitor fire effluent without operator intervention at both bench and large scale. The portable analyser is a validated and significantly more practical alternative to FTIR, proven to work in large-scale fire testing for the quantification of smoke toxicity. The analyser is a cheaper, more accessible option for assessing smoke toxicity, mitigating the need for expensive equipment and specialist operators.
Keywords: smoke toxicity, large-scale tests, ISO 9705, analyser, novel equipment
Procedia PDF Downloads 76
17236 Path Planning for Collision Detection between two Polyhedra
Authors: M. Khouil, N. Saber, M. Mestari
Abstract:
This study proposes a different architecture for path planning using the NECMOP, where several nonlinear objective functions must be optimized in a conflicting situation. The ability to detect and avoid collisions is very important for mobile intelligent machines. However, many artificial vision systems are not yet able to extract this wealth of information quickly and cheaply. This network, which has been reviewed in particular detail, has enabled us to solve the problem of collision detection between two convex polyhedra with a new approach in fixed time (O(1) time). We used two types of neurons, linear and threshold logic, which simplified the actual implementation of all the networks proposed. This article presents a comprehensive algorithm that determines, through the AMAXNET network, a measure (a mini-maximum point) in fixed time, which allows us to detect the presence of a potential collision.
Keywords: path planning, collision detection, convex polyhedron, neural network
Procedia PDF Downloads 438
17235 Improved Postprandial Response and Feeling of Satiety After Consumption of Sour Cherry Pomace Enriched Muffins
Authors: Joanna Bajerska, Sylwia Mildner-Szkudlarz, Pawel Górnas, Dalija Segliņac
Abstract:
Sour cherry pomace (CP), a by-product obtained during fruit processing, was used to replace wheat flour in a muffin formula at levels of 20% (CP20) and 30% (CP30). The sensory profiles of these muffins were characterized, and their impact on glycemic response and appetite sensation was studied in a randomized crossover study in which test subjects were given either a plain muffin (PM), CP20 or CP30 on different occasions. In the first trial, test muffins containing the equivalent of 50 g of available carbohydrate were consumed. Blood glucose was measured before and up to 120 min after consuming the test muffins. To study the satiety response, in the second trial the test muffins (portion of 1700 kJ per serving) were ingested. Sensory analysis was performed earlier by a panel of 10 well-trained individuals. It is acceptable to incorporate CP into a muffin formula at concentrations up to 30%. With the CP muffin treatments, the glucose responses were significantly lower at the 30, 45 and 60 min intervals, and the incremental peak glucose was 0.40 mmol/L and 0.60 mmol/L lower than for PM. CP20 and CP30 also improved satiety compared to PM. CP can be a good ingredient for functional bakery products to assist in managing glucose levels and satiety in healthy individuals.
Keywords: muffins, postprandial glucose, sensory analysis, satiety, sour cherry pomace
Procedia PDF Downloads 362
17234 Real Time Adaptive Obstacle Avoidance in Dynamic Environments with Different D-S
Authors: Mohammad Javad Mollakazemi, Farhad Asadi
Abstract:
In this paper, a real-time obstacle avoidance approach for both autonomous and non-autonomous dynamical systems (DS) is presented. In this approach, the original dynamics of the controller, which allow us to determine a safety margin, can be modulated. Different common types of DS increase the robot's reactiveness in the face of uncertainty in the localization of the obstacle, especially when the robot moves very fast in changing, complex environments. The method is validated by simulation, including the influence of different autonomous and non-autonomous DS, such as important characteristics of limit cycles and unstable DS. Furthermore, the positioning of different obstacles in a complex environment is explained. Finally, the verification of the avoidance trajectories is described through different parameters, such as the safety factor.
Keywords: limit cycles, nonlinear dynamical system, real time obstacle avoidance, safety margin
Procedia PDF Downloads 440