Search results for: real time mode
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22159

21049 Analyzing Electromagnetic and Geometric Characterization of Building Insulation Materials Using the Transient Radar Method (TRM)

Authors: Ali Pourkazemi

Abstract:

The transient radar method (TRM) is a non-destructive method that was introduced by the authors a few years ago. TRM can be classified as a wave-based non-destructive testing (NDT) method that can be used over a wide frequency range; nevertheless, it requires only a narrow band, selected anywhere from a few GHz to a few THz depending on the application. As a time-of-flight and real-time method, TRM can measure the electromagnetic properties of the sample under test not only quickly and accurately, but also blindly, meaning that it requires no prior knowledge of the sample under test. For multi-layer structures, TRM is not only able to detect changes related to any parameter within the multi-layer structure but can also measure the electromagnetic properties and thickness of each layer individually. Although temperature, humidity, and general environmental conditions may affect the sample under test, they do not affect the accuracy of the Blind TRM algorithm. In this paper, the electromagnetic properties as well as the thickness of individual building insulation materials, as single-layer structures, are measured experimentally. Finally, the correlation between the reflection coefficients and other technical parameters such as sound insulation, thermal resistance, thermal conductivity, compressive strength, and density is investigated. The samples studied are 30 cm x 50 cm, with thicknesses varying from a few millimeters to 6 centimeters. The experiment is performed with both bistatic and differential hardware at 10 GHz. Since TRM is a narrow-band, free-space, real-time sensing system with high-speed computational analysis, it has a wide range of potential applications, e.g., in the construction industry, rubber industry, piping industry, wind energy industry, automotive industry, biotechnology, food industry, pharmaceuticals, etc. Detection of metallic or plastic pipes, wires, etc., through or behind walls is a specific application for the construction industry.
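As a time-of-flight technique, TRM's basic geometric relation can be sketched in a few lines. The snippet below is an illustrative simplification (not the authors' Blind TRM algorithm): it recovers a single layer's thickness from the round-trip delay of the back-face echo, given the layer's relative permittivity; all numeric values are hypothetical.

```python
# Illustrative time-of-flight sketch for a single-layer sample.
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def layer_thickness(delta_t_s, eps_r):
    """Thickness from the round-trip delay of the echo reflected at the
    layer's back face: d = c * dt / (2 * sqrt(eps_r))."""
    return C0 * delta_t_s / (2.0 * eps_r ** 0.5)

# Example: a 0.5 ns round-trip delay in a material with eps_r = 2.25
d = layer_thickness(0.5e-9, 2.25)  # about 5 cm
```

In the multi-layer case, the same relation would be applied per layer after the permittivity of each layer has been extracted blindly.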

Keywords: transient radar method, blind electromagnetic geometrical parameter extraction technique, ultrafast nondestructive multilayer dielectric structure characterization, electronic measurement systems, illumination, data acquisition performance, submillimeter depth resolution, time-dependent reflected electromagnetic signal blind analysis method, EM signal blind analysis method, time domain reflectometer, microwave, millimeter wave frequencies

Procedia PDF Downloads 69
21048 The Applications and Effects of the Career Courses of Taiwanese College Students with LEGO® SERIOUS PLAY®

Authors: Payling Harn

Abstract:

LEGO® SERIOUS PLAY® is a facilitated workshop method for thinking and problem-solving. Participants build symbolic and metaphorical brick models in response to tasks given by the facilitator and present these models to the other participants. LEGO® SERIOUS PLAY® applies the positive psychological mechanisms of flow and positive emotions to help participants perceive their self-experience and unknown facts, and to increase life happiness, through building bricks and narrating stories. At present, LEGO® SERIOUS PLAY® is often utilized for facilitating professional identity and strategy development to assist workers in career development. The researcher sought to apply LEGO® SERIOUS PLAY® to the career courses of college students in order to promote their career ability. This study aimed to use the facilitative method of LEGO® SERIOUS PLAY® to develop career courses for college students, then explore their effects on Taiwanese college students' positive and negative emotions, career adaptability, and career sense of hope. The researcher regarded strength as the core concept and used the facilitative mode of LEGO® SERIOUS PLAY® to develop an 8-week career course including 'emotion of college life', 'career highlights', 'career strengths', 'professional identity', 'business model', 'career coping', 'strength guiding principles', 'career visions', and 'career hope'. A problem-oriented teaching method was adopted to give tasks according to the weekly theme, using the facilitative mode of LEGO® SERIOUS PLAY® to guide participants to respond to the tasks by building bricks. Participants then conducted group discussions, gave reports, and wrote weekly reflection journals. Participants were 24 second-year college students who attended the LEGO® SERIOUS PLAY® career courses for 2 hours a week. The researcher used the 'Career Adaptability Scale' and 'Career Hope Scale' to conduct a pre-test and post-test, administered one week before the courses started and one day after the courses ended, respectively. Repeated-measures one-way ANOVA was then adopted for analyzing the data. The results revealed that the participants showed a significant immediate positive effect in career adaptability and career hope. The researcher hopes this study helps construct the mode of LEGO® SERIOUS PLAY® career courses and makes a substantial contribution to future career teaching and research with LEGO® SERIOUS PLAY®.
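For readers unfamiliar with the analysis, the repeated-measures one-way ANOVA used here can be computed directly: total variance is partitioned into condition (time point), subject, and error components. The sketch below uses purely hypothetical pre/post scores, not the study's data.

```python
# F statistic for a repeated-measures one-way ANOVA: subjects are rows,
# measurement occasions (e.g. pre-test, post-test) are columns.
def rm_anova_f(scores):
    n = len(scores)            # number of subjects
    k = len(scores[0])         # number of conditions / time points
    grand = sum(sum(row) for row in scores) / (n * k)
    subj_means = [sum(row) / k for row in scores]
    cond_means = [sum(row[j] for row in scores) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)   # between-subject
    ss_cond = n * sum((m - grand) ** 2 for m in cond_means)   # between-condition
    ss_err = ss_total - ss_subj - ss_cond                     # residual
    df_cond, df_err = k - 1, (n - 1) * (k - 1)
    return (ss_cond / df_cond) / (ss_err / df_err)

# Four hypothetical participants, pre- and post-course scores
pre_post = [[3.0, 4.2], [2.8, 3.9], [3.5, 4.1], [3.1, 4.4]]
f_stat = rm_anova_f(pre_post)
```

The F statistic is then compared with the critical F value for (k-1, (n-1)(k-1)) degrees of freedom to decide significance.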

Keywords: LEGO® SERIOUS PLAY®, career courses, strength, positive and negative affect, career hope

Procedia PDF Downloads 253
21047 Virtualization of Production Using Digital Twin Technology

Authors: Bohuslava Juhasova, Igor Halenar, Martin Juhas

Abstract:

The contribution deals with the current situation in modern manufacturing enterprises, which is affected by digital virtualization of different parts of the production process. The overview part of this article points to the fact that widespread informatization of all areas causes, in real practice, the substitution of real elements and the relationships between them with their digital, often virtual, images. Key characteristics of systems implemented using digital twin technology, along with essential conditions for intelligent product deployment, were identified across many published studies. The goal was to propose a template for realizing a production system using digital twin technology as a supplement to standardized concepts for Industry 4.0. The main resulting idea leads to the statement that the current trend of implementing new technologies and ways of communication between industrial facilities erases the boundaries between the real environment and the virtual world.

Keywords: communication, digital twin, Industry 4.0, simulation, virtualization

Procedia PDF Downloads 248
21046 Medical Imaging Fusion: A Teaching-Learning Simulation Environment

Authors: Cristina Maria Ribeiro Martins Pereira Caridade, Ana Rita Ferreira Morais

Abstract:

The use of computational tools has become essential in the context of interactive learning, especially in engineering education. In the medical industry, teaching medical image processing techniques is a crucial part of training biomedical engineers, as it has integrated applications with healthcare facilities and hospitals. The aim of this article is to present a teaching-learning simulation tool, developed in MATLAB using a graphical user interface, for medical image fusion that explores different image fusion methodologies and processes in combination with image pre-processing techniques. The application applies different algorithms and medical fusion techniques in real time, allowing users to view the original and fused images, compare processed and original images, adjust parameters, and save images. The proposed tool offers an innovative teaching-learning environment: a dynamic and motivating teaching simulation through which biomedical engineering students acquire knowledge about medical image fusion techniques and the skills necessary for the training of biomedical engineers. In conclusion, the developed simulation tool provides real-time visualization of the original and fused images and the possibility to test, evaluate, and extend the student's knowledge about the fusion of medical images. It also facilitates the exploration of medical imaging applications, specifically image fusion, which is critical in the medical industry. Teachers and students can make adjustments and/or create new functions, making the simulation environment adaptable to new techniques and methodologies.
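To illustrate what such a fusion module computes, the sketch below implements two elementary pixel-level fusion rules, weighted averaging and maximum selection, in Python rather than the tool's MATLAB code; the 2x2 "images" are hypothetical grey-level arrays, not medical data.

```python
# Two elementary pixel-level fusion rules on 2-D grey-level lists.
def fuse_average(img_a, img_b, alpha=0.5):
    """Weighted-average fusion: alpha*a + (1-alpha)*b per pixel."""
    return [[alpha * a + (1 - alpha) * b for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

def fuse_max(img_a, img_b):
    """Maximum-selection fusion: keep the brighter pixel of the two."""
    return [[max(a, b) for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

ct  = [[0, 100], [50, 200]]   # hypothetical CT slice
mri = [[80, 20], [90, 40]]    # hypothetical, co-registered MRI slice
fused_avg = fuse_average(ct, mri)
fused_max = fuse_max(ct, mri)
```

Real fusion pipelines, including the tool described here, would add pre-processing (registration, normalization) before such rules are applied.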

Keywords: image fusion, image processing, teaching-learning simulation tool, biomedical engineering education

Procedia PDF Downloads 131
21045 The Real Consignee: An Exploratory Study of the True Party Who Is Entitled to Receive Cargo under a Bill of Lading

Authors: Mojtaba Eshraghi Arani

Abstract:

According to the international conventions for the carriage of goods by sea, the consignee is the person who is entitled to take delivery of the cargo from the carrier. Such a person is usually named in the relevant box of the bill of lading (BL) unless the latter is issued "To Order" or "To Bearer". However, there are some cases in which the apparent consignee, as above, was not intended to take delivery of the cargo, like the L/C issuing bank or the freight forwarder, who are named as consignee only for the purpose of security or acceleration of the transit process. In such cases, as well as for a BL issued "To Order", the so-called "real consignee" can be found in the "Notify Party" box. The dispute revolves around the choice between the apparent consignee and the real consignee for being entitled not only to take delivery of the cargo but also to sue the carrier for any damage or loss. While it is a generally accepted rule that only the apparent consignee shall be vested with such rights, some courts, like France's Cour de Cassation, have declared that the "Notify Party", as the real consignee, is entitled to sue the carrier; in some cases, the same court went even further and permitted the real consignee to bring suit even where he was not mentioned on the BL as a "Notify Party". The main argument behind such reasoning is that the real consignee is the person who suffered the loss and thus has a legitimate interest in bringing the action; of course, the real consignee must prove that he incurred a loss. It is undeniable that the above-mentioned approach is contrary to the position of the international conventions on the express definition of the consignee. However, international practice has permitted the use of the BL in different ways to meet the business requirements of banks, freight forwarders, etc. Thus, the issue is one of striking a balance between the international conventions on the one hand and existing practices on the other. While the latest convention applicable to sea transportation, i.e., the Rotterdam Rules, dealt with the comparable issue of "shipper" and "documentary shipper", it failed to cope with the matter being discussed. So a new study is required to propose the best solution for amending the current conventions for the carriage of goods by sea. A qualitative method with the concept of interpretation of data collection has been used in this article. The source of the data is the analysis of domestic and international regulations and cases. It is argued in this manuscript that the judge is not allowed to recognize anyone as the real consignee other than the person who is mentioned in the "Consignee" box, unless the BL is issued "To Order" or "To Bearer". Moreover, the contract of carriage is independent of the sale contract, and thus the consignee must be determined solely based on the facts of the BL itself, like the "Notify Party" box, and not on any other contract or document.

Keywords: real consignee, cargo, delivery, to order, notify party

Procedia PDF Downloads 79
21044 Comparative Study of Skeletonization and Radial Distance Methods for Automated Finger Enumeration

Authors: Mohammad Hossain Mohammadi, Saif Al Ameri, Sana Ziaei, Jinane Mounsef

Abstract:

Automated enumeration of the number of hand fingers is widely used in several motion-gaming and distance-control applications and is discussed in several published papers as a starting block for hand recognition systems. An automated finger enumeration technique should not only be accurate, but must also have a fast response for a moving-picture input. The high frame rate of video in motion games or distance control can inhibit the program's overall speed, since image processing software such as MATLAB needs to produce results at high computation speeds. Since automated finger enumeration with minimum error and processing time is desired, a comparative study between two finger enumeration techniques is presented and analyzed in this paper. In the pre-processing stage, various image processing functions were applied to a real-time video input to obtain the final cleaned, auto-cropped image of the hand to be used by the two techniques. The first technique uses the known morphological tool of skeletonization and counts the skeleton's endpoints as fingers. The second technique uses a radial distance method, which produces a one-dimensional representation of the hand, to enumerate the fingers. For both methods, the different steps of the algorithms are explained. A comparative study then analyzes the accuracy and speed of both techniques. Through experimental testing in different background conditions, it was observed that the radial distance method was more accurate and responsive to a real-time video input than the skeletonization method. All test results were generated in MATLAB and were based on displaying a human hand in three different orientations on top of a plain-color background. Finally, the limitations surrounding the enumeration techniques are presented.
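The radial distance idea can be illustrated compactly: after pre-processing, the distance from the palm centroid to the hand contour is sampled around the hand, and each extended finger appears as a local maximum in this one-dimensional signal. The sketch below counts such peaks on synthetic data; it is not the paper's MATLAB implementation, and the threshold is hypothetical.

```python
# Count finger-like peaks in a 1-D radial-distance signal.
def count_fingers(radial, threshold):
    """A peak is a sample above `threshold` that is not smaller than its
    left neighbour and strictly larger than its right neighbour."""
    peaks = 0
    for i in range(1, len(radial) - 1):
        if (radial[i] > threshold
                and radial[i] >= radial[i - 1]
                and radial[i] > radial[i + 1]):
            peaks += 1
    return peaks

# Synthetic centroid-to-contour distances with three finger-like peaks
signal = [10, 30, 55, 30, 12, 28, 60, 25, 11, 32, 58, 29, 10]
n_fingers = count_fingers(signal, threshold=40)
```

A production version would also smooth the signal and merge peaks closer than a minimum angular separation, which this sketch omits.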

Keywords: comparative study, hand recognition, fingertip detection, skeletonization, radial distance, Matlab

Procedia PDF Downloads 382
21043 Effects of Cold Treatments on Methylation Profiles and Reproduction Mode of Diploid and Tetraploid Plants of Ranunculus kuepferi (Ranunculaceae)

Authors: E. Syngelaki, C. C. F. Schinkel, S. Klatt, E. Hörandl

Abstract:

Environmental influence can alter the conditions for plant development and can trigger changes in epigenetic variation. Thus, exposure to abiotic environmental stress can lead to different DNA methylation profiles and may have evolutionary consequences for adaptation. Epigenetic control mechanisms may further influence the mode of reproduction. The alpine species R. kuepferi has diploid and tetraploid cytotypes, which are mostly sexual and facultative apomicts, respectively. Hence, it is a suitable model system for studying the correlations of mode of reproduction, ploidy, and environmental stress. Diploid and tetraploid individuals were placed in two climate chambers and treated with low (+7°C day/+2°C night, with -1°C cold shocks for three nights per week) and warm (control; +15°C day/+10°C night) temperatures. Subsequently, methylation-sensitive Amplified Fragment-Length Polymorphism (AFLP) markers were used to screen genome-wide methylation alterations triggered by the stress treatments. The dataset was analyzed for four groups regarding treatment (cold/warm) and ploidy level (diploid/tetraploid), and also separately for fully methylated, hemi-methylated, and unmethylated sites. Patterns of epigenetic variation suggested that diploids differed significantly in their profiles from tetraploids independent of treatment, while treatments did not differ significantly within cytotypes. Furthermore, diploids are more differentiated than tetraploids in the overall methylation profiles of both treatments. This observation is in accordance with the increased frequency of apomictic seed formation in diploids and the maintenance of facultative apomixis in tetraploids during the experiment. A global analysis of molecular variance showed higher epigenetic variation within groups than among them, while a locus-by-locus analysis of molecular variance showed a high number (54.7%) of significantly differentiated unmethylated loci.
To summarise, epigenetic variation seems to depend on ploidy level, and in diploids may be correlated to changes in mode of reproduction. However, further studies are needed to elucidate the mechanism and possible functional significance of these correlations.

Keywords: apomixis, cold stress, DNA methylation, Ranunculus kuepferi

Procedia PDF Downloads 160
21042 Electrokinetic Regulation of Flow in Microcrack Reservoirs

Authors: Aslanova Aida Ramiz

Abstract:

One of the important aspects of rheophysical problems in oil and gas extraction is the regulation of thermohydrodynamic properties of liquid systems using physical and physicochemical methods. It is known that the constituent parts of real fluid systems in oil and gas production are practically non-conducting, non-magnetically active components. Real heterogeneous hydrocarbon systems, from the structural point of view, consist of an infinite number of microscopic local ion-electrostatic cores distributed in the volume of the dispersion medium. According to Cohen's rule, electric double layers are formed at the contact boundaries of components in contact (oil-gas, oil-water, water-condensate, etc.) in a heterogeneous system, and as a result, each real fluid system can be represented as a complex composition of a set of local electrostatic fields. The electrokinetic properties of this structure are characterized by a certain electrode potential. Prof. F.H. Valiyev called this potential the α-factor and came up with the idea that many natural and technological rheophysical processes (effects) are essentially electrokinetic in nature, and that by changing the α-factor, it is possible to adjust the physical properties of real hydraulic systems, including thermohydrodynamic parameters. Based on this idea, extensive research work was conducted, and the possibility of reducing hydraulic resistance and improving rheological properties in real liquid systems was experimentally discovered by reducing the electrical potential with various physical and chemical methods.

Keywords: microcracked, electrode potential, hydraulic resistance, Newtonian fluid, rheophysical properties

Procedia PDF Downloads 77
21041 Real Estate Trend Prediction with Artificial Intelligence Techniques

Authors: Sophia Liang Zhou

Abstract:

For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about the macroeconomic determinants of housing prices and have largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five metropolitan areas representing different market trends and compared three time-lag situations: no lag, 6-month lag, and 12-month lag. Linear regression (LR), random forest (RF), and artificial neural networks (ANN) were employed to model the real estate price using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods to compensate for missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both ANN and LR methods generated predictive models with high accuracy (>95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also shown that the technique used to compensate for missing values in the dataset and the implementation of a time lag can have a significant influence on model performance and require further investigation. The best performing models varied for each area, but the backfilled 12-month-lag LR models and the interpolated no-lag ANN models showed the most stable performance overall, with accuracies >95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies in identifying the optimal time lag and data imputation methods for establishing accurate predictive models.
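The two gap-filling strategies compared in the study can be sketched as follows: backfill copies the next known observation into the gap, while linear interpolation spreads the change evenly across the gap. The quarterly-to-monthly series below is hypothetical, and real pipelines would use a library routine rather than this hand-rolled version.

```python
# Backfill vs. linear interpolation for gaps (None) in a numeric series.
def backfill(series):
    """Fill each gap with the next known value (scanning right to left)."""
    out = list(series)
    nxt = None
    for i in range(len(out) - 1, -1, -1):
        if out[i] is None:
            out[i] = nxt
        else:
            nxt = out[i]
    return out

def interpolate(series):
    """Fill each gap by linear interpolation between its neighbours."""
    out = list(series)
    for i, v in enumerate(out):
        if v is None:
            lo = max(j for j in range(i) if out[j] is not None)
            hi = min(j for j in range(i + 1, len(out)) if series[j] is not None)
            frac = (i - lo) / (hi - lo)
            out[i] = out[lo] + frac * (series[hi] - out[lo])
    return out

# A quarterly indicator placed into monthly slots (values hypothetical)
quarterly_gdp = [100.0, None, None, 112.0]
bf = backfill(quarterly_gdp)      # [100, 112, 112, 112]
ip = interpolate(quarterly_gdp)   # [100, 104, 108, 112]
```

The two fills feed a model quite different inputs, which is consistent with the study's observation that the imputation choice affects model performance.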

Keywords: linear regression, random forest, artificial neural network, real estate price prediction

Procedia PDF Downloads 103
21040 Neural Nets Based Approach for 2-Cells Power Converter Control

Authors: Kamel Laidi, Khelifa Benmansour, Ouahid Bouchhida

Abstract:

A neural-network-based approach for a two-cell series converter has been developed and implemented. The approach is based on a behavioural description of the different operating modes of the converter. Each operating mode represents a well-defined configuration, to which an operating zone is matched, satisfying given invariance conditions that depend on the capacitor voltages and the load current of the converter. A control vector, whose components are the control signals to be applied to the converter switches, is associated with each mode. Therefore, the problem is reduced to a classification task over the different operating modes of the converter. The artificial neural network approach, which constitutes a powerful tool for this kind of task, has been adopted and implemented. Application to a two-cell chopper ensured efficient and robust control of the load current and accurate balancing of the capacitor voltages.
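The behavioural description above reduces control to a classification-plus-lookup step. The sketch below is a schematic rule-based stand-in for the trained network; the zone names, sign conventions, and control vectors are illustrative assumptions, not the paper's invariance conditions.

```python
# Schematic mode-classification control for a two-cell converter:
# classify the operating zone from the capacitor-voltage and load-current
# errors, then look up the switch-control vector (u1, u2).
CONTROL_VECTORS = {
    "charge_cap":    (1, 0),  # in this sketch: load current charges the capacitor
    "discharge_cap": (0, 1),  # load current discharges the capacitor
    "freewheel":     (0, 0),  # no voltage applied; current decays
}

def control(vc_ref, vc, i_ref, i_load):
    vc_err = vc_ref - vc      # > 0: capacitor undercharged
    i_err = i_ref - i_load    # > 0: more load current needed
    if i_err > 0:
        zone = "charge_cap" if vc_err > 0 else "discharge_cap"
    else:
        zone = "freewheel"
    return CONTROL_VECTORS[zone]

u = control(vc_ref=50.0, vc=45.0, i_ref=10.0, i_load=8.0)
```

A neural classifier would replace the if/else with a learned decision boundary over the same (voltage error, current error) inputs.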

Keywords: neural nets, control, multicellular converters, 2-cells chopper

Procedia PDF Downloads 834
21039 Virtual Process Hazard Analysis (PHA) of a Nuclear Power Plant (NPP) Using the Failure Mode and Effects Analysis (FMEA) Technique

Authors: Lormaine Anne A. Branzuela, Elysa V. Largo, Monet Concepcion M. Detras, Neil C. Concibido

Abstract:

The electricity demand is still increasing, and currently, the Philippine government is investigating the feasibility of operating the Bataan Nuclear Power Plant (BNPP) to address the country's energy problem. However, the lack of process safety studies on BNPP focused on the effects of hazardous substances on the integrity of the structure, equipment, and other components has made the plant's operationalization questionable to the public. The three major nuclear power plant incidents - TMI-2, Chernobyl, and Fukushima - have made many people hesitant to include nuclear energy in the energy matrix. This study focused on the safety evaluation of the possible operation of a nuclear power plant installed with a Pressurized Water Reactor (PWR), which is similar to BNPP. Failure Mode and Effects Analysis (FMEA) is one of the Process Hazard Analysis (PHA) techniques used for the identification of equipment failure modes and for minimizing their consequences. Using the FMEA technique, this study was able to recognize 116 different failure modes in total. Upon computation and ranking of the risk priority number (RPN) and criticality rating (CR), failure of the reactor coolant pump due to earthquakes emerged as the most critical failure mode. This hazard scenario could lead to a nuclear meltdown and radioactive release, as identified by the FMEA team. Safeguards and recommended risk reduction strategies to lower the RPN and CR were identified such that the effects are minimized, the likelihood of occurrence is reduced, and failure detection is improved.
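The FMEA ranking arithmetic is straightforward to reproduce: each failure mode is scored for severity, occurrence, and detection (typically 1-10 each), and the risk priority number is their product. The entries and scores below are illustrative placeholders, not the study's 116 modes or its actual ratings.

```python
# Rank failure modes by RPN = Severity x Occurrence x Detection.
# (name, severity, occurrence, detection) -- all scores hypothetical.
failure_modes = [
    ("reactor coolant pump failure (seismic)", 10, 6, 7),
    ("instrument air line leak",                4, 5, 3),
    ("feedwater control valve stuck",           7, 3, 4),
]

def rpn(severity, occurrence, detection):
    return severity * occurrence * detection

ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
top_mode, top_rpn = ranked[0][0], rpn(*ranked[0][1:])
```

Risk reduction then targets the factors behind the top-ranked modes: safeguards lower severity, preventive maintenance lowers occurrence, and added instrumentation lowers the detection score.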

Keywords: PHA, FMEA, nuclear power plant, Bataan Nuclear Power Plant

Procedia PDF Downloads 131
21038 Enhancing Signal Reception in a Mobile Radio Network Using Adaptive Beamforming Antenna Arrays Technology

Authors: Ugwu O. C., Mamah R. O., Awudu W. S.

Abstract:

This work is aimed at enhancing signal reception and minimizing outage probability in a mobile radio network using adaptive beamforming antenna arrays. In this research work, an empirical real-time drive measurement was done in a cellular network of Globalcom Nigeria Limited located at Ikeja, the headquarters of Lagos State, Nigeria, with reference base station number KJA 004. The empirical measurement included the Received Signal Strength and Bit Error Rate, which were recorded for exact prediction of the signal strength of the network at the time this research was carried out. The Received Signal Strength and Bit Error Rate were measured with a spectrum monitoring van with the help of a ray tracer at intervals of 100 meters, up to 700 meters from the transmitting base station. The distance and angular location measurements from the reference network were done with the help of the Global Positioning System (GPS). The other equipment used were transmitting equipment measurement software (Temsoftware), laptops, and log files, which showed received signal strength versus distance from the base station. Results obtained from the real-time experiment were about 11%, showing that mobile radio networks are prone to signal failure, which can be minimized using an adaptive beamforming antenna array through a significant reduction in Bit Error Rate, implying improved performance of the mobile radio network. In addition, this work not only included experiments done through empirical measurement but also enhanced mathematical models that were developed and implemented as a reference model for accurate prediction. The proposed signal models were based on the analysis of continuous time and discrete space, and some other assumptions. These developed (proposed) enhanced models were validated using the MATLAB (version 7.6.3.35) program and compared with a conventional antenna for accuracy. These outage models were used to manage the blocked-call experience in the mobile radio network. A 20% improvement was obtained when the adaptive beamforming antenna arrays were implemented on the wireless mobile radio network.
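The gain mechanism behind the measured improvement can be sketched with a uniform linear array: per-element phase weights steer the main lobe toward the desired user while attenuating off-axis directions. The snippet assumes half-wavelength element spacing and is a textbook illustration, not the authors' enhanced model.

```python
# Normalized power gain of an N-element, lambda/2-spaced uniform linear
# array whose phase weights steer the main lobe to steer_deg.
import cmath
import math

def array_gain(n, steer_deg, arrive_deg):
    # Phase progression per element for a wave arriving at arrive_deg,
    # relative to the steering weights (lambda/2 spacing assumed).
    psi = math.pi * (math.sin(math.radians(arrive_deg))
                     - math.sin(math.radians(steer_deg)))
    af = sum(cmath.exp(1j * k * psi) for k in range(n))  # array factor
    return abs(af) ** 2 / n ** 2                         # normalized power

g_main = array_gain(8, 30.0, 30.0)   # toward the desired user: full gain
g_off  = array_gain(8, 30.0, -20.0)  # toward an off-axis interferer
```

An adaptive beamformer updates these weights on the fly from the received signal, which is what reduces the Bit Error Rate relative to a fixed conventional antenna.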

Keywords: beamforming algorithm, adaptive beamforming, simulink, reception

Procedia PDF Downloads 41
21037 Application of Building Information Modelling in Analysing IGBC® Ratings (Sustainability Analyses)

Authors: Lokesh Harshe

Abstract:

The building construction sector accounts for 36% of global energy consumption and 39% of CO₂ emissions. Professionals in the built environment sector have long been aware of the industry's contribution to CO₂ emissions and are now moving towards more sustainable practices. As a result, many organizations have introduced rating systems that address global warming in the construction sector by ranking construction projects based on sustainability parameters. The pre-construction phase of any building project is the most essential time to make decisions addressing sustainability aspects. Traditionally, it is very difficult to collect data from different stakeholders and bring it together to reach a decision based on factual data when performing sustainability analyses in the pre-construction phase. Building Information Modelling (BIM) is the solution: a single model results from the collaborative BIM process, in which all information is shared, extracted, communicated, and stored on a single platform that everyone can access and use to make decisions based on real-time data. The focus of this research is on the Indian green rating system IGBC®, with the objective of understanding IGBC® requirements and developing a framework that relates the rating processes to BIM. A hypothetical (architectural) model of a hostel building is developed using AutoCAD 2019 and Revit Arch. 2019, and the framework is applied to generate sustainability analysis results using Green Building Studio (GBS) and Revit add-ins. The results of any sustainability analysis are generated within a fraction of a minute, which is very quick in comparison with traditional sustainability analysis and may save a considerable amount of time as well as cost. The future scope is to integrate architectural, structural, and MEP models to perform accurate sustainability analyses with inputs from industry professionals working on real-life Green BIM projects.

Keywords: sustainability analyses, BIM, green rating systems, IGBC®, LEED

Procedia PDF Downloads 54
21036 Object Recognition System Operating from Different Types of Vehicles Using Raspberry and OpenCV

Authors: Maria Pavlova

Abstract:

Nowadays, it is possible to mount a camera on different vehicles such as quadcopters, trains, airplanes, etc. The camera can also be the input sensor in many different systems, which means that object recognition, as an inseparable part of monitoring and control, can be a key part of most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. During the vehicle's movement, the camera takes pictures of the environment without storing them in a database. If the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to the workstation in real time. This functionality is very useful in emergency or security situations where it is necessary to find a specific object. In another application, the camera can be mounted at a crossroad with few pedestrians: if one or more persons approach the road, the traffic light turns green so that they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system is presented, including the camera, the Raspberry Pi platform, a GPS system, a neural network, software, and a database. The camera in the system takes the pictures, and the object recognition is done in real time using the OpenCV library on the Raspberry Pi. An additional feature of the system is the ability to record the GPS coordinates of the captured objects' positions. The results of these processes are sent to a remote station, so the location of the specific object is known. Through the neural network, the module can learn to solve problems using incoming data and become part of a bigger intelligent system. The present paper focuses on the design and integration of image recognition as a part of smart systems.
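As a simplified stand-in for the detection step, the sketch below performs frame differencing on tiny grey-scale "frames" in pure Python; the real system would use OpenCV capture on the Raspberry Pi, with the detection report carrying the GPS fix to the remote station. The thresholds, frame data, and coordinates are hypothetical.

```python
# Toy detect-and-report loop: flag a frame whose pixels changed enough
# from the previous one, attach the GPS fix, and discard quiet frames.
def frame_diff(prev, curr, pix_threshold=25):
    """Count pixels whose grey level changed by more than pix_threshold."""
    return sum(1 for a, b in zip(prev, curr) if abs(a - b) > pix_threshold)

def detect_and_report(prev, curr, gps, changed_min=3):
    if frame_diff(prev, curr) >= changed_min:
        # In the real system this report would be saved and transmitted.
        return {"event": "object_detected", "gps": gps}
    return None  # nothing saved; frame is discarded

background  = [10, 12, 11, 10, 13, 12]
with_object = [10, 90, 95, 88, 13, 12]
report = detect_and_report(background, with_object, gps=(42.70, 23.32))
```

The described system replaces this crude differencing with OpenCV-based recognition (and a neural network for classification), but the save-and-forward control flow is the same.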

Keywords: camera, object recognition, OpenCV, Raspberry

Procedia PDF Downloads 218
21035 The Effect of Opening on Mode Shapes and Frequencies of Composite Shear Wall

Authors: A. Arabzadeh, H. R. Kazemi Nia Korrani

Abstract:

A composite steel plate shear wall is a lateral-load-resisting system used especially in tall buildings. This wall is made of a thin steel plate with a reinforced concrete cover attached to one or both sides of the steel plate. The system is similar to a stiffened steel plate shear wall, in which reinforced concrete replaces the steel stiffeners. Composite shear walls have significant in-plane and out-of-plane strength, as well as appropriate ductility. The present numerical investigations focused on the effects of an opening on the wall mode shapes. In addition, the frequencies of a composite shear wall with and without an opening were compared. For analyzing the composite shear wall, a new program was developed using finite element theory, and the effects of the shape, size, and position of openings on the behavior of the composite shear wall were studied. Results indicated that the existence of an opening decreases the wall frequency.
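The reported trend, that an opening lowers the wall frequencies, can be checked on a toy two-degree-of-freedom shear model, where the opening is represented crudely as a reduction of storey stiffness. The numbers are hypothetical and unrelated to the paper's finite element model.

```python
# Natural frequencies (rad/s) of a 2-storey shear model with equal
# storey mass m and storey stiffness k. With K = [[2k,-k],[-k,k]] and
# M = diag(m,m), det(K - w^2 M) = 0 reduces to
# m^2 L^2 - 3 m k L + k^2 = 0 with L = w^2.
import math

def modal_freqs_2dof(m, k):
    disc = k * math.sqrt(5.0)                 # sqrt((3k)^2 - 4k^2)
    w1 = math.sqrt((3 * k - disc) / (2 * m))  # first (lower) mode
    w2 = math.sqrt((3 * k + disc) / (2 * m))  # second mode
    return w1, w2

solid = modal_freqs_2dof(m=1.0, k=1.0)
with_opening = modal_freqs_2dof(m=1.0, k=0.7)  # opening cuts stiffness
```

Since frequency scales with the square root of stiffness over mass, any stiffness loss from an opening (at roughly unchanged mass) drops every modal frequency, which matches the paper's finding.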

Keywords: composite shear wall, opening, finite element method, modal analysis

Procedia PDF Downloads 540
21034 Quality-Of-Service-Aware Green Bandwidth Allocation in Ethernet Passive Optical Network

Authors: Tzu-Yang Lin, Chuan-Ching Sue

Abstract:

Sleep mechanisms are commonly used to ensure the energy efficiency of each optical network unit (ONU) in the Ethernet Passive Optical Network (EPON), subject to a single-class delay constraint. How long the ONUs can sleep without violating the delay constraint has become a research problem. In particular, an analytical model can be derived to determine the optimal sleep time of the ONUs in every cycle without violating the maximum class delay constraint. Bandwidth allocation that accounts for this optimal sleep time is called Green Bandwidth Allocation (GBA). Although the GBA mechanism guarantees that the maximum class delay constraint is not violated, packets with a more relaxed delay constraint are treated as if they had the most stringent delay constraint and may be sent early. This means that the ONU wastes energy in active mode by sending, ahead of time, packets that did not yet need to be sent. Accordingly, we propose a QoS-aware GBA that uses a novel intra-ONU scheduling scheme to release packets according to their respective delay constraints, thereby enhancing energy efficiency without deteriorating delay performance. If packets are not explicitly classified but carry individual packet delay constraints, the intra-ONU scheduling can be modified to sort packets by their packet delay constraints rather than by class. Moreover, we propose a switchable ONU architecture in which the ONU switches its architecture according to the sleep time length, further improving the energy efficiency of the QoS-aware GBA. The simulation results show that the QoS-aware GBA ensures that packets in different classes, or with different delay constraints, do not violate their respective delay constraints, while consuming less power than the original GBA.
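The intra-ONU scheduling idea (release a packet only when its own delay bound requires it, so the ONU can keep sleeping) can be sketched as an earliest-deadline-first selection. This is a minimal illustration under assumed per-packet delay bounds, not the paper's algorithm.

```python
import heapq

def packets_to_send(queue, now, window):
    """Earliest-deadline-first selection: transmit only the packets whose
    deadline (arrival time + delay bound) would expire before the end of
    the current active window; the rest stay queued so the ONU can remain
    in sleep mode longer without violating any delay constraint."""
    heap = [(arrival + bound, pkt) for pkt, arrival, bound in queue]
    heapq.heapify(heap)
    due = []
    while heap and heap[0][0] <= now + window:
        due.append(heapq.heappop(heap)[1])
    return due

# Three queued packets, each as (id, arrival time, per-packet delay bound):
queue = [("a", 0.0, 5.0), ("b", 2.0, 28.0), ("c", 4.0, 8.0)]
urgent = packets_to_send(queue, now=0.0, window=10.0)
```

With a 10-unit active window only packet "a" (deadline 5) must go out now; "b" and "c" can wait through at least one more sleep period.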

Keywords: passive optical network (PON), optical network unit (ONU), energy efficiency, delay constraint

Procedia PDF Downloads 284
21033 The Impacts of Digital Marketing Activities on Customers' Purchase Intention via Brand Reputation and Awareness: Empirical Study

Authors: Radwan Al Dwairi, Sara Melhem

Abstract:

Today, billions of individuals are linked together in real time by different types of social platforms. Despite the increasing importance of social media marketing activities in enhancing customers' intention to purchase online, most research has concentrated on the impact of such tools on customer satisfaction or retention, neglecting their real role in enhancing brand reputation and awareness, which in turn affect customers' intention to purchase online. In response, this study aims to close this gap through an empirical study based on data collected from 216 respondents in this domain. The results reveal a significant impact of word-of-mouth, interactions, and influencers on brand reputation, and brand reputation in turn positively and significantly affects customers' intention to purchase via social platforms.

Keywords: brand awareness, brand reputation, EWOM, influencers, interaction

Procedia PDF Downloads 95
21032 Drive Sharing with Multimodal Interaction: Enhancing Safety and Efficiency

Authors: Sagar Jitendra Mahendrakar

Abstract:

Exploratory testing is a dynamic and adaptable method of software quality assurance, frequently praised for its ability to find hidden flaws and improve the overall quality of the product. Instead of following preset test cases, exploratory testing allows testers to explore the software application dynamically; in contrast to scripted testing methodologies, it relies primarily on tester intuition, creativity, and adaptability. Several tools and techniques can aid testers in the exploratory testing process, and these are discussed here. Tests of this kind can find bugs that are harder to uncover during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development practice. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting potential problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by allowing testers to make decisions in real time based on their observations. This abstract also clarifies the distinctive features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Through impromptu exploration, testers can find intricate bugs, usability problems, and edge cases in software that might otherwise go undetected. The flexible and iterative structure of exploratory testing fits in well with agile and DevOps processes, allowing a quicker time to market without sacrificing the quality of the final product.

Keywords: exploratory, testing, automation, quality

Procedia PDF Downloads 51
21031 Price Compensation Mechanism with Unmet Demand for Public-Private Partnership Projects

Authors: Zhuo Feng, Ying Gao

Abstract:

Public-private partnership (PPP), an innovative way of providing infrastructure through the private sector, is widely used throughout the world. Compared with the traditional mode, PPP has emerged largely for its merits of relieving public budget constraints and improving the efficiency of infrastructure supply by involving private funds. However, PPP projects are characterized by large scale, high investment, long payback periods, and long concession periods. These characteristics make PPP projects full of risks. One of the most important risks faced by the private sector is demand risk, because many factors affect the real demand. If the real demand is far lower than the forecast demand, the private sector can run into serious difficulty, because operating revenue is its main means of recouping the investment and earning a profit. Therefore, it is important to study how the government should compensate the private sector when the demand risk materializes, in order to achieve a Pareto improvement. This research focuses on the price compensation mechanism, an ex-post compensation mechanism, and analyzes, through mathematical modeling, its impact on the payoff of the private sector and on consumer surplus for PPP toll road projects. The research first investigates whether price compensation mechanisms can achieve a Pareto improvement and, if so, explores the boundary conditions for this mechanism. The results show that the price compensation mechanism can realize a Pareto improvement under certain conditions. In particular, for the price compensation mechanism to accomplish a Pareto improvement, the renegotiation costs of the government and the private sector should be lower than a certain threshold, which is determined by the marginal operating cost and the distortionary cost of the tax. In addition, the compensation percentage should match the price cut of the private investor when demand drops.
This research aims to provide theoretical support for the government when determining the scope of compensation under the price compensation mechanism. Moreover, some policy implications can also be drawn from the analysis for better risk-sharing and sustainability of PPP projects.

Keywords: infrastructure, price compensation mechanism, public-private partnership, renegotiation

Procedia PDF Downloads 179
21030 BIM Modeling of Site and Existing Buildings: Case Study of ESTP Paris Campus

Authors: Rita Sassine, Yassine Hassani, Mohamad Al Omari, Stéphanie Guibert

Abstract:

Building Information Modelling (BIM) is the process of creating, managing, and centralizing information during the building lifecycle. BIM can be used throughout a construction project, from the initiation phase through the planning and execution phases to the maintenance and lifecycle management phase. For existing buildings, BIM can be used for specific applications such as lifecycle management. However, most existing buildings do not have a BIM model, and creating a compatible BIM model for them is very challenging: it requires special equipment for data capture and considerable effort to convert the captured data into a BIM model. The main difficulties in such projects are defining the data needed, the level of development (LOD), and the methodology to be adopted. Beyond managing information for an existing building, studying the impact of the built environment is also a challenging topic, so integrating the terrain that surrounds the buildings into the digital model is essential for running simulations such as flood and energy simulations. Replicating the physical asset and updating its information in real time to create its Digital Twin (DT) is therefore very important. The Digital Terrain Model (DTM) represents the ground surface of the terrain by a set of discrete points with unique height values over 2D points, relative to a reference surface (e.g., mean sea level, geoid, or ellipsoid). In addition, information on pavement materials, vegetation types and heights, and damaged surfaces can be integrated. Our aim in this study is to define the methodology for producing a 3D BIM model of the site and the existing buildings, based on the case study of the "Ecole Spéciale des Travaux Publics (ESTP Paris)" engineering school campus. The property is located on a hilly site of 5 hectares and comprises more than 20 buildings with a total area of 32,000 square meters and heights between 50 and 68 meters.
In this work, the campus precise levelling grid in the NGF-IGN69 altimetric system and the grid control points in the French Réseau Géodésique Français (RGF93) - Lambert 93 system are computed with different methods: (i) land topographic surveying using a robotic total station, (ii) a GNSS (Global Navigation Satellite System) levelling grid in NRTK (Network Real Time Kinematic) mode, and (iii) point clouds generated by laser scanning. These technologies allow the computation of multiple building parameters, such as the boundary limits, the number of floors, the georeferencing of the floors, and the georeferencing of the four base corners of each building. Once the input data are identified, the digital model of each building is produced, and the DTM is modeled as well. The process of altimetric determination is complex and requires effort to collect and analyze multiple data formats. Since many technologies can be used to produce digital models, different file formats are generated, such as DraWinG (DWG), LASer (LAS), Comma-Separated Values (CSV), Industry Foundation Classes (IFC), and Revit (RVT), so checking the interoperability between the BIM models is very important. In this work, all models are linked together and shared on the 3DEXPERIENCE collaborative platform.
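A DTM of the kind described above is, at its simplest, a regular grid of spot heights that is interpolated whenever a height is needed at an arbitrary 2D point. A minimal bilinear-interpolation sketch follows; the grid spacing and heights are made-up values, not survey data from the campus.

```python
def dtm_height(grid, dx, x, y):
    """Bilinear interpolation on a regular DTM: grid[i][j] holds the
    measured height at easting j*dx, northing i*dx (a simplification of
    the georeferenced levelling grids described above)."""
    j, i = x / dx, y / dx
    j0, i0 = int(j), int(i)
    tx, ty = j - j0, i - i0
    h00, h01 = grid[i0][j0], grid[i0][j0 + 1]
    h10, h11 = grid[i0 + 1][j0], grid[i0 + 1][j0 + 1]
    return (h00 * (1 - tx) * (1 - ty) + h01 * tx * (1 - ty)
            + h10 * (1 - tx) * ty + h11 * tx * ty)

# A 2 x 2 patch of spot heights (metres) spaced 10 m apart:
patch = [[50.0, 52.0],
         [54.0, 68.0]]
```

Querying the centre of the patch returns the average of the four corner heights, and querying a grid node returns the measured value itself.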

Keywords: building information modeling, digital terrain model, existing buildings, interoperability

Procedia PDF Downloads 112
21029 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows

Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican

Abstract:

This paper outlines the design of a simulator that allows clinical workflows through a pathology laboratory to be optimised and improves the laboratory's efficiency in the processing, testing, and analysis of specimens. Pathologists often have difficulty pinpointing and anticipating issues in the clinical workflow until tests are running late or in error; it can be difficult to pinpoint the cause and even more difficult to predict issues before they arise. For example, they often have no indication of how many samples will be delivered to the laboratory on a given day or at a given hour. If scenarios could be modelled using past information and known variables, pathology laboratories could initiate resource preparations, e.g., printing specimen labels or rostering a sufficient number of technicians. This would expedite the clinical workload and processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e., the clinical tests being ordered, the specimens arriving, the current tests being performed, the results being validated, and the reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage. This movement is visualised using an animated flow diagram that is updated in real time. A traffic-light colour-coding system indicates the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow), allowing pathologists to see clearly where there are issues and bottlenecks in the process. Graphs also indicate the status of specimens at each stage of the process; for example, a graph could show the percentage of specimen tests that are on time, potentially late, running late, and in error.
Clicking on potentially late samples displays more detailed information about those samples, the tests that still need to be performed on them, and their urgency level. This allows issues to be resolved quickly and, in the case of potentially late samples, helps to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory, and JavaScript will be used to program the logic, animate the movement of samples through each of the stages, and generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. 'Bots' would control the flow of specimens through each step of the process; like existing software-agent technologies, these bots would be configurable to simulate different situations that may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at each step of the process, for example validating test results.
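The traffic-light coding described above amounts to mapping a stage's flow statistic to one of three colours. A minimal sketch follows, with threshold values that are assumptions rather than figures from the paper (and in Python rather than the JavaScript the simulator itself would use):

```python
def stage_colour(on_time_fraction):
    """Traffic-light code for one workflow stage, driven by the share of
    on-time specimens at that stage. The 0.90/0.70 thresholds below are
    illustrative assumptions, not values from the paper."""
    if on_time_fraction >= 0.90:
        return "green"   # normal flow
    if on_time_fraction >= 0.70:
        return "orange"  # slow flow
    return "red"         # critical flow

# Hypothetical snapshot of three stages of the laboratory workflow:
dashboard = {stage: stage_colour(frac)
             for stage, frac in [("reception", 0.97), ("testing", 0.78),
                                 ("validation", 0.55)]}
```

In the simulator, re-evaluating this mapping on every update is what keeps the animated flow diagram's colours current.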

Keywords: laboratory-process, optimization, pathology, computer simulation, workflow

Procedia PDF Downloads 286
21028 Experimental and Theoretical Methods to Increase Core Damping for a Sandwich Cantilever Beam

Authors: Iyd Eqqab Maree, Moouyad Ibrahim Abbood

Abstract:

The purpose of this study is to predict the damping effect for a steel cantilever beam using two methods of passive viscoelastic constrained-layer damping. The first method is a MATLAB program based on the Ross-Kerwin-Ungar (RKU) model for passive viscoelastic damping. The second method is a laboratory experiment (frequency-domain method) in which the half-power bandwidth method is used to determine the system loss factors of the damped steel cantilever beam. The RKU method is applied to a cantilever beam because the beam is a major part of a structure, and the prediction can be extended to different kinds of structural applications according to design requirements in many industries. In this damping treatment, a simple cantilever beam is converted into a sandwich structure, usually with a viscoelastic material as the core, to provide the damping effect. The use of viscoelastic layers constrained between elastic layers is known to be effective for damping the flexural vibrations of structures over a wide range of frequencies. The energy dissipated in these arrangements is due to shear deformation in the viscoelastic layers, which occurs because of the flexural vibration of the structure. The theory of dynamic stability of elastic systems deals with the study of vibrations induced by pulsating loads that are parametric with respect to certain forms of deformation. There is very good agreement between the experimental results and the theoretical findings. The main aims of this work are to find the transition region for the damped steel cantilever beam (4 mm and 8 mm thickness) from the laboratory experiments and from the theoretical prediction (MATLAB R2011a).
It is shown both experimentally and theoretically that the transition region for the two specimens occurs at a modal frequency between mode 1 and mode 2, which gives the best damping, the maximum loss factor, and the maximum damping ratio; this viscoelastic core material (3M 468) is therefore very appropriate for the automotive industry and for any mechanical application whose modal frequencies fall between mode 1 and mode 2.
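The half-power bandwidth estimate used in the experimental method can be sketched directly: locate the resonance peak, find the two frequencies where the response falls to peak amplitude divided by sqrt(2), and take eta = (f2 - f1) / fn. The synthetic frequency response below assumes a hysteretically damped resonance with eta = 0.05; it is illustrative, not the paper's measured data.

```python
import math

def half_power_loss_factor(freqs, amps):
    """Half-power (-3 dB) bandwidth estimate of the loss factor from a
    measured frequency response: eta = (f2 - f1) / fn, where f1 and f2
    bracket the resonance at amplitude peak/sqrt(2), located here by
    linear interpolation between samples."""
    i = max(range(len(amps)), key=amps.__getitem__)
    half = amps[i] / math.sqrt(2.0)
    lo = i
    while lo > 0 and amps[lo] > half:       # walk left to the crossing
        lo -= 1
    f1 = freqs[lo] + (freqs[lo + 1] - freqs[lo]) * (half - amps[lo]) / (amps[lo + 1] - amps[lo])
    hi = i
    while hi < len(amps) - 1 and amps[hi] > half:  # walk right
        hi += 1
    f2 = freqs[hi - 1] + (freqs[hi] - freqs[hi - 1]) * (half - amps[hi - 1]) / (amps[hi] - amps[hi - 1])
    return (f2 - f1) / freqs[i]

# Synthetic FRF with hysteretic loss factor eta = 0.05 around 100 Hz:
fn, eta = 100.0, 0.05
freqs = [90.0 + 0.01 * k for k in range(2001)]
amps = [1.0 / math.sqrt((1 - (f / fn) ** 2) ** 2 + (eta * (f / fn)) ** 2)
        for f in freqs]
eta_est = half_power_loss_factor(freqs, amps)
```

For light damping the damping ratio follows as zeta = eta / 2, which is how the loss factor and damping ratio quoted in the abstract relate.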

Keywords: 3M 468 core material, loss factor, frequency domain method, MATLAB

Procedia PDF Downloads 271
21027 The Effect of Manure Loaded Biochar on Soil Microbial Communities

Authors: T. Weber, D. MacKenzie

Abstract:

This paper describes an advanced simulation environment based on electronic systems (a microcontroller, operational amplifiers, and an FPGA). The simulation is used for the behaviour of non-linear dynamic systems with the required observer structure, working with parallel real-time simulation based on a state-space representation. The proposed model also covers electrodynamic effects, including ionising effects and the eddy-current distribution. With the proposed method, it is possible to calculate the spatial distribution of the electromagnetic fields in such systems in real time; the spatial temperature distribution may also be used for further purposes. With this system, uncertainties and disturbances can be determined. This provides a more precise estimation of the system states and, additionally, an estimation of the ionising disturbances that arise from radiation effects in space systems. The results have also shown that a system can be developed specifically for the real-time calculation (estimation) of the radiation effects alone. Electronic systems can be damaged by impacts with the charged-particle flux in space or in a radiation environment. A total ionising dose (TID) of 1 Gy and single-event-transient (SET)-free operation up to 50 MeV·cm²/mg may assure certain functions. Single-event latch-up (SEL) results from the placement of several transistors in the shared substrate of an integrated circuit: ionising radiation can activate an additional parasitic thyristor, and this short circuit between semiconductor elements can destroy the device in the absence of protective measures. Single-event burnout (SEB), on the other hand, increases the current between the drain and source of a MOSFET and destroys the component in a short time. A single-event gate rupture (SEGR) can likewise destroy the gate dielectric of a semiconductor.
To be able to react to these processes, the presence of ionising radiation and the dose must be calculated within a short time. For this purpose, sensors can be used for a realistic evaluation of the diffusion and ionising effects in the test system: a Peltier element is used to evaluate dynamic temperature increases (dT/dt), from which a measure of the ionisation processes, and thus of the radiation, is derived, while a piezo element can record highly dynamic vibrations and oscillations caused by impacts of the charged-particle flux. All available sensors are also used to calibrate the spatial distributions: from the measured values and the known locations of the sensors, the entire spatial distribution can be reconstructed retroactively and more accurately. From this information, the type of ionisation and its direct effect on the system can be determined, and preventive measures, up to and including shutdown, can be activated. The results show that faster and higher-quality simulations can be performed independently of the space system and of the radiation environment. The paper additionally gives an overview of the diffusion effects and their mechanisms.
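The observer structure mentioned above can be illustrated with the simplest possible case: a scalar discrete-time Luenberger observer whose estimation error decays geometrically. The plant model and the gains below are illustrative assumptions, not the paper's system.

```python
def observer_run(a, b, c, l, u_seq, x0, xhat0):
    """Scalar discrete-time Luenberger observer sketch. Plant:
    x[k+1] = a*x[k] + b*u[k], y[k] = c*x[k]. The observer corrects its
    estimate with gain l on the output error, so the estimation error
    obeys e[k+1] = (a - l*c) * e[k] and decays geometrically."""
    x, xhat, errors = x0, xhat0, []
    for u in u_seq:
        y = c * x                              # measurement of the real plant
        xhat = a * xhat + b * u + l * (y - c * xhat)   # observer update
        x = a * x + b * u                      # plant update
        errors.append(abs(x - xhat))
    return errors

# The initial estimate is wrong by 1.0; with a - l*c = 0.4 the error
# should shrink by a factor 0.4 every step.
errs = observer_run(a=0.9, b=0.1, c=1.0, l=0.5, u_seq=[1.0] * 40,
                    x0=1.0, xhat0=0.0)
```

A real-time implementation would run this update in lockstep with the sensor sampling, which is the role the parallel state-space simulation plays above.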

Keywords: cattle, biochar, manure, microbial activity

Procedia PDF Downloads 103
21026 The Impact of Foreign Direct Investment on the Economic Growth of Ethiopia: An Econometric Cointegration Analysis

Authors: Dejene Gizaw Kidane

Abstract:

This study examines the impact of foreign direct investment (FDI) on the economic growth of Ethiopia using yearly time-series data for 1974 through 2013. Economic growth is proxied by real per capita gross domestic product, and foreign direct investment is proxied by FDI inflows. Other control variables, such as gross domestic saving, trade, government consumption, and inflation, have been incorporated. In order to fully account for feedbacks, a vector autoregressive model is utilized. The results show that there is a stable, long-run relationship between foreign direct investment and economic growth. The variance decomposition results show that variations in Ethiopia's economic growth are due largely to its own shocks. The pairwise Granger causality results show a unidirectional causality running from FDI to economic growth. Since FDI facilitates economic growth, the researcher therefore recommends that the government exert much effort to attract more FDI into the country.

Keywords: real per capita GDP, FDI, co-integration, VECM, Granger causality

Procedia PDF Downloads 436
21025 Ultrastrong Coupling of CdZnS/ZnS Quantum Dots and Breathing Plasmons in Aluminum Metal-Insulator-Metal Nanocavities in Near-Ultraviolet Spectrum

Authors: Li Li, Lei Wang, Chenglin Du, Mengxin Ren, Xinzheng Zhang, Wei Cai, Jingjun Xu

Abstract:

Strong coupling between the excitons of quantum dots and the plasmons in nanocavities can be realized at room temperature thanks to the strong confinement of the plasmon fields, offering building blocks for quantum information systems and for ultralow-power switches and lasers. In this work, using cathodoluminescence, ultrastrong coupling with a Rabi splitting above 1 eV between breathing plasmons in an aluminum metal-insulator-metal (MIM) cavity and the excited state of CdZnS/ZnS quantum dots is reported in the near-UV spectrum. Analytical calculations and full-wave electromagnetic simulations provide evidence for the strong coupling and confirm the hybridization of the QD exciton and the LSP breathing mode. This study opens the way to new emerging applications based on strongly coupled light-matter states over the whole visible region and down to ultraviolet wavelengths.
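The reported Rabi splitting follows the standard two-level coupled-oscillator picture: diagonalising the 2x2 Hamiltonian of the exciton-plasmon pair gives upper and lower polariton branches separated by 2g on resonance. The energies and coupling strength below are illustrative numbers chosen so the splitting exceeds the ~1 eV of the abstract, not fitted values from the experiment.

```python
import math

def polariton_energies(e_exciton, e_plasmon, g):
    """Eigenvalues (eV) of the coupled-oscillator Hamiltonian
    [[e_exciton, g], [g, e_plasmon]]: the lower and upper polariton
    branches, split by 2*sqrt(g^2 + delta^2/4) at detuning delta."""
    mean = 0.5 * (e_exciton + e_plasmon)
    delta = e_exciton - e_plasmon
    split = math.sqrt(g * g + 0.25 * delta * delta)
    return mean - split, mean + split

# On resonance (illustrative near-UV energies) the splitting is exactly 2g:
lo, hi = polariton_energies(3.4, 3.4, 0.55)
# Detuning only increases the branch separation:
lo_det, hi_det = polariton_energies(3.5, 3.3, 0.55)
```

Sweeping the detuning and watching the branches anticross is the usual experimental signature that the system is in the strong-coupling regime.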

Keywords: breathing mode, plasmonics, quantum dot, strong coupling, ultraviolet

Procedia PDF Downloads 199
21024 A Study on Design for Parallel Test Based on Embedded System

Authors: Zheng Sun, Weiwei Cui, Xiaodong Ma, Hongxin Jin, Dongpao Hong, Jinsong Yang, Jingyi Sun

Abstract:

With the increasing performance and complexity of modern equipment, automatic test systems (ATS) have become widely used for condition monitoring and fault diagnosis. However, a conventional ATS mainly works in serial mode and lacks the ability to test several pieces of equipment at the same time, which leads to low test efficiency and ATS redundancy. Especially when there is a large amount of equipment under test, a conventional ATS cannot meet the requirement of efficient testing. To reduce the support resources and increase test efficiency, we propose a design method for parallel testing based on embedded systems. First, we put forward the general framework of the parallel test system, which comprises a central management system (CMS) and several distributed test subsystems (DTS). We then give a detailed design of the system: the hardware of each DTS is built on an embedded architecture, and the software uses a test program set to improve test adaptability. By deploying the parallel test system, the time to test five devices becomes equal to the time previously needed to test one device. Compared with the conventional test system, the proposed system reduces the equipment size and improves testing efficiency, which is of great significance for putting equipment into operation swiftly. Finally, we take an industrial control system as an example to verify the effectiveness of the proposed method. The results show that the method is reasonable and that the efficiency is improved by up to 500%.
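The headline result (testing five devices in the time previously needed for one) is the generic payoff of running independent test sequences concurrently, which can be sketched with threads standing in for the distributed test subsystems. The sleep call is a stand-in for one device's test sequence; nothing here reproduces the paper's hardware.

```python
import threading
import time

def run_test(device):
    """Stand-in for one device's test sequence on a DTS."""
    time.sleep(0.05)
    return device

def serial_campaign(devices):
    """Conventional ATS: devices are tested one after another."""
    start = time.perf_counter()
    for d in devices:
        run_test(d)
    return time.perf_counter() - start

def parallel_campaign(devices):
    """Parallel test system: one worker per device, all running at once."""
    start = time.perf_counter()
    threads = [threading.Thread(target=run_test, args=(d,)) for d in devices]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return time.perf_counter() - start

devices = list(range(5))
t_serial = serial_campaign(devices)
t_parallel = parallel_campaign(devices)
```

With five 50 ms sequences, the serial campaign takes roughly 250 ms while the threaded one takes about the time of a single device, mirroring the five-to-one reduction reported above.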

Keywords: parallel test, embedded system, automatic test system (ATS), central management system (CMS), distributed test subsystems (DTS)

Procedia PDF Downloads 305
21023 A Hybrid Algorithm for Collaborative Transportation Planning among Carriers

Authors: Elham Jelodari Mamaghani, Christian Prins, Haoxun Chen

Abstract:

This paper concentrates on collaborative transportation planning (CTP) among multiple carriers with pickup and delivery requests and time windows. The problem is a vehicle routing problem with the constraints of standard vehicle routing problems plus new constraints from a real-world application. Each carrier has a finite number of vehicles, and each request is a pickup-and-delivery request with a time window. Moreover, each carrier has reserved requests, which must be served by the carrier itself, whereas its exchangeable requests can be outsourced to, and served by, other carriers. This collaboration among carriers can help them reduce their total transportation costs. A mixed integer programming model is proposed for the problem. To solve the model, a hybrid algorithm that combines a Genetic Algorithm and Simulated Annealing (GASA) is proposed, taking advantage of both metaheuristics at the same time. After tuning the parameters of the algorithm with the Taguchi method, experiments are conducted and experimental results are provided for the hybrid algorithm. The results are compared with those obtained by a commercial solver; the comparison indicates that GASA significantly outperforms the commercial solver.
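The GA-SA hybridisation can be sketched in a few lines: each individual mutates (the GA step), and a worse offspring may still replace its parent with the Metropolis probability exp(-delta/T) as the temperature cools (the SA step). The toy objective and all parameters below are illustrative; the paper applies the idea to a pickup-and-delivery routing model with crossover, which this mutation-only sketch omits.

```python
import math
import random

def gasa_minimize(cost, neighbor, init, pop_size=20, iters=300,
                  t0=1.0, cooling=0.97):
    """Hybrid GA+SA sketch: a population evolves by mutation, and a worse
    offspring can still replace its parent with the simulated-annealing
    acceptance probability exp(-delta/T), which shrinks as T cools."""
    random.seed(42)                          # deterministic for the demo
    pop = [init() for _ in range(pop_size)]
    temp = t0
    for _ in range(iters):
        for i, parent in enumerate(pop):
            child = neighbor(parent)
            delta = cost(child) - cost(parent)
            if delta < 0 or random.random() < math.exp(-delta / temp):
                pop[i] = child               # accept (better, or SA-lucky)
        temp *= cooling
    return min(pop, key=cost)

# Toy 1-D objective standing in for the routing cost; minimum at x = 3:
best = gasa_minimize(
    cost=lambda x: (x - 3.0) ** 2,
    neighbor=lambda x: x + random.uniform(-0.5, 0.5),
    init=lambda: random.uniform(-10.0, 10.0),
)
```

In the paper's setting, `neighbor` would be a routing move (e.g., reassigning an exchangeable request between carriers) and `cost` the total transportation cost under the time-window constraints.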

Keywords: centralized collaborative transportation, collaborative transportation with pickup and delivery, collaborative transportation with time windows, hybrid algorithm of GA and SA

Procedia PDF Downloads 392
21022 Joint Training Offer Selection and Course Timetabling Problems: Models and Algorithms

Authors: Gianpaolo Ghiani, Emanuela Guerriero, Emanuele Manni, Alessandro Romano

Abstract:

In this article, we deal with a variant of the classical course timetabling problem that has practical applications in many areas of education; in particular, we are interested in high school remedial courses. The purpose of such courses is to provide under-prepared students with the skills necessary to succeed in their studies. A student might be under-prepared in an entire course or only in part of it. The limited availability of funds, as well as the limited amount of time and the limited number of teachers at their disposal, often requires schools to choose which courses and/or which teaching units to activate. Thus, schools need to model the training offer and the related timetabling, with the goal of ensuring the highest possible teaching quality while meeting the above-mentioned financial, time, and resource constraints. Moreover, some prerequisites between the teaching units must be satisfied. We first present a Mixed-Integer Programming (MIP) model that solves this problem to optimality. However, the presence of many peculiar constraints inevitably increases the complexity of the mathematical model, so a general-purpose solver can handle only small instances, while solving real-life-sized instances requires specific techniques or heuristic approaches. For this purpose, we also propose a heuristic approach in which we use a fast constructive procedure to obtain a feasible solution. To assess the exact and heuristic approaches, we perform extensive computational experiments on both real-life instances (obtained from a high school in Lecce, Italy) and randomly generated instances. Our tests show that the MIP model is never solved to optimality, with an average optimality gap of 57%.
The heuristic algorithm, on the other hand, is much faster (in about 50% of the considered instances it converges within approximately half of the time limit) and in many cases improves on the objective function value obtained by the MIP model, with improvements ranging between 18% and 66%.
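A constructive procedure of the kind the heuristic relies on can be sketched as a greedy ratio rule: repeatedly activate the teaching unit with the best quality-per-cost among those whose prerequisites are already active and that still fit the budget. The data model and field names are illustrative assumptions, not the paper's formulation.

```python
def greedy_training_offer(units, budget):
    """Fast constructive heuristic sketch: repeatedly activate the
    eligible teaching unit with the best quality/cost ratio. A unit is
    eligible once all of its prerequisite units are active and it still
    fits within the remaining budget."""
    active, spent = set(), 0.0
    while True:
        eligible = [(name, u) for name, u in units.items()
                    if name not in active
                    and u["prereqs"] <= active
                    and spent + u["cost"] <= budget]
        if not eligible:
            return active
        name, u = max(eligible, key=lambda e: e[1]["quality"] / e[1]["cost"])
        active.add(name)
        spent += u["cost"]

# Hypothetical instance: three teaching units, one prerequisite link.
units = {
    "A": {"cost": 2.0, "quality": 4.0, "prereqs": set()},
    "B": {"cost": 3.0, "quality": 9.0, "prereqs": {"A"}},  # needs A first
    "C": {"cost": 4.0, "quality": 5.0, "prereqs": set()},
}
chosen = greedy_training_offer(units, budget=6.0)
```

Here the rule activates A (unlocking B), then B, and stops when C no longer fits the budget; a full heuristic would follow such a construction with local-search improvement.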

Keywords: heuristic, MIP model, remedial course, school, timetabling

Procedia PDF Downloads 605
21021 Removal of Heavy Metals Using Continuous Mode

Authors: M. Abd elfattah, M. Ossman, Nahla A. Taha

Abstract:

The present work explored the use of Egyptian rice straw, an agricultural waste whose open burning contributes to global warming through the brown cloud, as a potential feedstock for the preparation of activated carbon by physical and chemical activation. The results of this study showed that it is feasible to prepare activated carbons with relatively high surface areas and pore volumes from Egyptian rice straw by direct chemical and physical activation. The activated carbons produced by the two methods (AC1 and AC2) could be used as potential adsorbents for the removal of Fe(III) from aqueous solutions containing heavy metals and from polluted water. The adsorption of Fe(III) depended on the pH of the solution, with the optimal removal efficiency occurring at pH 5. Based on the results, the optimum contact time is 60 minutes and the optimum adsorbent dosage is 3 g/L. The adsorption breakthrough curves obtained at different bed depths indicated an increase in breakthrough time with increasing bed depth, while a rise in the inlet Fe(III) concentration reduces the throughput volume before the packed bed becomes saturated. AC1 showed a higher affinity for Fe(III) compared with the raw material.
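The two column trends reported above (breakthrough time grows with bed depth and shrinks with inlet concentration) are the qualitative content of the classical Bed Depth Service Time (BDST) model, sketched below with illustrative parameter values that are not fitted to the paper's column data.

```python
import math

def bdst_service_time(bed_depth, c0, u, n0, k, cb_over_c0=0.1):
    """Bed Depth Service Time (BDST) model sketch:
        t_b = n0*Z/(c0*u) - ln(c0/cb - 1)/(k*c0)
    with Z the bed depth, c0 the inlet concentration, u the linear
    velocity, n0 the bed adsorption capacity, k the rate constant, and
    cb/c0 the breakthrough fraction."""
    return (n0 * bed_depth / (c0 * u)
            - math.log(1.0 / cb_over_c0 - 1.0) / (k * c0))

# Illustrative values: n0 in mg/L, u in m/h, k in L/(mg.h), c0 in mg/L.
shallow = bdst_service_time(bed_depth=0.10, c0=100.0, u=1.0, n0=5000.0, k=0.01)
deep = bdst_service_time(bed_depth=0.20, c0=100.0, u=1.0, n0=5000.0, k=0.01)
strong = bdst_service_time(bed_depth=0.10, c0=200.0, u=1.0, n0=5000.0, k=0.01)
```

Doubling the bed depth lengthens the service time, while doubling the inlet Fe(III) concentration shortens it, in line with the experimental observations.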

Keywords: rice straw, activated carbon, Fe(III), fixed bed column, pyrolysis

Procedia PDF Downloads 249
21020 A Numerical Investigation of Lamb Wave Damage Diagnosis for Composite Delamination Using Instantaneous Phase

Authors: Haode Huo, Jingjing He, Rui Kang, Xuefei Guan

Abstract:

This paper presents a study of Lamb wave damage diagnosis of composite delamination using instantaneous phase data. Numerical experiments are performed using the finite element method, and delamination damage of different sizes is modeled using the finite element package ABAQUS. Lamb wave excitation and response data are obtained using a pitch-catch configuration. Empirical mode decomposition (EMD) is employed to extract the intrinsic mode functions (IMFs), and the Hilbert-Huang transform is applied to each of the resulting IMFs to obtain the instantaneous phase information. Baseline data for healthy plates are also generated using the same procedure. The size of the delamination is correlated with the instantaneous phase change for damage diagnosis, and it is observed that the unwrapped instantaneous phase shows a consistent behavior with increasing delamination size.
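The instantaneous-phase pipeline (analytic signal, angle, unwrapping) can be sketched without the EMD step on a single-tone stand-in signal; a real analysis would apply it to each IMF. The plain O(N^2) DFT here just keeps the sketch dependency-free, where an implementation would use an FFT.

```python
import cmath
import math

def instantaneous_phase(x):
    """Unwrapped instantaneous phase via the analytic signal: zero the
    negative-frequency half of the DFT, double the positive half, inverse
    transform, then take and unwrap the angle."""
    n = len(x)
    dft = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
           for k in range(n)]
    gain = [1.0] + [2.0] * (n // 2 - 1) + [1.0] + [0.0] * (n - n // 2 - 1)
    z = [sum(dft[k] * gain[k] * cmath.exp(2j * math.pi * k * t / n)
             for k in range(n)) / n for t in range(n)]
    wrapped = [cmath.phase(v) for v in z]
    phase = [wrapped[0]]
    for p in wrapped[1:]:                      # unwrap the 2*pi jumps
        d = p - phase[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))
        phase.append(phase[-1] + d)
    return phase

# Single-tone Lamb-wave stand-in: 8 cycles over 128 samples.
n = 128
signal = [math.cos(2 * math.pi * 8 * t / n) for t in range(n)]
phase = instantaneous_phase(signal)
```

Subtracting the unwrapped phase of a baseline (healthy-plate) record from that of a damaged-plate record yields the phase-change metric that the study correlates with delamination size.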

Keywords: delamination, Lamb wave, finite element method, EMD, instantaneous phase

Procedia PDF Downloads 320