Search results for: video modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2774

1814 Modeling the Downstream Impacts of River Regulation on the Grand Lake Meadows Complex using Delft3D FM Suite

Authors: Jaime Leavitt, Katy Haralampides

Abstract:

Numerical modelling has been used to investigate the long-term impact of a large dam on downstream wetland areas, specifically in terms of changing sediment dynamics in the system. The Mactaquac Generating Station (MQGS) is a 672 MW run-of-the-river hydroelectric facility, commissioned in 1968 on the mainstem of the Wolastoq|Saint John River in New Brunswick, Canada. New Brunswick Power owns and operates the dam and has been working closely with the Canadian Rivers Institute at UNB Fredericton on a multi-year, multi-disciplinary project investigating the impact the dam has on its surrounding environment. Focusing on the downstream river, this research discusses the initialization, set-up, calibration, and preliminary results of a 2-D hydrodynamic model using the Delft3D Flexible Mesh Suite (successor of the Delft3D 4 Suite). The flexible mesh allows the model grid to be structured in the main channel and unstructured in the floodplains and other downstream regions with complex geometry; this combination of grid types reduces computational time while preserving output quality. As the movement of water governs the movement of sediment, the calibrated and validated hydrodynamic model was applied to sediment transport simulations, particularly of the fine suspended sediments. Several provincially significant Protected Natural Areas and federally significant National Wildlife Areas are located 60 km downstream of the MQGS. These broad, low-lying floodplains and wetlands are known as the Grand Lake Meadows Complex (GLM Complex). There is added pressure to investigate the impacts of river regulation on these protected regions, which rely heavily on natural river processes like sediment transport and flooding. It is hypothesized that the fine suspended sediment would naturally travel to the floodplains for nutrient deposition and replenishment, particularly during the freshet and large storms. The purpose of this research is to investigate the impacts of river regulation on downstream environments and to use the model as a tool for informed decision making to protect and maintain biologically productive wetlands and floodplains.
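
To make the transport step concrete, the sketch below integrates a one-dimensional advection-diffusion equation for depth-averaged suspended sediment concentration, a toy analogue of what Delft3D FM solves in two and three dimensions; the velocity, dispersion coefficient, and boundary concentration are illustrative assumptions, not values from the MQGS model.

```python
import numpy as np

# Toy 1-D advection-diffusion of depth-averaged suspended sediment
# concentration released at the dam; explicit upwind scheme.
# All parameter values below are hypothetical, not from the MQGS model.
L_reach = 60_000.0          # reach length to the GLM Complex (m)
nx = 600
dx = L_reach / nx
u = 0.8                     # mean flow velocity (m/s)
D = 5.0                     # longitudinal dispersion coefficient (m^2/s)
dt = 0.4 * min(dx / u, dx**2 / (2 * D))   # stable explicit time step

c = np.zeros(nx)            # suspended sediment concentration (kg/m^3)
c_inflow = 0.05             # boundary concentration at the dam

for _ in range(20_000):
    c_up = np.concatenate(([c_inflow], c[:-1]))           # upwind neighbour
    lap = np.concatenate(([0.0], np.diff(c, 2), [0.0]))   # second difference
    c = c + dt * (-u * (c - c_up) / dx + D * lap / dx**2)

print(f"concentration at 60 km: {c[-1]:.4f} kg/m^3")
```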

Keywords: hydrodynamic modelling, national wildlife area, protected natural area, sediment transport.

Procedia PDF Downloads 6
1813 Multi-Scale Damage Modelling for Microstructure Dependent Short Fiber Reinforced Composite Structure Design

Authors: Joseph Fitoussi, Mohammadali Shirinbayan, Abbas Tcharkhtchi

Abstract:

Due to material flow during processing, short fiber reinforced composite (SFRC) structures obtained by injection or compression molding generally present strong spatial microstructure variation. On the other hand, the quasi-static, dynamic, and fatigue behavior of these materials is highly dependent on microstructure parameters such as the fiber orientation distribution. Indeed, because of complex damage mechanisms, SFRC structural design is a key challenge for safety and reliability. In this paper, we propose a micromechanical model allowing prediction of the damage behavior of real structures as a function of the spatial microstructure distribution. To this aim, a statistical damage criterion including strain rate and fatigue effects at the local scale is introduced into a Mori-Tanaka model. A critical local damage state is identified, allowing fatigue life prediction. Moreover, the multi-scale model is coupled with an experimentally established intrinsic link between damage under monotonic loading and fatigue life in order to build a design chart (abacus) giving the Tsai-Wu failure criterion parameters as a function of microstructure and targeted fatigue life. On the other hand, the micromechanical damage model gives access to the evolution of the anisotropic stiffness tensor of SFRC submitted to complex thermomechanical loading, including quasi-static, dynamic, and cyclic loading with temperature and amplitude variations. The latter is then used to fill out microstructure-dependent material cards in finite element analysis for design optimization in the case of complex loading histories. The proposed methodology is illustrated in the case of a real automotive component made of sheet molding compound (the PSA 3008 tailgate). The obtained results emphasize how the proposed micromechanical methodology opens a new path for the automotive industry to lighten vehicle bodies and thereby save energy and reduce gas emissions.
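
As an illustration of how a Tsai-Wu card might be evaluated once its parameters are read from such a design chart, the following sketch computes the plane-stress Tsai-Wu failure index; all strength values are hypothetical placeholders, not the paper's identified parameters.

```python
import numpy as np

# Minimal sketch of a Tsai-Wu failure check for a ply under plane stress.
# Strength values are hypothetical placeholders; in the paper they would
# come from the microstructure- and fatigue-life-dependent design chart.
Xt, Xc = 120e6, 90e6      # longitudinal tensile/compressive strengths (Pa)
Yt, Yc = 40e6, 60e6       # transverse strengths (Pa)
S = 35e6                  # in-plane shear strength (Pa)

F1,  F2  = 1/Xt - 1/Xc, 1/Yt - 1/Yc
F11, F22 = 1/(Xt*Xc),   1/(Yt*Yc)
F66      = 1/S**2
F12      = -0.5*np.sqrt(F11*F22)   # common default interaction term

def tsai_wu_index(s1, s2, t12):
    """Failure index: values >= 1 indicate predicted ply failure."""
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

print(tsai_wu_index(80e6, 10e6, 15e6))  # ~0.5 -> safe for this load state
```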

Keywords: short fiber reinforced composite, structural design, damage, micromechanical modelling, fatigue, strain rate effect

Procedia PDF Downloads 107
1812 Justyna Skrzyńska, Zdzisław Kobos, Zbigniew Wochyński

Authors: Vahid Bairami Rad

Abstract:

Due to the tremendous progress in computer technology in the last decades, the capabilities of computers have increased enormously, and working with a computer has become a normal activity for nearly everybody. With all the possibilities a computer can offer, humans and their interaction with computers are now a limiting factor. This gave rise to a lot of research in the field of HCI (human-computer interaction) aiming to make interaction easier, more intuitive, and more efficient. To research eye-gaze-based interfaces, it is necessary to understand both sides of the interaction: the human eye and the eye tracker. The first section gives an overview of the anatomy of the eye. The second section addresses accuracy and calibration issues. The subsequent section presents data from a user study where eye movements were recorded while watching a video and while surfing the Internet. Statistics on eye movement during these tasks for several individuals provide typical values and ranges for fixation times and saccade lengths, and are the foundation for the discussions in later sections. The data also reveal typical limitations of eye trackers.
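
A minimal sketch of how such fixation and saccade statistics can be compiled from a raw gaze trace, using a simple velocity-threshold (I-VT) classifier; the 60 Hz sampling rate and 30 deg/s threshold are illustrative assumptions, not the study's recording settings.

```python
import numpy as np

# Velocity-threshold (I-VT) classification: samples whose angular speed
# exceeds the threshold are labelled saccades, the rest fixations.
def classify_gaze(x_deg, y_deg, rate_hz=60.0, vel_thresh=30.0):
    vx = np.gradient(x_deg) * rate_hz          # deg/s
    vy = np.gradient(y_deg) * rate_hz
    speed = np.hypot(vx, vy)
    return np.where(speed > vel_thresh, "saccade", "fixation")

# Example: a synthetic trace that jumps 10 degrees halfway through.
x = np.concatenate([np.full(30, 0.0), np.full(30, 10.0)])
y = np.zeros(60)
labels = classify_gaze(x, y)
print((labels == "saccade").sum(), "saccade samples detected")
```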

Keywords: human computer interaction, gaze tracking, calibration, eye movement

Procedia PDF Downloads 537
1811 Human Posture Estimation Based on Multiple Viewpoints

Authors: Jiahe Liu, Hongyang Yu, Feng Qian, Miao Luo

Abstract:

This study aimed to address the problem of improving the confidence of key points by fusing multi-view information, thereby estimating human posture more accurately. We first obtained multi-view image information and then used the MvP algorithm to fuse this multi-view information to obtain a set of high-confidence human key points. We used these as the input to a Spatio-Temporal Graph Convolutional Network (ST-GCN). The ST-GCN is a deep learning model for processing spatio-temporal data, which can effectively capture spatio-temporal relationships in video sequences. By using the MvP algorithm to fuse multi-view information and inputting it into the spatio-temporal graph convolution model, this study provides an effective method to improve the accuracy of human posture estimation and strong support for further research and application in related fields.
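
The sketch below illustrates only the fusion idea with a simple confidence-weighted average of per-view keypoint estimates; the actual MvP algorithm is a learned multi-view pose model, so this stand-in should not be read as its implementation.

```python
import numpy as np

# Confidence-weighted averaging of per-view 3-D keypoint estimates:
# higher-confidence views dominate the fused joint position.
def fuse_keypoints(views_xyz, confidences):
    """views_xyz: (n_views, n_joints, 3); confidences: (n_views, n_joints)."""
    w = confidences[..., None]                      # broadcast over xyz
    fused = (views_xyz * w).sum(axis=0) / w.sum(axis=0)
    fused_conf = confidences.max(axis=0)            # keep best per-joint score
    return fused, fused_conf

views = np.random.rand(4, 17, 3)        # 4 cameras, 17 COCO-style joints
conf = np.random.rand(4, 17)
joints3d, joint_conf = fuse_keypoints(views, conf)
print(joints3d.shape, joint_conf.shape)  # (17, 3) (17,)
```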

Keywords: multi-view, pose estimation, ST-GCN, joint fusion

Procedia PDF Downloads 70
1810 The Rite of Jihadification in ISIS Modified Video Games: Mass Deception and Dialectic of Religious Regression in Technological Progression

Authors: Venus Torabi

Abstract:

ISIS, the terrorist organization, modified two videogames, ARMA III and Grand Theft Auto V (2013), as means of online recruitment and ideological propaganda. The urge to study the mechanism at work, whether it has been successful or not, drives (Digital) Humanities experts to explore how codes of terror, Islamic ideology, and recruitment strategies are incorporated into the ludic mechanics of videogames. Another aspect of the significance lies in the fact that this is a latent problem that has not been fully addressed in an interdisciplinary framework prior to this study, to the best of the researcher's knowledge. Therefore, due to the complexity of the subject, the present paper draws on game studies and on philosophical and religious poles to form its methodology. As a contextualized epistemology of such exploitation of videogames, the core argument builds on the notion of the "Culture Industry" proposed by Theodor W. Adorno and Max Horkheimer in Dialectic of Enlightenment (2002). This article posits that the ideological underpinnings of ISIS's cause, corroborated by the action-bound mechanics of the videogames, adhere to Islamic Eschatology as a furnishing ground and an excuse for exercising terrorism. It is an account of ISIS's modification of the videogames, a tool of technological progression, to practice online radicalization. Dialectically, this practice is packaged in rhetoric for recognizing a religious myth (the advent of a savior) as a hallmark of regression. The study puts forth that ISIS's wreaking havoc on the world, both in reality and within action videogames, negotiates a process of self-assertion in the players of such videogames (by assuming oneself a member of the terrorists) that leads to self-annihilation. It tries to unfold how ludic modified ("mod") videogames are misused as tools of mass deception towards ethnic cleansing in reality, in line with the distorted Eschatological myth. To conclude, this study posits videogames as a new avenue of mass deception in the framework of the Culture Industry. Yet this emerges as a two-edged sword of mass deception in ISIS's modification of videogames. It shows that ISIS is not only trying to hijack minds through online/ludic recruitment; it also potentially deceives Muslim communities, or those prone to radicalization, into believing that its terrorist practices are preparing the world for the advent of a religious savior based on Islamic Eschatology. This is to claim that the harsh actions of the videogames potentially breed minds with seeds of terrorist propaganda and numb them to violence. The real world becomes an extension of that harsh virtual environment in a ludic/actual continuum, an extension that contributes to the mass deception mechanism of the terrorists in a clandestine trend.

Keywords: culture industry, dialectic, ISIS, Islamic eschatology, mass deception, video games

Procedia PDF Downloads 137
1809 Air Handling Units Power Consumption Using Generalized Additive Model for Anomaly Detection: A Case Study in a Singapore Campus

Authors: Ju Peng Poh, Jun Yu Charles Lee, Jonathan Chew Hoe Khoo

Abstract:

The emergence of digital twin technology, a digital replica of the physical world, has improved real-time access to data from sensors about the performance of buildings. This digital transformation has opened up many opportunities to improve the management of buildings by using the collected data to monitor consumption patterns and energy leakages. One example is the integration of predictive models for anomaly detection. In this paper, we use the Generalised Additive Model (GAM) for anomaly detection in the power consumption pattern of Air Handling Units (AHU). There is ample research on the use of GAMs for predicting power consumption at the office-building and nation-wide levels. However, there is limited illustration of their anomaly detection capabilities, prescriptive analytics case studies, and integration with the latest developments in digital twin technology. In this paper, we applied the general GAM modelling framework to the historical AHU power consumption and cooling load data of a building on an education campus in Singapore, from Jan 2018 to Aug 2019, to train prediction models that, in turn, yield predicted values and ranges. The historical data are seamlessly extracted from the digital twin for modelling purposes. We enhanced the utility of the GAM by using it to power a real-time anomaly detection system based on the forward predicted ranges. The magnitude of deviation from the upper and lower bounds of the uncertainty intervals is used to identify anomalous data points, all based on historical data, without explicit intervention from domain experts. Notwithstanding, the domain expert fits in through an optional feedback loop through which iterative data cleansing is performed. After an anomalously high or low level of power consumption is detected, a set of rule-based conditions is evaluated in real time to help determine the next course of action for the facilities manager. The performance of the GAM is then compared with other approaches to evaluate its effectiveness. Lastly, we discuss the successful deployment of this approach for the detection of anomalous power consumption patterns and illustrate it with real-world use cases.
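
A minimal sketch of the interval-based anomaly flag described above, written with the pygam library on synthetic data; the choice of cooling load and hour-of-day as smooth terms and the 95% interval width are assumptions for illustration, not the deployed model's specification.

```python
import numpy as np
from pygam import LinearGAM, s

# Fit a GAM to (cooling load, hour-of-day) -> AHU power on synthetic data,
# then flag points that fall outside the 95% prediction interval.
rng = np.random.default_rng(0)
hours = rng.integers(0, 24, 2000)
cooling = rng.uniform(50, 400, 2000)
power = 0.6*cooling + 10*np.sin(hours/24*2*np.pi) + rng.normal(0, 5, 2000)

X = np.column_stack([cooling, hours])
gam = LinearGAM(s(0) + s(1)).fit(X, power)

lo, hi = gam.prediction_intervals(X, width=0.95).T
anomalous = (power < lo) | (power > hi)          # outside predicted range
deviation = np.maximum(lo - power, power - hi)   # magnitude of the breach
print(f"{anomalous.mean():.1%} of points flagged")
```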

Keywords: anomaly detection, digital twin, generalised additive model, GAM, power consumption, supervised learning

Procedia PDF Downloads 154
1808 Modelling of the Linear Operator in the Representation of the Wave Function of a Microparticle

Authors: Mohammedi Ferhate

Abstract:

This paper deals with the generalized notion of the wave function of a freely moving microparticle and with the concept of the linear operator in the representation of the Dirac delta function, which generalizes the Kronecker symbol to the case of continuously varying quantities, subject to the orthonormality condition of the eigenfunctions. In connection with the use of linear operators and their eigenfunctions in the solution of given differential equations, it is of interest to study the properties of the operators themselves and to determine which of these follow purely from the nature of the operators, without reference to specific forms of the eigenfunctions. Model simulation examples are also presented.
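
As a concrete instance of the orthonormality condition referred to above, the free-particle plane-wave eigenfunctions satisfy the standard continuous-spectrum relation (a textbook result, not taken from the paper):

```latex
% Continuous-spectrum orthonormality: the Dirac delta replaces the
% Kronecker symbol when the eigenvalue k varies continuously.
\langle \psi_k \mid \psi_{k'} \rangle
  = \int_{-\infty}^{\infty} \psi_k^{*}(x)\,\psi_{k'}(x)\,dx
  = \delta(k - k'),
\qquad\text{with, e.g.,}\quad
\psi_k(x) = \frac{1}{\sqrt{2\pi}}\, e^{ikx}
```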

Keywords: function, operator, simulation, wave

Procedia PDF Downloads 146
1807 On the Problems of Human Concept Learning within Terminological Systems

Authors: Farshad Badie

Abstract:

The central focus of this article is the fact that knowledge is constructed through the interaction between humans' experiences and their conceptions of constructed concepts. The main contributions of this research are a logical characterisation of human inductive learning over constructed concepts within terminological systems, and a logical background for theorising over the Human Concept Learning Problem (HCLP) in terminological systems. This research connects with the topics of 'human learning', 'epistemology', 'cognitive modelling', 'knowledge representation' and 'ontological reasoning'.

Keywords: human concept learning, concept construction, knowledge construction, terminological systems

Procedia PDF Downloads 325
1806 Review on PETG Material Parts Made Using Fused Deposition Modeling

Authors: Dhval Chauhan, Mahesh Chudasama

Abstract:

This study has been undertaken to review Polyethylene Terephthalate Glycol (PETG) material used in Fused Deposition Modelling (FDM). This paper reviews the existing literature on PETG; its objective is to provide guidance on the different process parameters that can be used to improve part strength, as assessed by various tests such as tensile, compressive, and flexural testing. This work aims to identify new paths for the further development of fiber reinforcement in PETG material.

Keywords: PETG, FDM, tensile strength, flexural strength, fiber reinforcement

Procedia PDF Downloads 192
1805 Comparative Analysis of Universal Filtered Multi Carrier and Filtered Orthogonal Frequency Division Multiplexing Systems for Wireless Communications

Authors: Raja Rajeswari K

Abstract:

Orthogonal Frequency Division Multiplexing (OFDM) is a multi-carrier transmission technique that has been used in implementing the majority of wireless applications, including wireless network protocol standards (such as IEEE 802.11a and IEEE 802.11n), telecommunications standards (such as LTE and LTE-Advanced), and digital audio and video broadcast standards. In the latest research and development in the area of orthogonal frequency division multiplexing, Universal Filtered Multi-Carrier (UFMC) and Filtered OFDM (F-OFDM) have attracted a lot of attention for wideband wireless communications. In this paper, UFMC and F-OFDM systems are implemented and a comparative analysis is carried out in terms of M-ary QAM modulation schemes over Dolph-Chebyshev and rectangular window filters, and the Bit Error Rate (BER) is estimated over a Rayleigh fading channel.
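
The sketch below reproduces only the final BER-estimation step for Gray-coded 4-QAM over a flat Rayleigh channel with ideal zero-forcing equalization; the UFMC and F-OFDM filtering chains themselves are omitted, and the Eb/N0 point is an arbitrary choice.

```python
import numpy as np

# Monte Carlo BER of Gray-coded 4-QAM (QPSK) over flat Rayleigh fading
# with perfect channel knowledge and zero-forcing equalization.
rng = np.random.default_rng(1)
n_bits = 200_000
EbN0_dB = 10.0

bits = rng.integers(0, 2, n_bits)
sym = ((1 - 2*bits[0::2]) + 1j*(1 - 2*bits[1::2])) / np.sqrt(2)  # unit energy

h = (rng.normal(size=sym.size) + 1j*rng.normal(size=sym.size)) / np.sqrt(2)
EbN0 = 10 ** (EbN0_dB / 10)
noise_std = np.sqrt(1 / (4 * EbN0))              # per dimension, 2 bits/symbol
n = noise_std * (rng.normal(size=sym.size) + 1j*rng.normal(size=sym.size))

r = (h * sym + n) / h                            # zero-forcing equalizer
bits_hat = np.empty(n_bits, dtype=int)
bits_hat[0::2] = (r.real < 0).astype(int)
bits_hat[1::2] = (r.imag < 0).astype(int)
print(f"BER at Eb/N0={EbN0_dB} dB: {np.mean(bits != bits_hat):.4f}")
```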

Keywords: UFMC, F-OFDM, BER, M-ary QAM

Procedia PDF Downloads 169
1804 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth

Authors: Ella Tyuryumina, Alexey Neznanov

Abstract:

Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research. A number of studies have been carried out in an attempt to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process in breast cancer; however, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a mathematical growth model for the primary tumor and primary metastases, which may help improve the predictive accuracy of breast cancer progression, using an original mathematical model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV that reflects the relations between the primary tumor (PT) and metastases (MTS); 3) analyzing the CoM-IV's scope of application; 4) implementing the model as a software tool. The CoM-IV is based on an exponential tumor growth model, consists of a system of determinate nonlinear and linear equations, and corresponds to the TNM classification. It allows the calculation of different growth periods of the primary tumor and primary metastases: 1) the 'non-visible period' for the primary tumor; 2) the 'non-visible period' for primary metastases; 3) the 'visible period' for primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; 3) is the first predictor that makes its forecast using only current patient data, while the others are based on additional statistical data. Thus, the CoM-IV model and predictive software: a) detect different growth periods of the primary tumor and primary metastases; b) forecast the period of primary metastases appearance; c) have higher average prediction accuracy than other tools; d) can improve forecasts of breast cancer survival and facilitate optimization of diagnostic tests. The following are calculated by CoM-IV: the number of doublings for the 'non-visible' and 'visible' growth periods of primary metastases, and the tumor volume doubling time (days) for the 'non-visible' and 'visible' growth periods of primary metastases. The CoM-IV enables, for the first time, prediction of the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4), relying only on primary tumor sizes. Summarizing: a) CoM-IV correctly describes primary tumor and primary distant metastases growth of stage IV (T1-4N0-3M1) disease, with (N1-3) or without (N0) regional metastases in lymph nodes; b) it facilitates the understanding of the appearance period and manifestation of primary metastases.
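
A worked example of the exponential growth law that underlies models of this kind: since volume scales with the cube of diameter, the number of doublings between two sizes is 3*log2(d2/d1). The doubling time and diameters below are illustrative values, not CoM-IV's fitted parameters.

```python
import numpy as np

# Number of volume doublings between two tumor sizes, and the duration of
# the "non-visible period" for an assumed constant doubling time.
def doublings(d_start_mm, d_end_mm):
    # volume ~ diameter cubed, so N = log2(V2/V1) = 3 * log2(d2/d1)
    return 3 * np.log2(d_end_mm / d_start_mm)

DT_days = 100.0                      # assumed volume doubling time
d_cell, d_visible = 0.01, 10.0       # single cell (~10 um) to a 10 mm mass
n = doublings(d_cell, d_visible)
print(f"{n:.1f} doublings ~= {n * DT_days / 365:.1f} years non-visible")
```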

Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival

Procedia PDF Downloads 335
1803 Development of Market Penetration for High Energy Efficiency Technologies in Alberta’s Residential Sector

Authors: Saeidreza Radpour, Md. Alam Mondal, Amit Kumar

Abstract:

Market penetration of high energy efficiency technologies has a key impact on energy consumption and GHG mitigation. It is also useful for managing the policies formulated by public or private organizations to achieve energy or environmental targets. Energy intensity in the residential sector of Alberta was 148.8 GJ per household in 2012, which is 39% more than the Canadian average of 106.6 GJ and the highest among the provinces in per-household energy consumption. Energy intensity by appliances in Alberta was 15.3 GJ per household in 2012, which is 14% higher than the average of the other provinces and territories in Canada. In this research, a framework has been developed to analyze the market penetration and market share of high energy efficiency technologies in the residential sector. The overall methodology was based on developing data-intensive models to estimate the market penetration of appliances in the residential sector over a time period. The developed models were a function of a number of macroeconomic and technical parameters. The mathematical equations were developed based on twenty-two years of historical data (1990-2011), and the models were analyzed through a series of statistical tests. The market shares of high efficiency appliances were estimated based on related variables such as capital and operating costs, discount rate, appliance lifetime, annual interest rate, incentives, and maximum achievable efficiency over the period 2015 to 2050. Results show that the market penetration of refrigerators is higher than that of other appliances. The stock of refrigerators per household is anticipated to increase from 1.28 in 2012 to 1.314 in 2030 and 1.328 in 2050. Modelling results show that the market penetration rate of stand-alone freezers will decrease between 2012 and 2050: freezer stock per household will decline from 0.634 in 2012 to 0.556 in 2030 and 0.515 in 2050. The stock of dishwashers per household is expected to increase from 0.761 in 2012 to 0.865 in 2030 and 0.960 in 2050. The increase in the market penetration rates of clothes washers and clothes dryers is nearly parallel: the stocks of clothes washers and clothes dryers per household are expected to rise from 0.893 and 0.979 in 2012 to 0.960 and 1.0 in 2050, respectively. This presentation will include a detailed discussion of the modelling methodology and results.
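
The abstract does not give the functional form of the market-share calculation; the sketch below shows one standard possibility, a logit share model over annualized life-cycle costs, with hypothetical appliance costs and an assumed cost-sensitivity parameter.

```python
import numpy as np

# Annualize capital cost with a capital recovery factor, then allocate
# market shares with a logit function of relative life-cycle cost.
def annualized_cost(capital, operating, rate, life_yrs):
    crf = rate * (1 + rate)**life_yrs / ((1 + rate)**life_yrs - 1)
    return capital * crf + operating

def logit_shares(costs, beta=5.0):
    w = np.exp(-beta * np.asarray(costs) / np.min(costs))
    return w / w.sum()

# Standard vs. ENERGY STAR refrigerator (capital $, operating $/yr);
# all numbers are hypothetical placeholders.
costs = [annualized_cost(900, 95, 0.05, 15),     # standard unit
         annualized_cost(1100, 60, 0.05, 15)]    # high-efficiency unit
print(dict(zip(["standard", "energy_star"], logit_shares(costs).round(3))))
```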

Keywords: appliances efficiency improvement, energy star, market penetration, residential sector

Procedia PDF Downloads 285
1802 Impact Location From Instrumented Mouthguard Kinematic Data In Rugby

Authors: Jazim Sohail, Filipe Teixeira-Dias

Abstract:

Mild traumatic brain injury (mTBI) within non-helmeted contact sports is a growing concern due to the serious risk of potential injury. Extensive research is being conducted into head kinematics in non-helmeted contact sports, utilizing instrumented mouthguards that allow researchers to record accelerations and velocities of the head during and after an impact. This does not, however, allow the location of the impact on the head, or its magnitude and orientation, to be determined. This research proposes and validates two methods to quantify impact locations from instrumented mouthguard kinematic data, one using rigid body dynamics, the other utilizing machine learning. The rigid body dynamics technique focuses on establishing and matching moments from Euler's and torque equations in order to find the impact location on the head. The methodology is validated with impact data collected from a lab test with a dummy head fitted with an instrumented mouthguard. Additionally, a Hybrid III dummy head finite element model was utilized to create synthetic kinematic data sets for impacts from varying locations to validate the impact location algorithm. The algorithm calculates accurate impact locations; however, it requires preprocessing of live data, which is currently done by cross-referencing data timestamps to video footage. The machine learning technique aims to eliminate this preprocessing step by establishing trends within time-series signals from instrumented mouthguards to determine the impact location on the head. An unsupervised learning technique is used to cluster together impacts within similar regions from an entire time-series signal. The kinematic signals from the mouthguards are converted to the frequency domain before a clustering algorithm groups similar signals within a time series that may span the length of a game. Impacts are clustered within predetermined location bins. The same Hybrid III dummy finite element model is used to create impacts that closely replicate on-field impacts, in order to create synthetic time-series datasets consisting of impacts in varying locations. These time-series data sets are used to validate the machine learning technique. The rigid body dynamics technique provides a good method to establish the accurate impact location of impact signals that have already been labeled as true impacts and filtered out of the entire time series. The machine learning technique, by contrast, can be implemented on long time-series signal data, providing impact locations within predetermined regions on the head. Additionally, the machine learning technique can be used to eliminate false impacts captured by sensors, saving additional time for data scientists using instrumented mouthguard kinematic data, as validating true impacts against video footage would no longer be required.
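
A sketch of the rigid-body step under simplifying assumptions: the torque about the head's centre of mass is recovered from Euler's equation, and the minimum-norm lever arm r satisfying r x F = tau is returned. The inertia values approximate a generic adult head, and the impact force is treated as known here, which the paper does not assume.

```python
import numpy as np

# Euler's equation: tau = I*alpha + omega x (I*omega), then the
# minimum-norm solution of r x F = tau is r = (F x tau) / |F|^2
# (unique up to a component along F, which r x F cannot constrain).
I = np.diag([0.0180, 0.0236, 0.0212])    # head inertia tensor (kg m^2), generic

def impact_lever_arm(alpha, omega, F):
    tau = I @ alpha + np.cross(omega, I @ omega)   # torque about the CoM
    return np.cross(F, tau) / np.dot(F, F)         # min-norm lever arm (m)

alpha = np.array([1500.0, -300.0, 200.0])   # rad/s^2 from the mouthguard
omega = np.array([10.0, 2.0, -1.0])         # rad/s
F = np.array([0.0, -2000.0, 0.0])           # assumed-known impact force (N)
print(impact_lever_arm(alpha, omega, F))
```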

Keywords: head impacts, impact location, instrumented mouthguard, machine learning, mTBI

Procedia PDF Downloads 217
1801 PID Control of Quad-Rotor Unmanned Vehicle Based on Lagrange Approach Modelling

Authors: A. Benbouali, H. Saidi, A. Derrouazin, T. Bessaad

Abstract:

Aerial robotics is a very exciting research field dealing with a variety of subjects, including attitude control. This paper deals with the control of a four-rotor vertical take-off and landing (VTOL) unmanned aerial vehicle. It presents a mathematical model based on the Lagrange approach for the flight control of an autonomous quad-rotor, and describes the controller architecture, which is based on PID regulators. The control method has been simulated in closed loop in different situations, and all the calculation stages and simulation results are detailed.
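
A minimal sketch of the discrete PID regulator used on each attitude channel, closed around a generic unit-inertia double integrator rather than the full Lagrangian quad-rotor model; the gains are illustrative, not the paper's tuned values.

```python
# Discrete PID regulator for one attitude channel (e.g. roll).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt, angle, rate = 0.01, 0.0, 0.0
ctrl = PID(kp=6.0, ki=1.0, kd=2.5, dt=dt)
for _ in range(500):                      # 5 s of simulated roll response
    torque = ctrl.update(0.3, angle)      # step command: 0.3 rad roll
    rate += torque * dt                   # unit-inertia double integrator
    angle += rate * dt
print(f"roll after 5 s: {angle:.3f} rad")
```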

Keywords: quad-rotor, Lagrange approach, proportional integral derivative (PID) controller, Matlab/Simulink

Procedia PDF Downloads 400
1800 The Effects of Online Video Gaming on Creativity

Authors: Chloe Shu-Hua Yeh

Abstract:

The effect of videogame play on players' cognitive abilities has become a growing research field in recent decades; however, little is known about how 'out-of-school' use of videogames influences creativity. This interdisciplinary research explores the cognitive and emotional effects of two different types of online videogames (an action videogame and a non-action videogame) on subsequent creativity performance, using a within-participant design with 36 participants. Results showed that after playing the action game, participants exhibited higher originality, elaboration, and flexibility than after playing the casual game. The results also explored the effects of the emotional states elicited while playing the games, suggesting that arousal may be a significant emotional factor influencing subsequent creativity performance. The cognitive and emotional effects of videogames are discussed, followed by implications for emotion-creativity-videogame play research, game designers, educational practitioners, and parents.

Keywords: attentional breadth, creativity, emotion, videogame play

Procedia PDF Downloads 531
1799 Audio-Visual Co-Data Processing Pipeline

Authors: Rita Chattopadhyay, Vivek Anand Thoutam

Abstract:

Speech is the most acceptable means of communication, allowing us to quickly exchange our feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type commands to computers, and, in the same way, easier to listen to audio played from a device than to extract output from computers or devices. Especially with robotics being an emerging market with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is emerging as a lucrative feature for robot manufacturers. Considering this factor, the objective of this paper is to design the "Audio-Visual Co-Data Processing Pipeline". This pipeline integrates automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. There are many deep learning models for each of these modules, but OpenVINO Model Zoo models are used because the OpenVINO toolkit covers both computer vision and non-computer vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input that specifies the target objects to be detected and the start and end times of the interval to extract from the video. Speech is converted to text using the QuartzNet automatic speech recognition model. A summary is extracted from the text using the natural language model Generative Pre-Trained Transformer-3 (GPT-3). Based on the summary, the relevant frames are extracted from the video, and the You Only Look Once (YOLO) object detection model detects objects in these extracted frames. Frame numbers that contain target objects (the objects specified in the speech command) are saved as text. Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. This project is developed for the 80 YOLO labels, and the user can extract frames based on one or two target labels; the pipeline can easily be extended to more than two target labels by making appropriate changes in the object detection module. The project supports four different speech command formats by including sample examples in the prompt used by the GPT-3 model; based on user preference, a new speech command format can be added by including examples of that format in the GPT-3 prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded with this pipeline so that speech commands can be given and the output played from the device.
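
An orchestration skeleton of the described pipeline is sketched below; each stage function is a placeholder for the named model (QuartzNet ASR, GPT-3 summarisation, YOLO detection, text-to-speech), and every interface shown is hypothetical rather than a real OpenVINO API.

```python
# Stage placeholders: bodies omitted, each would wrap the corresponding
# model named in the abstract. All signatures here are hypothetical.
def transcribe(audio_path: str) -> str: ...          # QuartzNet ASR stage
def summarize(command_text: str) -> dict: ...        # GPT-3: targets + time span
def detect(frame) -> list[str]: ...                  # YOLO labels per frame
def speak(text: str) -> None: ...                    # text-to-speech stage

def run_pipeline(audio_path: str, video_frames: list) -> None:
    command = transcribe(audio_path)
    spec = summarize(command)   # e.g. {"targets": ["dog"], "start": 3, "end": 9}
    hits = [i for i, frame in enumerate(video_frames[spec["start"]:spec["end"]],
                                        start=spec["start"])
            if set(spec["targets"]) & set(detect(frame))]
    speak(f"Target objects found in frames: {hits}")
```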

Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech

Procedia PDF Downloads 80
1798 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials

Authors: Carlo Bianchini, Lorenzo Catena

Abstract:

The appearance of 3D scanners and then, more recently, of image-based systems that generate point clouds directly from common digital images has deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low-cost and mobile systems are increasingly playing a key role and are actually paving the way to the democratization of what in the past was the realm of a few specialized technicians and expensive equipment. The application of Google Tango to the ancient church of Santa Maria delle Vigne in Pratica di Mare, Rome, presented in this paper is one of these examples.

Keywords: the architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing

Procedia PDF Downloads 148
1797 A Generalisation of Pearson's Curve System and Explicit Representation of the Associated Density Function

Authors: S. B. Provost, Hossein Zareamoghaddam

Abstract:

A univariate density approximation technique whereby the derivative of the logarithm of a density function is assumed to be expressible as a rational function is introduced. This approach, which extends Pearson's curve system, is based solely on the moments of a distribution up to a determinable order. Upon solving a system of linear equations, the coefficients of the polynomial ratio can readily be identified. An explicit solution to the integral representation of the resulting density approximant is then obtained. It will be explained that, when utilised in conjunction with sample moments, this methodology lends itself to the modelling of 'big data'. Applications to sets of univariate and bivariate observations will be presented.
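
The defining relation can be written out as follows (notation reconstructed from the description above, not quoted from the paper):

```latex
% Extended Pearson system: the log-derivative of the density is a ratio
% of polynomials (Pearson's classical system is the special case of a
% linear numerator over a quadratic denominator).
\frac{d}{dx}\ln f(x) \;=\; \frac{f'(x)}{f(x)}
  \;=\; \frac{\sum_{i=0}^{p} a_i x^{i}}{\sum_{j=0}^{q} b_j x^{j}}

% Multiplying through by the denominator and integrating x^k f'(x) by
% parts converts this into linear equations in the coefficients (a_i, b_j)
% whose known quantities are the moments m_k = E[X^k], up to a
% determinable order.
```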

Keywords: density estimation, log-density, moments, Pearson's curve system

Procedia PDF Downloads 281
1796 Characterization and Modelling of Groundwater Flow towards a Public Drinking Water Well Field: A Case Study of Ter Kamerenbos Well Field

Authors: Buruk Kitachew Wossenyeleh

Abstract:

Groundwater is the largest freshwater reservoir in the world. Like the other reservoirs of the hydrologic cycle, it is a finite resource. This study focused on the groundwater modeling of the Ter Kamerenbos well field to understand the groundwater flow system and the impact of different scenarios. The study area covers 68.9 km² in the Brussels Capital Region and is situated in two river catchments, i.e., the Zenne River and the Woluwe Stream. The aquifer system has three layers, but in the modeling they are considered as one layer due to their hydrogeological properties. The catchment aquifer system is replenished by direct recharge from rainfall. The groundwater recharge of the catchment is determined using the spatially distributed water balance model WetSpass and varies annually from zero to 340 mm. This groundwater recharge is used as the top boundary condition for the groundwater modeling of the study area. In the groundwater modeling using Processing MODFLOW, constant-head boundary conditions are used at the north and south boundaries of the study area, and head-dependent flow boundary conditions at the east and west boundaries. The groundwater model is calibrated manually and automatically using observed hydraulic heads in 12 observation wells. The model performance evaluation showed that the root mean square error is 1.89 m and the NSE is 0.98. The head contour map of the simulated hydraulic heads indicates the flow direction in the catchment, mainly from the Woluwe catchment to the Zenne catchment. The simulated head in the study area varies from 13 m to 78 m. The higher hydraulic heads are found in the southwest of the study area, which has forest as a land-use type. This calibrated model was run for a climate change scenario and a well operation scenario. Climate change may cause the groundwater recharge to increase by 43% or decrease by 30% in 2100 relative to current conditions, for the high and low climate change scenarios, respectively. The groundwater head varies from 13 m to 82 m for the high climate change scenario, whereas for the low climate change scenario it varies from 13 m to 76 m. If a doubling of the pumping discharge is assumed, the groundwater head varies from 13 m to 76.5 m; if a shutdown of the pumps is assumed, the head varies in the range of 13 m to 79 m. It is concluded that the groundwater model was developed satisfactorily, with some limitations, and that the model output can be used to understand the aquifer system under steady-state conditions. Finally, some recommendations are made for the future use and improvement of the model.
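
For reference, the two reported performance scores are computed as below; the head values shown are synthetic stand-ins for the real observed and simulated heads at the twelve wells.

```python
import numpy as np

# Root mean square error and Nash-Sutcliffe efficiency, the two metrics
# used to evaluate the calibrated MODFLOW model above.
def rmse(obs, sim):
    return np.sqrt(np.mean((np.asarray(obs) - np.asarray(sim)) ** 2))

def nse(obs, sim):
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([15.2, 22.8, 34.1, 41.0, 48.5, 55.3,
                60.2, 63.9, 67.4, 70.8, 74.5, 77.6])   # synthetic heads (m)
sim = obs + np.random.default_rng(2).normal(0, 1.9, obs.size)
print(f"RMSE = {rmse(obs, sim):.2f} m, NSE = {nse(obs, sim):.2f}")
```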

Keywords: Ter Kamerenbos, groundwater modelling, WetSpass, climate change, well operation

Procedia PDF Downloads 152
1795 Evaluation of Traditional Methods in Construction and Their Effects on Reinforced-Concrete Buildings Behavior

Authors: E. H. N. Gashti, M. Zarrini, M. Irannezhad, J. R. Langroudi

Abstract:

Using ETABS software, this study analyzed 23 buildings to evaluate the effects of mistakes made during the construction phase on the buildings' structural behavior. For the modelling, two different loadings were assumed: 1) the design loading and 2) the loading resulting from mistakes in the construction phase. The results showed that traditional construction methods led to a significant increase in dead loads and consequently intensified the displacements and base shears of the buildings under seismic loads.

Keywords: reinforced-concrete buildings, construction mistakes, base-shear, displacements, failure

Procedia PDF Downloads 270
1794 Gamification of a Business Intelligence Tool

Authors: Stephen Miller

Abstract:

The act of applying game mechanics and dynamics (which have traditionally been used in video games) to business applications is being widely trialed in an effort to make conventional business software more participative, fun, and engaging. This new trend, named 'gamification', has its believers and, of course, its critics, who still need convincing that the concept is an effective and beneficial business tool worthy of investment. The literature reveals that user engagement with business intelligence (BI) tools is much lower than expected, and investors are failing to get a good return on their investment (ROI). A software prototype will therefore be designed and developed to add gamification to a BI tool, in order to determine its effect on the user engagement levels of test participants. The experimental study will be evaluated using the comprehensive User Engagement Scale (UES) to see if there are improvements in areas such as aesthetics, perceived usability, endurability, novelty, felt involvement, and focused attention. The results of this study should demonstrate whether or not 'gamifying' a BI tool has the potential to increase an individual's motivation to use BI software more often.

Keywords: business intelligence, gamification, human computer interaction, user engagement

Procedia PDF Downloads 585
1793 Prediction of Soil Liquefaction by Using UBC3D-PLM Model in PLAXIS

Authors: A. Daftari, W. Kudla

Abstract:

Liquefaction is a phenomenon in which the strength and stiffness of a soil are reduced by earthquake shaking or other rapid cyclic loading. Liquefaction and related phenomena have been responsible for huge amounts of damage in historical earthquakes around the world. Modelling of soil behaviour is the main step in the soil liquefaction prediction process. Nowadays, several constitutive models for sand have been presented; nevertheless, only some of them can capture this mechanism. One of the most useful models in this regard is the UBCSAND model. In this research, the capability of this model is examined using the PLAXIS software and real data from the 1987 Superstition Hills earthquake in the Imperial Valley. The simulation results of the UBC3D-PLM model show a trend consistent with the observed behaviour.

Keywords: liquefaction, PLAXIS, pore-water pressure, UBC3D-PLM

Procedia PDF Downloads 310
1792 Towards a Computational Model of Consciousness: Global Abstraction Workspace

Authors: Halim Djerroud, Arab Ali Cherif

Abstract:

We assume that conscious functions are implemented automatically; in other words, that consciousness, as well as the non-conscious aspects of human thought, planning, and perception, is produced by biologically adaptive algorithms. We propose that the mechanisms of consciousness can be produced using adaptive algorithms similar to those executed by these mechanisms. In this paper, we propose a computational model of consciousness, the "Global Abstraction Workspace", which is an internal environment-modelling system conceived as a multi-agent system. This system is able to evolve and generate new data and processes, as well as actions, in the environment.

Keywords: artificial consciousness, cognitive architecture, global abstraction workspace, multi-agent system

Procedia PDF Downloads 340
1791 Kinetic Modelling of Drying Process of Jumbo Squid (Dosidicus Gigas) Slices Subjected to an Osmotic Pretreatment under High Pressure

Authors: Mario Perez-Won, Roberto Lemus-Mondaca, Constanza Olivares-Rivera, Fernanda Marin-Monardez

Abstract:

This research presents the simultaneous application of high hydrostatic pressure (HHP) and osmotic dehydration (DO) as a pretreatment to the hot-air drying of jumbo squid (Dosidicus gigas) cubes. The drying time was reduced to 2 hours at 60°C and 5 hours at 40°C compared to the untreated jumbo squid samples. This reduction was due to the osmotic pressure under high-pressure treatment, where increased salt saturation caused greater water loss; thus, the convective drying time was shortened, and effective water diffusion plays an important role in this research. Different working conditions, such as pressure (350-550 MPa), pressure time (5-10 min), salt (NaCl) concentration (10 and 15%), and drying temperature (40-60°C), were optimized according to the kinetic parameters of each mathematical model. The models fitted to the experimental drying curves were the Weibull, Page, and Logarithmic models; the last of these gave the best fit to the experimental data. The values of effective water diffusivity varied from 4.82 to 6.59x10-9 m2/s for the 16 (DO+HHP) curves, whereas the control samples yielded values of 1.76 and 5.16x10-9 m2/s for 40 and 60°C, respectively. On the other hand, quality characteristics such as color, texture, non-enzymatic browning, water holding capacity (WHC), and rehydration capacity (RC) were assessed. The L* (lightness) color parameter increased, whereas the b* (yellowish) and a* (reddish) parameters decreased for the DO+HHP treated samples, indicating that the treatment prevents sample browning. The texture parameters hardness and elasticity decreased, but chewiness increased with treatment, which resulted in a product with higher tenderness and less firmness compared to the untreated sample. Finally, the WHC and RC values of most treatments increased, owing to less damage to the cellular tissue compared to untreated samples. Therefore, knowledge of the drying kinetics as well as the quality characteristics of dried jumbo squid samples subjected to a pretreatment of osmotic dehydration under high hydrostatic pressure is extremely important at an industrial level, so that the drying process can be successful under different pretreatment conditions and/or process variables.
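
A sketch of how such a fit and diffusivity estimate can be carried out, using the logarithmic model reported as the best fit and the first-term Fick slab solution; the data points and the 10 mm half-thickness are illustrative, not the study's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit the logarithmic thin-layer model MR = a*exp(-k t) + c to a drying
# curve, then estimate effective diffusivity from the slope of ln(MR)
# versus time for the first-term infinite-slab Fick solution.
def log_model(t, a, k, c):
    return a * np.exp(-k * t) + c

t_h = np.array([0, 0.5, 1, 1.5, 2, 3, 4, 5])             # hours (synthetic)
MR = np.array([1.0, 0.72, 0.53, 0.40, 0.30, 0.18, 0.11, 0.07])

(a, k, c), _ = curve_fit(log_model, t_h, MR, p0=(1.0, 0.5, 0.0))
print(f"fitted: a={a:.3f}, k={k:.3f} 1/h, c={c:.3f}")

# Fick slab solution (first term): MR ~ (8/pi^2) exp(-pi^2 D t / (4 L^2))
L = 0.010                                                 # half-thickness (m)
slope = np.polyfit(t_h[1:] * 3600, np.log(MR[1:]), 1)[0]  # 1/s
D_eff = -slope * 4 * L**2 / np.pi**2
print(f"D_eff ~ {D_eff:.2e} m^2/s")                       # ~5e-9, same order
```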

Keywords: diffusion coefficient, drying process, high pressure, jumbo squid, modelling, quality aspects

Procedia PDF Downloads 245
1790 Identification of Classes of Bilinear Time Series Models

Authors: Anthony Usoro

Abstract:

In this paper, two classes of bilinear time series models are obtained under certain conditions from the general bilinear autoregressive moving average model: the Bilinear Autoregressive (BAR) and Bilinear Moving Average (BMA) models. From the general bilinear model, the BAR and BMA models have been proved to exist for q = Q = 0 (hence j = 0) and p = P = 0 (hence i = 0), respectively. These models are found useful in modelling many kinds of economic and financial data.
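
One common parameterization of the general bilinear model, with the abstract's reduction conditions restated (the notation is reconstructed and may differ from the paper's):

```latex
% General bilinear ARMA model, BL(p, q, P, Q):
X_t \;=\; e_t \;+\; \sum_{i=1}^{p} \phi_i X_{t-i}
      \;+\; \sum_{j=1}^{q} \theta_j e_{t-j}
      \;+\; \sum_{i=1}^{P} \sum_{j=1}^{Q} \beta_{ij}\, X_{t-i}\, e_{t-j}

% Per the abstract: imposing q = Q = 0 (so the index j collapses to 0)
% yields the bilinear autoregressive (BAR) class, while imposing
% p = P = 0 (so i collapses to 0) yields the bilinear moving average
% (BMA) class.
```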

Keywords: autoregressive model, bilinear autoregressive model, bilinear moving average model, moving average model

Procedia PDF Downloads 407
1789 Screen Casting Instead of Illegible Scribbles: Making a Mini Movie for Feedback on Students’ Scholarly Papers

Authors: Kerri Alderson

Abstract:

There is pervasive awareness among post-secondary faculty that written feedback on course assignments is inconsistently reviewed by students. In order to support student success and growth, a novel method of providing feedback was sought, and screencasting - short, narrated "movies" of audiovisual instructor feedback on students' scholarly papers - was provided as an alternative to traditional means. An overview of the teaching and learning experience as well as the user-friendly software utilized will be presented. This study gives an overview of this more direct, student-centered medium for providing feedback using technology familiar to post-secondary students. Reminiscent of direct personal contact, the personalized video feedback is positively evaluated by students as a formative medium for student growth in scholarly writing.

Keywords: education, pedagogy, screen casting, student feedback, teaching and learning

Procedia PDF Downloads 119
1788 Use of Extended Conversation to Boost Vocabulary Knowledge and Soft Skills in English for Employment Classes

Authors: James G. Matthew, Seonmin Huh, Frank X. Bennett

Abstract:

English for Specific Purposes (ESP) aims to equip learners with the necessary English language skills. Many ESP programs address language skills for job performance, including reading job-related documents and oral proficiency. Within ESP is English for Occupational Purposes (EOP), which centers on developing communicative competence for the globalized workplace. Many ESP and EOP courses lack the content needed to help students progress at work, resulting in the need to create lexical compilations for different professions. It is important to teach communicative competence and soft skills for real job-related problem situations and to address the complexities of the real world in order to help students succeed in their professions. ESP and EOP research is therefore trying to balance profession-specific educational content with international multi-disciplinary language skills for the globalized workforce. The current study builds upon the existing discussion by developing pedagogy to assist students in their careers through a strong practical command of relevant English vocabulary. Our research question focuses on the pedagogy two professors incorporated in their English for employment courses. The current study is a qualitative case study of the modes of teaching delivery for EOP in South Korea. Two foreign professors teaching at two different universities in South Korea volunteered for the study to explore their teaching practices. Both professors' curricula included employment-related concept vocabulary, business presentations, CV/resume and cover letter preparation, and job interview preparation. All the pre-made recorded video lectures, live online class sessions with students, teachers' lesson plans, teachers' class materials, students' assignments, and midterm and final video conferences were collected for data analysis. The study then focused on unpacking representative patterns in the professors' teaching methods. The professors used their strengths as native speakers to extend the class discussion from narrow, restricted conversations towards broader opportunities for students to practice authentic English conversation. The teaching method extended the conversation in three main steps. First, students were taught concept vocabulary. Second, the vocabulary was combined in speaking activities in which students had to solve scenarios and were required to expand on the given word forms and language expressions. Last, the students held conversations in English using the language learnt. The conversations observed in both classes were those of authentic, expanded English communication, and this way of expanding concept vocabulary lessons into extended conversation is one representative pedagogical approach that both professors took. Extended English conversation is therefore crucial for EOP education.

Keywords: concept vocabulary, English as a foreign language, English for employment, extended conversation

Procedia PDF Downloads 92
1787 A Block World Problem Based Sudoku Solver

Authors: Luciana Abednego, Cecilia Nugraheni

Abstract:

There are many approaches proposed for solving Sudoku puzzles. One of them is modelling the puzzles as block world problems. There have been three models for Sudoku solvers based on this approach, each of which expresses the Sudoku solver as a parameterized multi-agent system. In this work, we propose a new model which improves on the existing models. This paper presents the development of a Sudoku solver that implements all the proposed models, and some experiments have been conducted to determine the performance of each model.
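
For readers unfamiliar with the task, a compact baseline solver is sketched below; note that this is plain backtracking, not the block-world or parameterized multi-agent models the paper proposes, and is included only to make the puzzle-solving task concrete.

```python
# Plain backtracking Sudoku solver; grid is a 9x9 list of lists, 0 = empty.
def solve(grid):
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0       # undo and try the next value
                return False                 # no value fits: backtrack
    return True                              # no empty cells left: solved

def valid(grid, r, c, v):
    if v in grid[r] or any(row[c] == v for row in grid):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)      # top-left of the 3x3 block
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))
```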

Keywords: Sudoku puzzle, Sudoku solver, block world problem, parameterized multi agent systems

Procedia PDF Downloads 341
1786 Measuring the Embodied Energy of Construction Materials and Their Associated Cost Through Building Information Modelling

Authors: Ahmad Odeh, Ahmad Jrade

Abstract:

Energy assessment is an evidently significant factor when evaluating the sustainability of structures, especially at the early design stage. Today's design practices revolve around the selection of materials that reduce operational energy yet meet disciplinary needs. Operational energy represents a substantial part of a building's lifecycle energy usage, but the fact remains that embodied energy is an important aspect that goes unaccounted for in the carbon footprint. At the moment, little or no consideration is given to embodied energy, mainly due to the complexity of its calculation and the various factors involved. The equipment used, the fuel needed, and the electricity required for each material vary with location, and thus the embodied energy will differ for each project. Moreover, the method and technique used in manufacturing, transporting, and putting materials in place have a significant influence on their embodied energy. This has made it difficult to calculate or even benchmark the usage of such energies. This paper presents a model aimed at helping designers select construction materials based on their embodied energy. Moreover, it presents a systematic approach that uses an efficient method of calculation and ultimately provides new insight into construction material selection. The model is developed in a BIM environment, targeting the quantification of embodied energy for construction materials through the three main stages of their life: manufacturing, transportation, and placement. The model contains three major databases, each of which covers a set of the most commonly used construction materials. The first dataset holds information about the energy required to manufacture each type of material, the second includes information about the energy required for transporting the materials, and the third stores information about the energy required by the tools and cranes needed to place an item in its intended location. The model provides designers with sets of all available construction materials and their associated embodied energies to use for selection during the design process. Through geospatial data and dimensional material analysis, the model can also automatically calculate the distance between the factories and the construction site. To remain within the sustainability criteria set by LEED, a final database is created and used to calculate the overall construction cost based on RSMeans cost data, and then to automatically recalculate the costs for any modifications. Design criteria including both operational and embodied energies will lead designers to re-evaluate the current material selection for cost, energy, and, most importantly, sustainability.
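
A sketch of the three-stage tally the model performs per material, drawing on the three databases described above; the field names and energy coefficients are hypothetical placeholders, not values from the model's databases.

```python
# Three-stage embodied-energy tally per material: manufacturing,
# transportation, and placement. All coefficients are hypothetical.
MANUFACTURING = {"concrete": 0.95, "steel": 20.1}   # MJ per kg
TRANSPORT = {"truck": 0.0025}                       # MJ per kg per km
PLACEMENT = {"crane": 0.10, "pump": 0.05}           # MJ per kg

def embodied_energy(material, mass_kg, dist_km, mode, equipment):
    manu = MANUFACTURING[material] * mass_kg
    trans = TRANSPORT[mode] * mass_kg * dist_km   # distance from geospatial data
    place = PLACEMENT[equipment] * mass_kg
    return manu + trans + place                   # MJ over the three stages

# 10 t of ready-mix concrete hauled 40 km and placed by pump:
print(f"{embodied_energy('concrete', 10_000, 40, 'truck', 'pump'):,.0f} MJ")
```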

Keywords: building information modelling, energy, life cycle analysis, sustainability

Procedia PDF Downloads 269
1785 Using Gaussian Process in Wind Power Forecasting

Authors: Hacene Benkhoula, Mohamed Badreddine Benabdella, Hamid Bouzeboudja, Abderrahmane Asraoui

Abstract:

Wind is a random variable that is difficult to master; for this reason, we developed mathematical and statistical methods to model and forecast wind power. Gaussian Processes (GP) are one of the most widely used families of stochastic processes for modeling dependent data observed over time, over space, or over both time and space. A GP describes an underlying process that need not be known explicitly in order to solve a problem. The purpose of this paper is to present how to forecast wind power by using a GP. The Gaussian process forecasting method is presented, and to validate the presented approach, a simulation in the MATLAB environment is given.
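
A minimal sketch of GP-based forecasting on a synthetic power series; scikit-learn is substituted here for the paper's MATLAB implementation, and the kernel choice and data are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Train a GP on past hourly power output and forecast the next 24 hours
# with an uncertainty band; the series below stands in for real farm data.
rng = np.random.default_rng(3)
t = np.arange(200, dtype=float)[:, None]                    # hours
power = 50 + 20*np.sin(t[:, 0]/15) + rng.normal(0, 3, 200)  # MW, synthetic

kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=5.0)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, power)

t_future = np.arange(200, 224, dtype=float)[:, None]        # next 24 h
mean, std = gp.predict(t_future, return_std=True)
print(f"t+1h forecast: {mean[0]:.1f} MW +/- {1.96*std[0]:.1f}")
```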

Keywords: wind power, Gaussian process, modelling, forecasting

Procedia PDF Downloads 418