Search results for: data mining applications and discovery
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30900

26190 Investigation of EuGaN's Optical Properties with DFT

Authors: Bahieddine Bouabdellah, Benameur Amiri, Abdelkader Nouri

Abstract:

Europium-doped gallium nitride (EuGaN) is a promising material for optoelectronic and thermoelectric devices. This study investigates its optical properties using density functional theory (DFT) with the FP-LAPW method and MBJ+U correction. The simulation substitutes a gallium atom with europium in a hexagonal GaN lattice (6% doping). Distinct absorption peaks are observed in the optical analysis. These results highlight EuGaN's potential for various applications and pave the way for further research on rare earth-doped materials.

Keywords: EuGaN, FP-LAPW, DFT, WIEN2k, MBJ+U

Procedia PDF Downloads 76
26189 A Supervised Approach for Detection of Singleton Spam Reviews

Authors: Atefeh Heydari, Mohammadali Tavakoli, Naomie Salim

Abstract:

In recent years, online reviews have become one of the most important sources of customer opinion. They are increasingly used by individuals and organisations to make purchase and business decisions. Unfortunately, for profit or fame, fraudsters produce deceptive reviews to hoodwink potential customers. Their activities not only mislead potential customers trying to make appropriate purchasing decisions and organisations trying to reshape their business, but also hamper opinion mining techniques by preventing them from reaching accurate results. Spam reviews can be divided into two main groups: multiple and singleton spam reviews. Detecting a singleton spam review, i.e. the only review written by a given user ID, is extremely challenging due to the lack of clues for detection. Singleton spam reviews are very harmful, and the various features and evidence used in multiple spam review detection are not applicable in this case. This research proposes a novel supervised technique to detect singleton spam reviews. To achieve this, various features are proposed in this study and combined with the most appropriate features extracted from the literature, then employed in a classifier. In order to compare the performance of different classifiers, SVM and naive Bayes classification algorithms were used for model building. The results revealed that SVM was more accurate than naive Bayes and that the proposed technique can detect singleton spam reviews effectively.
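As an illustrative sketch of the kind of feature-based supervised pipeline described here (the feature set and all numbers below are hypothetical, not the paper's, and a hand-rolled Gaussian naive Bayes stands in for the library classifiers):

```python
import math

# Hypothetical per-review features (not the paper's actual feature set):
# [review length in words, rating deviation from product mean, reviewer account age in days]
train_X = [
    [120, 0.3, 900], [95, 0.5, 700], [150, 0.2, 1200], [80, 0.4, 400],   # genuine
    [25, 3.8, 2],    [18, 4.1, 1],   [30, 3.5, 3],     [22, 3.9, 5],     # spam
]
train_y = [0, 0, 0, 0, 1, 1, 1, 1]  # 0 = genuine, 1 = singleton spam

def fit_gaussian_nb(X, y):
    """Estimate per-class feature means/variances plus class priors."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        means = [sum(col) / len(rows) for col in zip(*rows)]
        variances = [max(sum((v - m) ** 2 for v in col) / len(rows), 1e-9)
                     for col, m in zip(zip(*rows), means)]
        model[c] = (means, variances, len(rows) / len(X))
    return model

def predict(model, x):
    """Pick the class with the highest Gaussian log-posterior."""
    best, best_score = None, -math.inf
    for c, (means, variances, prior) in model.items():
        score = math.log(prior)
        for v, m, var in zip(x, means, variances):
            score += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
        if score > best_score:
            best, best_score = c, score
    return best

model = fit_gaussian_nb(train_X, train_y)
pred_spam = predict(model, [20, 4.0, 2])        # short, extreme-rated, brand-new account
pred_genuine = predict(model, [110, 0.25, 800])
```

In practice the paper reports SVM outperforming naive Bayes; the sketch only illustrates how engineered review features feed a probabilistic classifier.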

Keywords: classification algorithms, Naïve Bayes, opinion review spam detection, singleton review spam detection, support vector machine

Procedia PDF Downloads 312
26188 Study and Conservation of Cultural and Natural Heritages with the Use of Laser Scanner and Processing System for 3D Modeling Spatial Data

Authors: Julia Desiree Velastegui Caceres, Luis Alejandro Velastegui Caceres, Oswaldo Padilla, Eduardo Kirby, Francisco Guerrero, Theofilos Toulkeridis

Abstract:

It is fundamental to conserve sites of natural and cultural heritage with any available technique or existing methodology of preservation in order to sustain them for following generations. We propose an additional approach to protect the current state of such sites, in which high-technology instrumentation is used to digitally preserve natural and cultural heritage, applied in Ecuador. This project presents the use of laser technology for three-dimensional modelling with high accuracy in a relatively short period of time. So far in Ecuador, there are no records on the use and processing of data obtained by this new technological trend. The importance of the project lies in the description of the methodology of the laser scanner system using the Faro Laser Scanner Focus 3D 120, the method for 3D modelling of geospatial data, and the development of virtual environments in the areas of cultural and natural heritage. To familiarize users with this technology, in which three-dimensional models are generated, tools have been developed so that the models can be displayed in all kinds of digital formats. The obtained 3D models demonstrate that this technology is extremely useful in these areas, while also indicating that each data campaign needs a slightly different procedure, from data capture and processing through to the final virtual environments.

Keywords: laser scanner system, 3D model, cultural heritage, natural heritage

Procedia PDF Downloads 312
26187 Marginalized Two-Part Joint Models for Generalized Gamma Family of Distributions

Authors: Mohadeseh Shojaei Shahrokhabadi, Ding-Geng (Din) Chen

Abstract:

Positive continuous outcomes with a substantial number of zero values and incomplete longitudinal follow-up are quite common in medical cost data. To jointly model semi-continuous longitudinal cost data and survival data and to provide marginalized covariate effect estimates, a marginalized two-part joint model (MTJM) has been developed for outcome variables with lognormal distributions. In this paper, we propose MTJM models for outcome variables from the generalized gamma (GG) family of distributions. The GG distribution constitutes a general family that includes nearly all of the most frequently used distributions, such as the gamma, exponential, Weibull, and log-normal. In the proposed MTJM-GG model, the conditional mean from a conventional two-part model with a three-parameter GG distribution is parameterized to provide a marginal interpretation for the regression coefficients. In addition, MTJM-gamma and MTJM-Weibull are developed as special cases of MTJM-GG. To illustrate the applicability of the MTJM-GG, we applied the model to a set of real electronic health record data recently collected in Iran, and we provide SAS code for the application. The simulation results showed that when the outcome distribution is unknown or misspecified, which is usually the case in real data sets, the MTJM-GG consistently outperforms the other models. The GG family of distributions facilitates estimating a model with improved fit over the MTJM-gamma, standard Weibull, or log-normal distributions.
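As a hedged sketch in generic notation (not necessarily the authors' exact parameterization), the marginalized two-part construction with a Stacy-form GG positive part can be written as:

```latex
% Binary part: probability of a positive cost
\Pr(Y_i > 0 \mid x_i) = \pi_i, \qquad \operatorname{logit}(\pi_i) = x_i^\top \alpha
% Positive part: Stacy generalized gamma with scale a and shape parameters d, p
f(y; a, d, p) = \frac{p}{a^{d}\,\Gamma(d/p)}\, y^{d-1} e^{-(y/a)^{p}}, \qquad
\operatorname{E}[Y \mid Y > 0] = a\,\frac{\Gamma\!\left((d+1)/p\right)}{\Gamma(d/p)}
% Marginalized mean model: the coefficients beta act on the overall mean
\operatorname{E}[Y_i \mid x_i] = \pi_i\, \operatorname{E}[Y_i \mid Y_i > 0, x_i]
  = e^{x_i^\top \beta}
\;\Rightarrow\;
a_i = \frac{e^{x_i^\top \beta}\,\Gamma(d/p)}{\pi_i\,\Gamma\!\left((d+1)/p\right)}
```

In this parameterization, setting p = 1 recovers the gamma case and d = p the Weibull case, matching the special cases noted in the abstract.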

Keywords: marginalized two-part model, zero-inflated, right-skewed, semi-continuous, generalized gamma

Procedia PDF Downloads 184
26186 Masked Candlestick Model: A Pre-Trained Model for Trading Prediction

Authors: Ling Qi, Matloob Khushi, Josiah Poon

Abstract:

This paper introduces a pre-trained Masked Candlestick Model (MCM) for trading time-series data. The pre-trained model is based on three core designs. First, we convert the trading price data at each data point into a set of normalized elements and produce embeddings of each element. Second, we generate a masked sequence of such embedded elements as inputs for self-supervised learning. Third, we use the encoder mechanism from the transformer to train on the inputs. The masked model learns the contextual relations among the sequence of embedded elements, which can aid downstream classification tasks. To evaluate the performance of the pre-trained model, we fine-tune MCM on three different downstream classification tasks to predict future price trends. The fine-tuned models achieved better accuracy rates on all three tasks than the baseline models. To better analyze the effectiveness of MCM, we test the same architecture on three currency pairs, namely EUR/GBP, AUD/USD, and EUR/JPY. The experimental results demonstrate MCM's effectiveness on all three currency pairs and indicate MCM's capability for signal extraction from trading data.
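The masking step can be sketched as a BERT-style routine over a token sequence (the element encoding, the 25% ratio, and the token strings below are assumptions for illustration, not MCM's exact scheme, which masks embedded numeric elements):

```python
import random

def mask_sequence(tokens, mask_ratio=0.15, mask_token="[MASK]", seed=0):
    """BERT-style masking: hide a fraction of positions and keep them as
    prediction targets, so the encoder must recover them from context."""
    rng = random.Random(seed)
    n_mask = max(1, int(len(tokens) * mask_ratio))
    positions = sorted(rng.sample(range(len(tokens)), n_mask))
    masked = list(tokens)
    targets = {}
    for i in positions:
        targets[i] = masked[i]   # remember the true element at this position
        masked[i] = mask_token   # replace it with the mask symbol
    return masked, targets

# A toy "embedded element" sequence: normalized candlestick elements
# (an open/high/low/close ordering is an assumption for illustration).
sequence = ["o_0.41", "h_0.55", "l_0.38", "c_0.52",
            "o_0.52", "h_0.60", "l_0.49", "c_0.57"]
masked, targets = mask_sequence(sequence, mask_ratio=0.25)
```

The (masked, targets) pairs are exactly what a transformer encoder would be trained on in self-supervised pre-training.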

Keywords: masked language model, transformer, time series prediction, trading prediction, embedding, transfer learning, self-supervised learning

Procedia PDF Downloads 131
26185 Design of Traffic Counting Android Application with Database Management System and Its Comparative Analysis with Traditional Counting Methods

Authors: Muhammad Nouman, Fahad Tiwana, Muhammad Irfan, Mohsin Tiwana

Abstract:

Traffic congestion has been increasing significantly in major metropolitan areas as a result of increased motorization, urbanization, population growth and changes in urban density. Traffic congestion compromises the efficiency of transport infrastructure and causes multiple traffic concerns, including but not limited to increased travel time, safety hazards, air pollution, and fuel consumption. Traffic management has become a serious challenge for federal and provincial governments, as well as for exasperated commuters. Effective, flexible, efficient and user-friendly traffic information/database management systems characterize traffic conditions by making use of traffic counts for storage, processing, and visualization. While emerging data collection technologies continue to proliferate, their accuracy can be guaranteed through comparison of the observed data with manual handheld counters. This paper presents the design of a tablet-based manual traffic counting application and a framework for the development of a traffic database management system for Pakistan. The database management system comprises three components: a traffic counting Android application, an online database, and its visualization using Google Maps. An Oracle relational database was chosen to develop the data structure, whereas structured query language (SQL) was adopted to program the system architecture. The GIS application links the data from the database and projects it onto a dynamic map for visualization of traffic conditions. The traffic counting device and this example of a database application in a real-world problem provided a creative outlet to visualize the uses and advantages of a database management system in real time. In addition, traffic counts collected by means of the handheld tablet/mobile application can be used for transportation planning and forecasting.
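A minimal sketch of such a counting schema and aggregate query, using Python's built-in SQLite in place of the Oracle database described above (the table and column names, station label, and counts are assumptions for illustration):

```python
import sqlite3

# Simplified counting schema; a production system would normalize stations
# and vehicle classes into their own tables.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE traffic_counts (
        id INTEGER PRIMARY KEY,
        station TEXT NOT NULL,       -- survey location name
        lat REAL, lon REAL,          -- coordinates for the map layer
        vehicle_class TEXT NOT NULL, -- e.g. car, bus, truck, motorcycle
        counted_at TEXT NOT NULL,    -- ISO timestamp from the tablet app
        count INTEGER NOT NULL
    )
""")
rows = [
    ("N5-Km12", 33.60, 73.04, "car",   "2024-05-01T08:00:00", 412),
    ("N5-Km12", 33.60, 73.04, "truck", "2024-05-01T08:00:00", 57),
    ("N5-Km12", 33.60, 73.04, "car",   "2024-05-01T09:00:00", 388),
]
conn.executemany(
    "INSERT INTO traffic_counts (station, lat, lon, vehicle_class, counted_at, count) "
    "VALUES (?, ?, ?, ?, ?, ?)", rows)

# Per-station totals: the kind of aggregate a GIS layer would plot on the map.
totals = conn.execute(
    "SELECT station, SUM(count) FROM traffic_counts GROUP BY station"
).fetchall()
```

The same SQL would run largely unchanged against an Oracle backend; only the connection layer differs.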

Keywords: manual count, emerging data sources, traffic information quality, traffic surveillance, traffic counting device, android, data visualization, traffic management

Procedia PDF Downloads 201
26184 A Performance Study of Fixed, Single-Axis and Dual-Axis Photovoltaic Systems in Kuwait

Authors: A. Al-Rashidi, A. El-Hamalawi

Abstract:

In this paper, a performance study was conducted to investigate single- and dual-axis PV systems for generating electricity at five different sites in Kuwait. Relevant data were obtained from two sources for validation purposes. A commercial software package, PVsyst, was used to analyse the data, such as meteorological data and other input parameters, and to compute performance parameters such as the capacity factor (CF) and final yield (YF). The results indicated that single- and dual-axis PV systems would be very beneficial for electricity generation in Kuwait as an alternative to conventional power plants, especially with the increasing demand over time. The ranges were also found to be competitive in comparison to leading countries using similar systems. Significant increases in CF and YF values of around 24% and 28.8% were achieved with the single- and dual-axis systems, respectively.
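The two performance parameters have standard definitions, sketched below with illustrative numbers rather than the study's Kuwait data:

```python
# Standard PV performance metrics (the definitions tools like PVsyst report;
# all numbers below are illustrative, not results from this study).
def final_yield(energy_kwh, rated_kw):
    """Y_F: annual energy per kW of installed capacity (kWh/kWp)."""
    return energy_kwh / rated_kw

def capacity_factor(energy_kwh, rated_kw, hours=8760):
    """CF: actual annual output relative to continuous rated output."""
    return energy_kwh / (rated_kw * hours)

fixed_yf = final_yield(1800.0, 1.0)   # a fixed-tilt example: 1800 kWh/kWp-year
tracking_yf = fixed_yf * 1.288        # a ~28.8% gain, as reported for dual-axis
fixed_cf = capacity_factor(1800.0, 1.0)
```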

Keywords: single-axis and dual-axis photovoltaic systems, capacity factor, final yield, Kuwait

Procedia PDF Downloads 298
26183 Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips

Authors: H. Heidari-Bafroui, B. Ribeiro, A. Charbaji, C. Anagnostopoulos, M. Faghri

Abstract:

In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm) with repeatability of less than or equal to 1.2% relative standard deviation (RSD). The system also provides more granular concentration information compared to the discrete color chart used by commercial devices and it can be easily adapted for use in other applications.
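The colorimetric quantification step can be sketched as a Beer-Lambert absorbance computed from pixel intensities and mapped through a calibration curve (the slope below is a hypothetical calibration, not the devices' actual one, and real systems fit many standards):

```python
import math

def absorbance(intensity, reference):
    """Beer-Lambert absorbance A = -log10(I / I0) from camera pixel intensities."""
    return -math.log10(intensity / reference)

def concentration_ppm(a, slope, intercept):
    """Invert a linear calibration A = slope * C + intercept (hypothetical values)."""
    return (a - intercept) / slope

a = absorbance(120.0, 240.0)  # sample region at half the reference intensity
c = concentration_ppm(a, slope=0.15, intercept=0.0)
```

This is the granular, continuous readout that replaces the discrete color-chart comparison.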

Keywords: infrared lightbox, paper-based device, phosphate detection, smartphone colorimetric analyzer

Procedia PDF Downloads 126
26182 Evaluating the Accuracy of Biologically Relevant Variables Generated by ClimateAP

Authors: Jing Jiang, Wenhuan XU, Lei Zhang, Shiyi Zhang, Tongli Wang

Abstract:

Climate data quality significantly affects the reliability of ecological modeling. In the Asia Pacific (AP) region, low-quality climate data hinders ecological modeling. ClimateAP, a software developed in 2017, generates high-quality climate data for the AP region, benefiting researchers in forestry and agriculture. However, its adoption remains limited. This study aims to confirm the validity of biologically relevant variable data generated by ClimateAP during the normal climate period through comparison with the currently available gridded data. Climate data from 2,366 weather stations were used to evaluate the prediction accuracy of ClimateAP in comparison with the commonly used gridded data from WorldClim1.4. Univariate regressions were applied to 48 monthly biologically relevant variables, and the relationship between the observational data and the predictions made by ClimateAP and WorldClim was evaluated using Adjusted R-Squared and Root Mean Squared Error (RMSE). Locations were categorized into mountainous and flat landforms, considering elevation, slope, ruggedness, and Topographic Position Index. Univariate regressions were then applied to all biologically relevant variables for each landform category. Random Forest (RF) models were implemented for the climatic niche modeling of Cunninghamia lanceolata. A comparative analysis of the prediction accuracies of RF models constructed with distinct climate data sources was conducted to evaluate their relative effectiveness. Biologically relevant variables were obtained from three unpublished Chinese meteorological datasets. ClimateAPv3.0 and WorldClim predictions were obtained from weather station coordinates and WorldClim1.4 rasters, respectively, for the normal climate period of 1961-1990. Occurrence data for Cunninghamia lanceolata came from integrated biodiversity databases with 3,745 unique points. 
ClimateAP explains a minimum of 94.74%, 97.77%, 96.89%, and 94.40% of monthly maximum, minimum, average temperature, and precipitation variances, respectively. It outperforms WorldClim in 37 biologically relevant variables with lower RMSE values. ClimateAP achieves higher R-squared values for the 12 monthly minimum temperature variables and consistently higher Adjusted R-squared values across all landforms for precipitation. ClimateAP's temperature data yields lower Adjusted R-squared values than gridded data in high-elevation, rugged, and mountainous areas but achieves higher values in mid-slope drainages, plains, open slopes, and upper slopes. Using ClimateAP improves the prediction accuracy of tree occurrence from 77.90% to 82.77%. The biologically relevant climate data produced by ClimateAP is validated based on evaluations using observations from weather stations. The use of ClimateAP leads to an improvement in data quality, especially in non-mountainous regions. The results also suggest that using biologically relevant variables generated by ClimateAP can slightly enhance climatic niche modeling for tree species, offering a better understanding of tree species adaptation and resilience compared to using gridded data.
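The station-versus-prediction evaluation above can be sketched with the standard error metrics (plain R-squared is shown for simplicity in place of adjusted R-squared; the station values and both prediction series are made-up stand-ins):

```python
import math

def rmse(obs, pred):
    """Root mean squared error between observations and predictions."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Toy monthly-maximum-temperature observations vs. two prediction sources.
station_tmax = [12.1, 15.4, 18.9, 22.3, 25.0]
pred_a = [12.0, 15.6, 18.7, 22.5, 24.8]   # e.g. point predictions at station coordinates
pred_b = [11.0, 16.5, 17.5, 23.8, 23.9]   # e.g. coarser gridded predictions

better = "A" if rmse(station_tmax, pred_a) < rmse(station_tmax, pred_b) else "B"
```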

Keywords: climate data validation, data quality, Asia pacific climate, climatic niche modeling, random forest models, tree species

Procedia PDF Downloads 69
26181 Eye Tracking: Biometric Evaluations of Instructional Materials for Improved Learning

Authors: Janet Holland

Abstract:

Eye tracking provides a way to triangulate multiple data sources for a deeper, more complete understanding of how instructional materials are actually being used and which emotional connections are made. Sensor-based biometrics provide a detailed local analysis in real time, expanding our ability to collect science-based data for a level of understanding of teaching and learning not previously possible. The knowledge gained will be used to make future improvements to instructional materials, tools, and interactions. The literature was examined and a preliminary pilot test was implemented to develop a methodology for research in instructional design and technology. Eye tracking now offers objective metrics, obtained from eye tracking and other biometric data collection, whose analysis yields a fresh perspective.

Keywords: area of interest, eye tracking, biometrics, fixation, fixation count, fixation sequence, fixation time, gaze points, heat map, saccades, time to first fixation

Procedia PDF Downloads 136
26180 A Proposed Mechanism for Skewing Symmetric Distributions

Authors: M. T. Alodat

Abstract:

In this paper, we propose a mechanism for skewing any symmetric distribution. The new distribution is called the deflation-inflation distribution (DID). We discuss some statistical properties of the DID, such as its moments, stochastic representation, and log-concavity. We also fit the distribution to real data and compare it to the normal distribution and Azzalini's skew-normal distribution. Numerical results show that the DID fits the tree ring data better than the other two distributions.
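One of the comparison models, Azzalini's skew-normal, illustrates the general idea of skewing a symmetric density; the sketch below implements Azzalini's mechanism (not the DID itself, whose exact form is not given in this abstract):

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def skew_normal_pdf(x, alpha):
    """Azzalini's construction: f(x) = 2 * phi(x) * Phi(alpha * x).
    alpha = 0 recovers the symmetric normal; alpha != 0 skews it."""
    return 2 * phi(x) * Phi(alpha * x)

# The skewed density still integrates to one for any alpha
# (crude Riemann check over [-8, 8]).
step = 0.001
total = sum(skew_normal_pdf(-8 + i * step, 3.0) * step for i in range(16000))
```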

Keywords: normal distribution, moments, Fisher information, symmetric distributions

Procedia PDF Downloads 661
26179 Polarimetric Synthetic Aperture Radar Data Classification Using Support Vector Machine and Mahalanobis Distance

Authors: Najoua El Hajjaji El Idrissi, Necip Gokhan Kasapoglu

Abstract:

Polarimetric Synthetic Aperture Radar-based imaging is a powerful technique used for earth observation and classification of surfaces. Forest evolution has been one of the vital areas of attention for remote sensing experts. Information about forest areas can be obtained by remote sensing, whether by using active radars or optical instruments. However, due to several weather constraints, such as cloud cover, limited information can be recovered using optical data, and for that reason, Polarimetric Synthetic Aperture Radar (PolSAR) is used as a powerful tool for forestry inventory. In this paper, we applied a support vector machine (SVM) and the Mahalanobis distance to fully polarimetric AIRSAR P-, L-, and C-band data from the Nezer forest areas; the classification is based on the separation of different tree ages. The classification results were evaluated, and they show that the SVM performs better than the Mahalanobis distance, achieving approximately 75% accuracy. This result shows that SVM classification can be a useful method for evaluating fully polarimetric SAR data with sufficient accuracy.
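A minimal sketch of the Mahalanobis-distance baseline on two toy feature bands (the two tree-age classes and backscatter values are invented; a real PolSAR pipeline would use the full polarimetric feature set):

```python
import math

def mean_and_inv_cov_2d(samples):
    """Mean vector and inverse covariance of 2-D samples (closed form for 2x2)."""
    n = len(samples)
    mx = sum(s[0] for s in samples) / n
    my = sum(s[1] for s in samples) / n
    sxx = sum((s[0] - mx) ** 2 for s in samples) / n
    syy = sum((s[1] - my) ** 2 for s in samples) / n
    sxy = sum((s[0] - mx) * (s[1] - my) for s in samples) / n
    det = sxx * syy - sxy * sxy
    inv = ((syy / det, -sxy / det), (-sxy / det, sxx / det))
    return (mx, my), inv

def mahalanobis(x, mean, inv):
    """Distance of x from a class, scaled by that class's covariance."""
    dx, dy = x[0] - mean[0], x[1] - mean[1]
    return math.sqrt(dx * (inv[0][0] * dx + inv[0][1] * dy)
                     + dy * (inv[1][0] * dx + inv[1][1] * dy))

# Toy two-band backscatter samples (dB) for two tree-age classes.
young = [(-12.0, -6.0), (-11.5, -5.8), (-12.4, -6.3), (-11.8, -6.1)]
old = [(-8.0, -3.0), (-7.6, -2.8), (-8.3, -3.2), (-7.9, -3.1)]
classes = {"young": mean_and_inv_cov_2d(young), "old": mean_and_inv_cov_2d(old)}

pixel = (-11.9, -6.0)
label = min(classes, key=lambda c: mahalanobis(pixel, *classes[c]))
```

Each pixel is assigned to the class whose covariance-scaled distance is smallest; the SVM replaces this generative rule with a learned margin.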

Keywords: classification, synthetic aperture radar, SAR polarimetry, support vector machine, mahalanobis distance

Procedia PDF Downloads 136
26178 Exploring the Potential of Phase Change Materials in Construction Environments

Authors: A. Ait Ahsene F., B. Boughrara S.

Abstract:

The buildings sector accounts for a significant portion of global energy consumption, with much of this energy used to heat and cool indoor spaces. In this context, the integration of innovative technologies such as phase change materials (PCM) holds promising potential to improve the energy efficiency and thermal comfort of buildings. This research topic explores the benefits and challenges associated with the use of PCMs in buildings, focusing on their ability to store and release thermal energy to regulate indoor temperature. We investigated the different types of PCM available, their thermal properties, and their potential applications in various climate zones and building types. To evaluate and compare the performance of PCMs, our methodology includes a series of laboratory and field experiments. In the laboratory, we measure the thermal storage capacity, melting and solidification temperatures, latent heat, and thermal conductivity of various PCMs. These measurements make it possible to quantify the capacity of each PCM to store and release thermal energy, as well as its capacity to transfer this energy through the construction materials. Additionally, field studies are conducted to evaluate the performance of PCMs in real-world environments. We install PCM systems in real buildings and monitor their operation over time, measuring energy savings, occupant thermal comfort, and material durability. These empirical data allow us to compare the effectiveness of different types of PCMs under real-world use conditions. By combining the results of laboratory and field experiments, we provide a comprehensive analysis of the advantages and limitations of PCMs in buildings, as well as recommendations for their effective application in practice.
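The thermal storage capacity being measured combines sensible and latent heat; a minimal sketch with paraffin-like property values of typical magnitude (not measurements from this study):

```python
def stored_heat_kj(mass_kg, cp_solid, cp_liquid, latent_kj_kg,
                   t_start, t_melt, t_end):
    """Heat absorbed by a PCM heated through its melting point:
    sensible heat in the solid phase, latent heat of fusion,
    then sensible heat in the liquid phase."""
    q_solid = mass_kg * cp_solid * (t_melt - t_start)
    q_latent = mass_kg * latent_kj_kg
    q_liquid = mass_kg * cp_liquid * (t_end - t_melt)
    return q_solid + q_latent + q_liquid

# Illustrative paraffin-like properties (typical order of magnitude only):
# cp around 2 kJ/(kg*K), latent heat around 200 kJ/kg, melting near 26 C.
q = stored_heat_kj(mass_kg=10.0, cp_solid=2.0, cp_liquid=2.2,
                   latent_kj_kg=200.0, t_start=20.0, t_melt=26.0, t_end=30.0)
```

The dominance of the latent term in the total is exactly why PCMs store far more heat per degree of temperature swing than ordinary construction materials.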

Keywords: energy saving, phase change materials, material sustainability, buildings sector

Procedia PDF Downloads 46
26177 Short Life Cycle Time Series Forecasting

Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar

Abstract:

The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development times and increased product diversity. Short life cycles are normal in the retail industry, the fashion business, entertainment media, and the telecom and semiconductor industries. Accurate demand forecasting for short-life-cycle products is of special interest to many researchers and organizations. Due to the short life cycle of products, the amount of historical data available for forecasting is minimal, or even absent when new or modified products are launched in the market. Companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while not oversupplying. This presents the challenge of developing a forecasting model that can forecast accurately while handling large variations in data and considering the complex relationships between various parameters of the data. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data, and artificial neural network (ANN) models are very time consuming for forecasting. We have studied the existing models used for forecasting and their limitations. This work proposes an effective and powerful approach for short-life-cycle time series forecasting. The proposed approach takes into consideration different scenarios related to data availability for short-life-cycle products. We then suggest a methodology which combines statistical analysis with structured judgement, and the defined approach can be applied across domains. We then describe the method of creating a profile from analogous products. This profile can then be used for forecasting products using the historical data of analogous products.
We have designed an application which combines data, analytics and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE and RMSE error scores. Conclusion: based on the results, it is observed that no single approach is sufficient for short-life-cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
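The three error scores used for comparison have standard definitions, sketched here with made-up demand numbers:

```python
import math

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

def mse(actual, forecast):
    """Mean squared error."""
    return sum((a - f) ** 2 for a, f in zip(actual, forecast)) / len(actual)

def rmse(actual, forecast):
    """Root mean squared error."""
    return math.sqrt(mse(actual, forecast))

# Illustrative weekly demand for a short-life-cycle product vs. a forecast
# built from an analogous product's profile (numbers are invented).
actual = [100, 180, 240, 200, 120, 60]
forecast = [90, 170, 250, 210, 110, 70]

scores = {"MAPE": mape(actual, forecast),
          "MSE": mse(actual, forecast),
          "RMSE": rmse(actual, forecast)}
```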

Keywords: forecast, short life cycle product, structured judgement, time series

Procedia PDF Downloads 363
26176 The Effectiveness of Multiphase Flow in Well-Control Operations

Authors: Ahmed Borg, Elsa Aristodemou, Attia Attia

Abstract:

Well control involves managing the circulating drilling fluid within the well and avoiding kicks and blowouts, as these can lead to losses of human life and drilling facilities. Current practices for well control incorporate predictions of pressure losses through computational models. Developing a realistic hydraulic model for a well control problem is a very complicated process due to the existence of a complex multiphase region, which usually contains a non-Newtonian drilling fluid, and the miscibility of formation gas in the drilling fluid. Current approaches assume an inaccurate fluid flow model within the well, which leads to incorrect pressure loss calculations. To overcome this problem, researchers have been considering more complex two-phase fluid flow models. However, even these more sophisticated two-phase models are unsuitable for applications where pressure dynamics are important, such as in managed pressure drilling. This study aims to develop and implement new fluid flow models that take into consideration the miscibility of fluids as well as their non-Newtonian properties to enable realistic kick treatment. Furthermore, a corresponding numerical solution method is built with an enriched data bank. The research work considers and implements models that take into consideration the effect of two phases in kick treatment for well control in conventional drilling. The software STAR-CCM+ was used for the computational studies of the important parameters describing wellbore multiphase flow: the mass flow rate, volumetric fraction, and velocity of each phase. Based on the analysis of these simulation studies, a coarser full-scale model of the wellbore, including chemical modelling, was established. The focus of the investigations was put on the near-drill-bit section.
This inflow area shows certain characteristics that are dominated by the inflow conditions of the gas as well as by the configuration of the mud stream entering the annulus. Without considering the gas solubility effect, the bottom hole pressure could be underestimated by 4.2%, while the bottom hole temperature is overestimated by 3.2%; without considering the heat transfer effect, the bottom hole pressure could be overestimated by 11.4% under steady flow conditions. Besides, a larger reservoir pressure leads to a larger gas fraction in the wellbore. However, reservoir pressure has a minor effect on the steady wellbore temperature. Also, as choke pressure increases, less gas will exist in the annulus in the form of free gas.

Keywords: multiphase flow, well control, STAR-CCM+, petroleum engineering and gas technology, computational fluid dynamics

Procedia PDF Downloads 122
26175 Comparing Machine Learning Estimation of Fuel Consumption of Heavy-Duty Vehicles

Authors: Victor Bodell, Lukas Ekstrom, Somayeh Aghanavesi

Abstract:

Fuel consumption (FC) is one of the key factors in determining the expenses of operating a heavy-duty vehicle. A customer may therefore request an estimate of the FC of a desired vehicle. The modular design of heavy-duty vehicles allows their construction by specifying the building blocks, such as the gear box, engine and chassis type. If the combination of building blocks is unprecedented, it is unfeasible to measure the FC, since this would first require the construction of the vehicle. This paper proposes a machine learning approach to predict FC. This study uses data from around 40,000 vehicles, comprising vehicle-specific and operational environmental conditions information, such as road slopes and driver profiles. All vehicles have diesel engines and a mileage of more than 20,000 km. The data is used to investigate the accuracy of the machine learning algorithms linear regression (LR), K-nearest neighbor (KNN) and artificial neural networks (ANN) in predicting fuel consumption for heavy-duty vehicles. The performance of the algorithms is evaluated by reporting the prediction error on both simulated data and operational measurements, and compared using nested cross-validation and statistical hypothesis testing. The statistical evaluation procedure finds that ANNs have the lowest prediction error compared to LR and KNN in estimating fuel consumption on both simulated and operational data. The models have a mean relative prediction error of 0.3% on simulated data, and 4.2% on operational data.
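A minimal sketch of the KNN variant and the mean relative prediction error metric (the spec features and all values are invented for illustration; the study's models are trained on far richer vehicle and environment data):

```python
def knn_predict(train_X, train_y, x, k=3):
    """K-nearest-neighbour regression: average the FC of the k most
    similar vehicles by Euclidean distance in spec space."""
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
    nearest = sorted(range(len(train_X)), key=lambda i: dist(train_X[i], x))[:k]
    return sum(train_y[i] for i in nearest) / k

def mean_relative_error(actual, predicted):
    """Mean relative prediction error, as reported in the abstract."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

# Toy spec vectors: (gross weight in tonnes, engine power in kW, mean road slope %)
# with fuel consumption in l/100 km as the target (illustrative values only).
train_X = [(40, 330, 1.0), (44, 360, 1.2), (36, 300, 0.8),
           (50, 400, 2.0), (42, 340, 1.1)]
train_y = [32.0, 34.5, 29.5, 39.0, 33.0]

pred = knn_predict(train_X, train_y, (43, 350, 1.1), k=3)
err = mean_relative_error([33.5], [pred])
```

In practice the features should be standardized first, since weight, power, and slope live on very different scales.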

Keywords: artificial neural networks, fuel consumption, friedman test, machine learning, statistical hypothesis testing

Procedia PDF Downloads 186
26174 Design and Implementation of a Remote Control Robot Controlled by a ZigBee Wireless Network

Authors: Sinan Alsaadi, Mustafa Merdan

Abstract:

Communication and access systems can be built with many methods in today's world, using such standards as Wi-Fi, WiMAX, Bluetooth, GPS and GPRS. Devices which use these standards also use system resources excessively, in direct proportion to their transmission speed. However, large-scale data communication is not always needed. In such cases, a technology that uses as few system resources as possible and supports smart network topologies has been needed to enable the transmission of small data packets and provide control for this kind of device. IEEE issued the 802.15.4 standard to meet this necessity, enabling the ZigBee protocol, which takes this standard as its basis, and devices which support it. In our project, this communication protocol was preferred. The aim of this study is to provide immediate data transmission from our robot in the field within the scope of the project, communicating with the robot through the ZigBee protocol. The basis of the work is obtaining the desired data from the region where the robot is located while sitting at a computer. An Arduino Uno R3 microcontroller provides the control mechanism, with an L298 shield as the motor driver.
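As a sketch of the framing such modules use on the serial side, here is an XBee-style API frame builder and checker (XBee is one common ZigBee module family; the 0x7E start byte, 16-bit length, and one-byte checksum below follow the standard XBee API format, while the payload bytes are hypothetical):

```python
def build_frame(payload):
    """Build an XBee-style API frame: 0x7E start byte, 16-bit big-endian
    length, the payload bytes, and a one-byte checksum
    (0xFF minus the low byte of the payload sum)."""
    checksum = 0xFF - (sum(payload) & 0xFF)
    length = len(payload)
    return (bytes([0x7E, (length >> 8) & 0xFF, length & 0xFF])
            + bytes(payload) + bytes([checksum]))

def verify_frame(frame):
    """A frame is valid when the payload bytes plus checksum sum to 0xFF."""
    if frame[0] != 0x7E:
        return False
    length = (frame[1] << 8) | frame[2]
    body = frame[3:3 + length + 1]  # payload plus trailing checksum
    return len(body) == length + 1 and (sum(body) & 0xFF) == 0xFF

# A hypothetical sensor reading from the robot, packed as raw bytes.
frame = build_frame([0x10, 0x01, 0x42])
ok = verify_frame(frame)
```

On the Arduino side the same bytes would simply be written to the serial port connected to the radio module.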

Keywords: ZigBee, wireless network, remote monitoring, smart home, agricultural industry

Procedia PDF Downloads 281
26173 Characterization of Aluminosilicates and Verification of Their Impact on Quality of Ceramic Proppants Intended for Shale Gas Output

Authors: Joanna Szymanska, Paulina Wawulska-Marek, Jaroslaw Mizera

Abstract:

Nowadays, the rapid growth of global energy consumption and uncontrolled depletion of natural resources become a serious problem. Shale rocks are the largest and potential global basins containing hydrocarbons, trapped in closed pores of the shale matrix. Regardless of the shales origin, mining conditions are extremely unfavourable due to high reservoir pressure, great depths, increased clay minerals content and limited permeability (nanoDarcy) of the rocks. Taking into consideration such geomechanical barriers, effective extraction of natural gas from shales with plastic zones demands effective operations. Actually, hydraulic fracturing is the most developed technique based on the injection of pressurized fluid into a wellbore, to initiate fractures propagation. However, a rapid drop of pressure after fluid suction to the ground induces a fracture closure and conductivity reduction. In order to minimize this risk, proppants should be applied. They are solid granules transported with hydraulic fluids to locate inside the rock. Proppants act as a prop for the closing fracture, thus gas migration to a borehole is effective. Quartz sands are commonly applied proppants only at shallow deposits (USA). Whereas, ceramic proppants are designed to meet rigorous downhole conditions to intensify output. Ceramic granules predominate with higher mechanical strength, stability in strong acidic environment, spherical shape and homogeneity as well. Quality of ceramic proppants is conditioned by raw materials selection. Aim of this study was to obtain the proppants from aluminosilicates (the kaolinite subgroup) and mix of minerals with a high alumina content. These loamy minerals contain a tubular and platy morphology that improves mechanical properties and reduces their specific weight. 
Moreover, they are distinguished by a well-developed surface area, high porosity, fine particle size, superb dispersion and nontoxic properties, all crucial for consolidating the particles into spherical, crush-resistant granules in a mechanical granulation process. The aluminosilicates were mixed with water and a natural organic binder to improve liquid-bridge and pore formation between particles. Afterward, the green proppants were sintered at high temperatures. Evaluation of the minerals' utility was based on their particle size distribution (laser diffraction study) and thermal stability (thermogravimetry). Scanning Electron Microscopy was used for morphology and shape identification, combined with specific surface area measurement (BET). Chemical composition was verified by Energy Dispersive Spectroscopy and X-ray Fluorescence. Moreover, bulk density and specific weight were measured. Such comprehensive characterization of the loamy materials confirmed their favourable impact on proppant granulation. The sintered granules were analyzed by SEM to verify the surface topography and phase transitions after sintering. Pore distribution was identified by X-ray Tomography. This method also enabled simulation of proppant settlement in a fracture, while measurement of bulk density was essential to predict the amount needed to fill a well. The roundness coefficient was also evaluated, whereas the impact on the mining environment was assessed by turbidity and solubility in acid, to indicate the risk of material decay in a well. The obtained outcomes confirmed a positive influence of the loamy minerals on the properties of ceramic proppants with respect to the strict norms. This research opens a perspective for the production of higher-quality proppants at reduced cost.

Keywords: aluminosilicates, ceramic proppants, mechanical granulation, shale gas

Procedia PDF Downloads 165
26172 Development of a System for Measuring the Three-axis Pedal Force in Cycling and Its Applications

Authors: Joo-Hack Lee, Jin-Seung Choi, Dong-Won Kang, Jeong-Woo Seo, Ju-Young Kim, Dae-Hyeok Kim, Seung-Tae Yang, Gye-Rae Tack

Abstract:

For cycling, the analysis of the pedal force is one of the important factors in the study of exercise ability assessment and overuse injuries. In past studies, a two-axis measurement sensor was used at the sagittal plane to measure the force only in the anterior, posterior, and vertical directions, so the loss of force and the injury risk on the frontal plane due to forces in the right and left directions could not be analyzed. In this study, as a basis for diverse analyses of the pedal force that consider both the sagittal plane and the frontal plane, a three-axis pedal force measurement sensor was developed to measure the anterior-posterior (Fx), medio-lateral (Fz), and vertical (Fy) forces. The sensor was fabricated with a size and shape similar to those of a general flat pedal, and its 550 g weight allowed smooth pedaling. Its measurement range was ±1000 N for Fx and Fz and ±2000 N for Fy, and its non-linearity, hysteresis, and repeatability were approximately 0.5%. The data were sampled at 1000 Hz using a signal collector. To demonstrate the developed sensor, the pedaling efficiency (index of effectiveness, IE) and the range of left-right (medio-lateral, ML) forces were measured at two seat heights (low and high). The measurements showed that the IE was higher and the force range in the ML direction was lower with the high position than with the low position. The developed measurement sensor and its application results will be useful in understanding and explaining the complex pedaling technique, and will enable diverse kinematic analyses of the pedal force on the sagittal plane and the frontal plane.
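
As a rough illustration of how an index of effectiveness can be derived from sampled pedal forces, the sketch below computes IE as the ratio of the integrated tangential (effective) force to the integrated resultant force over the crank cycle. The sign conventions and crank-angle definition are assumptions for illustration, not taken from the paper:

```python
import math

def index_of_effectiveness(samples):
    """samples: list of (fx, fy, crank_deg) tuples.
    fx: anterior-posterior force (forward positive), fy: vertical force (up positive),
    crank_deg: crank angle measured clockwise from top dead center.
    IE = sum of effective (tangential) force / sum of resultant force magnitude."""
    eff_sum, tot_sum = 0.0, 0.0
    for fx, fy, deg in samples:
        th = math.radians(deg)
        # unit tangential direction of the pedal spindle path for clockwise rotation
        fe = fx * math.cos(th) - fy * math.sin(th)
        eff_sum += fe
        tot_sum += math.hypot(fx, fy)
    return eff_sum / tot_sum if tot_sum else 0.0
```

With these conventions, a force pushed straight down while the crank points horizontally forward (90°) is fully effective (IE = 1), while the same push at top dead center contributes nothing.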

Keywords: cycling, pedal force, index of effectiveness, measuring

Procedia PDF Downloads 667
26171 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning

Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga

Abstract:

Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including the issues of noise and air pollution, which are considered hot topics (i.e., the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together, because the noise pollution present in cities is often linked to traffic and industries, and these produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution, because NO₂ is mostly created from the oxidation of NO, and these two are notoriously produced by processes of combustion at high temperatures (i.e., car engines or thermal power stations). The same process can be seen for industrial plants as well. What has to be investigated – and this is the topic of this paper – is whether or not there really is a correlation between noise pollution and air pollution (taking NO₂ into account) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise App will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, so that it can stay outdoors, with an external battery that allows it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone and calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are plugged. After assembly, the sensors will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time covering both week and weekend days; in this way it will be possible to see how the situation changes during the week. 
The novelty is that the data will be compared to check whether there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw values obtained from the sensors. To do so, the data will be converted to fit a scale that goes up to 100% and will be shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help choose the right mitigation solutions for the analysed area, because it may make it possible to address both the noise and the air pollution problem with a single intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
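
A minimal sketch of the proposed comparison, rescaling each sensor series to a percentage scale and computing Pearson's r between them; the sample readings are hypothetical. Note that Pearson's r is unchanged by the rescaling, so the percentage scale mainly serves the joint mapping rather than the correlation itself:

```python
import math

def to_percent(series):
    """Rescale a sensor series to 0-100 % of its maximum reading."""
    peak = max(series)
    return [100.0 * v / peak for v in series]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

noise_db = [55, 62, 70, 66, 58]    # hypothetical Leq readings
no2_ugm3 = [30, 41, 52, 46, 33]    # hypothetical NO2 readings
r = pearson(to_percent(noise_db), to_percent(no2_ugm3))
```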

Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter

Procedia PDF Downloads 214
26170 A Safety Analysis Method for Multi-Agent Systems

Authors: Ching Louis Liu, Edmund Kazmierczak, Tim Miller

Abstract:

Safety analysis for multi-agent systems is complicated by the potentially nonlinear interactions between agents. This paper proposes a method for analyzing the safety of multi-agent systems by explicitly focusing on interactions and on the accident data of systems that are similar in structure and function to the system being analyzed. The method creates a Bayesian network using the accident data from similar systems. A feature of our method is that the events in the accident data are labeled with HAZOP guide words. Our method uses an ontology to abstract away from the details of a multi-agent implementation. Using the ontology, our method then constructs an “interaction map,” a graphical representation of the patterns of interactions between agents and other artifacts. Interaction maps, combined with statistical data from accidents and the HAZOP classifications of events, can be converted into a Bayesian network. Bayesian networks allow designers to explore “what if” scenarios and make design trade-offs that maintain safety. We show how to use the Bayesian networks and the interaction maps to improve multi-agent system designs.
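
One node of such a Bayesian network could be parameterised from accident data as sketched below, estimating P(accident | HAZOP guide word) by smoothed counting. The records and guide words here are hypothetical placeholders, not drawn from the paper's case study:

```python
from collections import Counter

# Hypothetical accident records from systems similar to the one under analysis.
# Each record: (HAZOP guide word labelling the interaction event, accident occurred?)
records = [
    ("NO_ACTION", True), ("NO_ACTION", False), ("NO_ACTION", True),
    ("LATE", True), ("LATE", False), ("LATE", False), ("LATE", False),
    ("MORE", False), ("MORE", False),
]

def conditional_accident_probs(records, alpha=1.0):
    """Estimate P(accident | guide word) with Laplace smoothing alpha;
    this yields the conditional probability table for one network node."""
    totals, accidents = Counter(), Counter()
    for word, accident in records:
        totals[word] += 1
        if accident:
            accidents[word] += 1
    return {w: (accidents[w] + alpha) / (totals[w] + 2 * alpha) for w in totals}

cpt = conditional_accident_probs(records)
```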

Keywords: multi-agent system, safety analysis, safety model, interaction map

Procedia PDF Downloads 422
26169 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, it is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
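
The ADF half of such a scheme can be sketched as the standard closed-form assumed-density-filtering update for a probit click model with a diagonal Gaussian posterior. This is a generic illustration, not the FAB-COST implementation itself; the feature encoding and the noise scale beta are assumptions:

```python
import math

def phi(z):   # standard normal pdf
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def Phi(z):   # standard normal cdf
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def adf_update(means, variances, x, y, beta=1.0):
    """One ADF step for a probit click model with a diagonal Gaussian posterior.
    x: feature vector; y: +1 (click) or -1 (no click). Returns updated
    per-weight means and variances in closed form."""
    s2 = beta * beta + sum(xi * xi * vi for xi, vi in zip(x, variances))
    s = math.sqrt(s2)
    t = y * sum(xi * mi for xi, mi in zip(x, means)) / s
    v_t = phi(t) / Phi(t)          # mean-shift factor
    w_t = v_t * (v_t + t)          # variance-shrink factor
    new_m = [mi + y * xi * vi / s * v_t for mi, vi, xi in zip(means, variances, x)]
    new_v = [vi * (1.0 - xi * xi * vi / s2 * w_t) for vi, xi in zip(variances, x)]
    return new_m, new_v

# a single observed click on the first feature raises its mean and shrinks its variance
means, variances = adf_update([0.0, 0.0], [1.0, 1.0], [1.0, 0.0], +1)
```

Each such update costs O(number of active features), which is why this style of filtering scales well once the impression stream grows large.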

Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference

Procedia PDF Downloads 111
26168 Efficient Synthesis of Thiourea Based Iminothiazoline Heterocycles

Authors: Hummera Rafique, Aamer Saeed

Abstract:

Thioureas are highly biologically active compounds, and many important applications are associated with this nucleus. They serve as an exceptionally versatile building block for the synthesis of a wide variety of heterocyclic systems, which also possess an extensive range of bioactivities. In this work, such thioureas were converted in good yields into five-membered heterocycles bearing an imino moiety, the ethyl 4-[2-benzamido-4-methylthiazol-3(2H)-yl]benzoates (2a-j), by base-catalyzed cyclization of the corresponding thioureas with 2-bromoacetone and triethylamine.

Keywords: ethyl 4-[2-benzamido-4-methylthiazol-3(2H)-yl]benzoates, ethyl 4-(3-benzoylthioureido)benzoates, antibacterial activity

Procedia PDF Downloads 358
26167 Epilepsy Seizure Prediction by Effective Connectivity Estimation Using Granger Causality and Directed Transfer Function Analysis of Multi-Channel Electroencephalogram

Authors: Mona Hejazi, Ali Motie Nasrabadi

Abstract:

Epilepsy is a persistent neurological disorder that affects more than 50 million people worldwide. Hence, there is a need for an efficient prediction model that supports correct diagnosis of epileptic seizures and accurate prediction of their type. In this study, we consider how Effective Connectivity (EC) patterns obtained from intracranial Electroencephalographic (EEG) recordings reveal information about the dynamics of the epileptic brain and can be used to predict imminent seizures, which would enable patients (and caregivers) to take appropriate precautions. We use this approach because effective connectivity begins to change as a seizure approaches, so seizures can be predicted from this feature. Results are reported on the standard Freiburg EEG dataset, which contains data from 21 patients suffering from medically intractable focal epilepsy. Six EEG channels from each patient are considered, and effective connectivity is estimated using the Directed Transfer Function (DTF) and Granger Causality (GC) methods. We concentrate on the standard deviation of effective connectivity over time, and feature changes in five brain frequency sub-bands (alpha, beta, theta, delta, and gamma) are compared. The performance obtained by the proposed scheme in predicting seizures is: an average prediction time of 50 minutes before seizure onset, a maximum sensitivity of approximately 80%, and a false positive rate of 0.33 FP/h. The DTF method is more suitable for predicting epileptic seizures, and in general the best results are observed in the gamma and beta sub-bands. This research is significantly helpful for clinical applications, especially for exploitation in online portable devices.
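
For intuition, a minimal sketch of one-lag bivariate Granger causality on synthetic two-channel data (the channels here are pseudo-random sequences, not EEG): channel y is driven by the previous sample of channel x, so the G-causality of x onto y should dominate the reverse direction:

```python
import math

def _lcg(seed, n):
    """Deterministic pseudo-random sequence, roughly uniform in [-0.5, 0.5]."""
    out, s = [], seed
    for _ in range(n):
        s = (1103515245 * s + 12345) % 2**31
        out.append(s / 2**31 - 0.5)
    return out

def _residual_var(X, y):
    """Residual variance of OLS with one or two regressors (normal equations)."""
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    if k == 1:
        coef = [b[0] / A[0][0]]
    else:
        det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
        coef = [(b[0] * A[1][1] - A[0][1] * b[1]) / det,
                (A[0][0] * b[1] - A[1][0] * b[0]) / det]
    return sum((yi - sum(c * ri for c, ri in zip(coef, r))) ** 2
               for r, yi in zip(X, y)) / len(y)

def granger_1lag(x, y):
    """G-causality of x onto y with one lag: ln(restricted var / full var) >= 0;
    larger values mean past x helps predict y beyond y's own past."""
    target = y[1:]
    restricted = _residual_var([[y[i]] for i in range(len(y) - 1)], target)
    full = _residual_var([[y[i], x[i]] for i in range(len(y) - 1)], target)
    return math.log(restricted / full)

# y is driven by the previous sample of x, plus its own past and a little noise
n = 200
x = _lcg(7, n)
noise = _lcg(42, n)
y = [0.0] * n
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * noise[t]
gc_xy = granger_1lag(x, y)   # should be clearly positive
gc_yx = granger_1lag(y, x)   # should be near zero
```

The DTF used in the paper extends the same idea to all channels jointly, in the frequency domain, which is what permits the per-sub-band comparison.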

Keywords: effective connectivity, Granger causality, directed transfer function, epilepsy seizure prediction, EEG

Procedia PDF Downloads 472
26166 A Discrete Element Method Centrifuge Model of Monopile under Cyclic Lateral Loads

Authors: Nuo Duan, Yi Pik Cheng

Abstract:

This paper presents data from a series of two-dimensional Discrete Element Method (DEM) simulations of a large-diameter rigid monopile subjected to cyclic loading under a high gravitational force. At present, monopile foundations are widely used to support tall and heavy wind turbines, which are also subjected to significant loads from wind and wave actions. A safe design must address issues such as rotations and changes in soil stiffness under these loading conditions. Design guidance on the issue is limited, as is the availability of laboratory and field test data. The interpretation of these results in sand, such as the relation between loading and displacement, relies mainly on empirical correlations to pile properties. Regarding numerical models, most available data come from the Finite Element Method (FEM); they are not comprehensive, and most FEM results are sensitive to the input parameters. The micro-scale behaviour could change the mechanism of the soil-structure interaction. A DEM model was used in this paper to study the behaviour under cyclic lateral loads. A non-dimensional framework is presented and applied to interpret the simulation results. The DEM data compare well with various sets of published experimental centrifuge model test data in terms of lateral deflection. The accumulated permanent pile lateral displacements induced by the cyclic lateral loads were found to depend on the characteristics of the applied cyclic load, such as the loading magnitudes and directions.
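
For context, the accumulation of permanent deflection with cycle number is often summarised with simple empirical laws; the sketch below uses a logarithmic form y_N = y_1 (1 + t ln N), with a purely illustrative degradation parameter, not one fitted to the DEM data of this study:

```python
import math

def accumulated_deflection(y1, n_cycles, t_rate=0.17):
    """Illustrative logarithmic accumulation law for cyclic lateral loading:
    y_N = y_1 * (1 + t * ln N). y1 is the first-cycle deflection; t_rate is a
    hypothetical degradation parameter that, in such laws, depends on the
    cyclic load magnitude and direction characteristics."""
    return y1 * (1.0 + t_rate * math.log(n_cycles))
```

A non-dimensional framework like the one in the paper would express y1 and the load amplitude in normalised form before fitting such a parameter.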

Keywords: cyclic loading, DEM, numerical modelling, sands

Procedia PDF Downloads 323
26165 Novel Nickel Complex Compound Reactivates the Apoptotic Network, Cell Cycle Arrest and Cytoskeletal Rearrangement in Human Colon and Breast Cancer Cells

Authors: Nima Samie, Batoul Sadat Haerian, Sekaran Muniandy, M. S. Kanthimathi

Abstract:

Colon and breast cancers are categorized as among the most prevalent types of cancer worldwide. Recently, the broad clinical application of metal complex compounds has led to the discovery of potential therapeutic drugs. The aim of this study was to evaluate the cytotoxic action of a selected nickel complex compound (NCC) against human colon and breast cancer cells. In this context, we determined the potency of the compound in the induction of apoptosis, cell cycle arrest, and cytoskeleton rearrangement. The HT-29, WiDr, CCD-18Co, MCF-7 and Hs 190.T cell lines were used to determine the IC50 of the compound using the MTT assay. Analysis of apoptosis was carried out using immunofluorescence, acridine orange/propidium iodide double staining, the Annexin-V-FITC assay, evaluation of the translocation of NF-kB, oxygen radical antioxidant capacity, quenching of reactive oxygen species content, measurement of LDH release, caspase-3/-7, -8 and -9 assays, and western blotting. Cell cycle arrest was examined using flow cytometry and gene expression was assessed using a qPCR array. Results showed that the nickel complex compound displayed a potent suppressive effect on HT-29, WiDr, MCF-7 and Hs 190.T cells after 24 h of treatment, with IC50 values of 2.02±0.54, 2.13±0.65, 3.76±0.15 and 3.14±0.45 µM, respectively. The cytotoxic effect on normal cells was insignificant. A drop in the mitochondrial membrane potential and an increased release of cytochrome c from the mitochondria indicated induction of the intrinsic apoptosis pathway by the nickel complex compound. Activation of this pathway was further evidenced by significant activation of caspases 9 and 3/7. The NCC was also shown to activate the extrinsic pathway of apoptosis through activation of caspase-8, which is linked to the suppression of NF-kB translocation to the nucleus. Cell cycle arrest in the G1 phase and up-regulation of glutathione reductase, driven by excessive ROS production, were also observed. 
The results of this study suggest that the nickel complex compound is a potent anti-cancer agent, inducing both the intrinsic and the extrinsic apoptosis pathways as well as cell cycle arrest in colon and breast cancer cells.

Keywords: nickel complex, apoptosis, cytoskeletal rearrangement, colon cancer, breast cancer

Procedia PDF Downloads 317
26164 Estimation of Desktop E-Wastes in Delhi Using Multivariate Flow Analysis

Authors: Sumay Bhojwani, Ashutosh Chandra, Mamita Devaburman, Akriti Bhogal

Abstract:

This article uses material flow analysis to estimate desktop e-waste in the Delhi/NCR region. The material flow analysis is based on sales data obtained from various sources. Much of the available sales data is unreliable because of the existence of a huge informal sector, which in India accounts for more than 90% of e-waste handling. Therefore, the scope of this study is limited to the formal sector. Also, for projecting the sales data up to 2030, we have used linear regression to avoid complexity. The actual sales in the years following 2015 may vary non-linearly, but we have assumed a basic linear relation. The purpose of this study was to estimate the approximate quantity of desktop e-waste that we will have by the year 2030, so that we can start preparing for the ineluctable investment in the treatment of this ever-rising e-waste. The results of this study can be used in planning the installation of a treatment plant for e-waste in Delhi.
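
The projection step can be sketched as an ordinary least-squares line fitted to yearly sales and extrapolated to 2030; the sales figures below are hypothetical placeholders, not the study's data:

```python
def linear_fit(years, sales):
    """Ordinary least-squares line fit: returns (slope, intercept)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(sales) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(years, sales)) / \
            sum((x - mx) ** 2 for x in years)
    return slope, my - slope * mx

# hypothetical formal-sector desktop sales (thousand units per year)
years = [2010, 2011, 2012, 2013, 2014, 2015]
sales = [310, 335, 352, 378, 405, 422]
slope, intercept = linear_fit(years, sales)
projected_2030 = slope * 2030 + intercept
```

The projected sales figure would then feed the material flow model together with an assumed desktop lifetime distribution to give the e-waste quantity.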

Keywords: e-wastes, Delhi, desktops, estimation

Procedia PDF Downloads 262
26163 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework

Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi

Abstract:

There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching this huge database for videos on required topics is a challenging task; therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of lecture video data is huge, it is very inefficient to do the processing in a centralized computation framework; hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection, to offer a visual guideline for navigating the video content. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to the key-frames. The OCR output and the detected slide text line types are used for keyword extraction, by which both video-level and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process for a large database can be improved by using distributed computing on the Hadoop framework.
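
The keyword-extraction step maps naturally onto the MapReduce model; the sketch below simulates a mapper/reducer pair over hypothetical OCR output in plain Python (a real deployment would run equivalent logic as Hadoop jobs):

```python
from collections import Counter
from itertools import chain

STOPWORDS = {"the", "a", "of", "and", "in", "to", "is"}

def map_phase(ocr_line):
    """Mapper: emit (keyword, 1) pairs from one OCR'd slide text line."""
    return [(w, 1) for w in ocr_line.lower().split()
            if w.isalpha() and w not in STOPWORDS]

def reduce_phase(pairs):
    """Reducer: sum counts per keyword across all mapper outputs."""
    counts = Counter()
    for word, c in pairs:
        counts[word] += c
    return counts

# hypothetical OCR output from three key-frames of one lecture segment
frames = [
    "Introduction to Hadoop MapReduce",
    "MapReduce programming model",
    "Hadoop distributed file system",
]
keywords = reduce_phase(chain.from_iterable(map_phase(f) for f in frames))
```

Segment-level keywords would come from reducing over one segment's key-frames, and video-level keywords from reducing over all segments.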

Keywords: video lectures, big video data, video retrieval, hadoop

Procedia PDF Downloads 538
26162 A Critical Analysis of Environmental Investment in India

Authors: K. Y. Chen, H. Chua, C. W. Kan

Abstract:

Environmental investment is an important issue in many countries. In this study, we first review the environmental issues related to India and their effect on economic development. Secondly, economic data are collected from yearly government statistics; these statistics also include information on India's environmental investment. Finally, we correlate the data in order to find the relationship between environmental investment and sustainable development in India. Therefore, in this paper, we aim to analyse the effect of environmental investment on sustainable development in India. Based on the economic data collected, India is a developing country with fast population and GDP growth, and it is facing environmental problems due to its high-speed development. However, environmental investment can have a positive impact on sustainable development in India, and it has been growing at the same rate as GDP. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for the financial support of this work.
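
The claimed relation can be checked by comparing the compound annual growth rates of the two series; the figures below are hypothetical index values (base year = 100), not the collected statistics:

```python
def cagr(series):
    """Compound annual growth rate of a yearly series."""
    n = len(series)
    return (series[-1] / series[0]) ** (1.0 / (n - 1)) - 1.0

# hypothetical yearly figures in index form
gdp = [100, 107.5, 115.6, 124.3, 133.6]
env_investment = [100, 107.0, 116.1, 124.0, 134.2]
gdp_growth = cagr(gdp)
inv_growth = cagr(env_investment)
```

Growth rates within a fraction of a percentage point of each other would support the statement that environmental investment is keeping pace with GDP.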

Keywords: India, environmental investment, sustainable development, analysis

Procedia PDF Downloads 319
26161 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis

Authors: R. Periyasamy, Deepak Joshi, Sneh Anand

Abstract:

Foot posture assessment is important for evaluating foot type, since abnormal foot posture can cause gait and postural defects in all age groups. Although different methods are used for the classification of foot arch type in clinical/research examinations, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for the evaluation of foot type, as a clinical decision-making aid for the diagnosis of flat and normal arches, based on the Arch Index (AI) and a foot pressure distribution parameter, the Power Ratio (PR). The accuracy of the system was evaluated on 27 subjects with ages ranging from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from foot pressure intensity images using the portable PedoPowerGraph system, and the images were analysed in the frequency domain to obtain the foot pressure distribution parameter PR. From our results, we obtained 100% classification accuracy for normal and flat feet by using the linear discriminant analysis method. We observed no misclassification of foot types, because foot pressure distribution data were incorporated rather than the arch index (AI) alone. We found that the mid-foot pressure distribution ratio and the arch index (AI) value correlate well with foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system is an accurate and easy way to determine foot arch type from the arch index (AI), incorporating the mid-foot pressure distribution ratio rather than only the physical area of contact. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check their diagnosis of flat foot from the mid-foot pressure distribution.
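
A two-class Fisher LDA over the two features (AI, PR) can be sketched as follows; the sample values are hypothetical, chosen only to illustrate the separation of normal and flat arches:

```python
def lda_train(class0, class1):
    """Fisher LDA for two classes of 2-D samples [(AI, PR), ...].
    Returns weight vector w and threshold c; classify as class 1 if w.x > c."""
    def mean(pts):
        n = len(pts)
        return [sum(p[i] for p in pts) / n for i in (0, 1)]
    def scatter(pts, m):
        s = [[0.0, 0.0], [0.0, 0.0]]
        for p in pts:
            d = [p[0] - m[0], p[1] - m[1]]
            for i in (0, 1):
                for j in (0, 1):
                    s[i][j] += d[i] * d[j]
        return s
    m0, m1 = mean(class0), mean(class1)
    sw = scatter(class0, m0)
    s1 = scatter(class1, m1)
    for i in (0, 1):
        for j in (0, 1):
            sw[i][j] += s1[i][j]
    # w = Sw^-1 (m1 - m0), via 2x2 inverse; threshold at the midpoint of the means
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [(sw[1][1] * dm[0] - sw[0][1] * dm[1]) / det,
         (sw[0][0] * dm[1] - sw[1][0] * dm[0]) / det]
    mid = [(m0[i] + m1[i]) / 2 for i in (0, 1)]
    return w, w[0] * mid[0] + w[1] * mid[1]

def lda_predict(w, c, x):
    """0 = normal arch, 1 = flat arch."""
    return 1 if w[0] * x[0] + w[1] * x[1] > c else 0

normal = [(0.20, 0.30), (0.24, 0.28), (0.22, 0.33)]   # hypothetical (AI, PR)
flat = [(0.30, 0.50), (0.34, 0.48), (0.32, 0.53)]
w, c = lda_train(normal, flat)
```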

Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis

Procedia PDF Downloads 503