Search results for: compressed exponential
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 560

350 Electricity Demand Modeling and Forecasting in Singapore

Authors: Xian Li, Qing-Guo Wang, Jiangshuai Huang, Jidong Liu, Ming Yu, Tan Kok Poh

Abstract:

In the power industry, accurate electricity demand forecasting over a given lead time is important for system operation and control. In this paper, we investigate the modeling and forecasting of Singapore's electricity demand. Several standard models, such as the HWT exponential smoothing model, the ARMA model and ANN models, have been built from historical demand data. We applied them to the Singapore electricity market and proposed three refinements, based on simulation, to improve modeling accuracy. Compared with existing models, our refined model produces better forecasting accuracy. The simulations demonstrate that feeding the forecasting error back into the forecasting equation improves modeling accuracy considerably.
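
A rough sketch of the error-feedback idea (the smoothing constant, feedback weight and demand values are assumed for illustration and are not the paper's model):

```python
import numpy as np

def ses_with_error_feedback(y, alpha=0.3, phi=0.5):
    """Simple exponential smoothing with an error-feedback term:
    the next forecast is corrected by a fraction (phi) of the
    latest forecast error. alpha and phi are assumed values."""
    level = y[0]
    err = 0.0
    forecasts = [level]
    for t in range(1, len(y)):
        f = level + phi * err          # forecast corrected by last error
        err = y[t] - f                 # new one-step forecast error
        level = alpha * y[t] + (1 - alpha) * level
        forecasts.append(f)
    return np.array(forecasts)

# toy half-hourly demand series (MW), purely illustrative
demand = np.array([5200, 5310, 5290, 5405, 5380, 5500, 5470])
print(ses_with_error_feedback(demand))
```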

Keywords: power industry, electricity demand, modeling, forecasting

Procedia PDF Downloads 619
349 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm, applied to high-frequency coefficients for compression/encoding. The algorithm starts by converting every three coefficients into a single value, based on three different keys. Decoding/decompression uses a search method called the Quick Sequential Search (QSS) Decoding Algorithm, presented in this research, which is based on sequential search and recovers the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The idea behind the auxiliary array is to save all possible decoded coefficients, so that another algorithm, such as a conventional sequential search, could retrieve the encoded/compressed data independently of the proposed algorithm. The experimental results show that the proposed decoding algorithm retrieves the original data faster than conventional sequential search algorithms.
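
A toy illustration of the general idea, with assumed keys and an assumed coefficient range (the paper's key generation and the auxiliary-array pruning are not reproduced here): three coefficients are folded into one value, and a sequential search recovers them.

```python
import itertools

# Hypothetical keys, chosen so distinct coefficient triples map to
# distinct encoded values within the assumed range below.
K1, K2, K3 = 1, 37, 1231
COEFF_RANGE = range(-8, 9)   # assumed bound on high-frequency coefficients

def encode(triple):
    a, b, c = triple
    return K1 * a + K2 * b + K3 * c

def qss_decode(value):
    """Sequential search over candidate triples until the encoded value
    is matched (a simplified stand-in for the QSS decoder, which prunes
    this search using the auxiliary array)."""
    for a, b, c in itertools.product(COEFF_RANGE, repeat=3):
        if encode((a, b, c)) == value:
            return (a, b, c)
    raise ValueError("no matching triple found")

coded = encode((3, -2, 5))
print(qss_decode(coded))   # -> (3, -2, 5)
```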

Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT

Procedia PDF Downloads 125
348 A Review on Big Data Movement with Different Approaches

Authors: Nay Myo Sandar

Abstract:

With the growth of technologies and applications, a large amount of data is being produced at an increasing rate by various sources such as social media networks, sensor devices, and other information-serving devices. This massive, complex and exponentially growing collection of data is called big data. Traditional database systems cannot store and process such data because of its volume and complexity. Consequently, cloud computing is a potential solution for data storage and processing, since it can provide a pool of server and storage resources. However, moving large amounts of data to and from the cloud is a challenging issue, since it can incur high latency due to the data size. With respect to the big data movement problem, this paper reviews the literature of previous works, discusses research issues, and identifies approaches for dealing with the problem.

Keywords: big data, cloud computing, big data movement, network techniques

Procedia PDF Downloads 60
347 Filling the Gap of Extraction of Digital Evidence from Emerging Platforms Without Forensics Tools

Authors: Yi Anson Lam, Siu Ming Yiu, Kam Pui Chow

Abstract:

Digital evidence has been tendered to courts at an exponential rate in recent years. As an industrial practice, most digital evidence is extracted and preserved using specialized and well-accepted forensics tools. On the other hand, advances in technology have enabled the creation of quite a few emerging platforms such as Telegram, Signal, etc. Existing (well-accepted) forensics tools were not designed to extract evidence from these emerging platforms. Since new forensics tools require a significant amount of time and effort to be developed and verified, this paper addresses how to fill this gap using quick-fix alternative methods for digital evidence collection (e.g., based on APIs provided by the apps) and discusses issues related to the admissibility of this evidence in court, with support from international courts' stances and the circumstances under which digital evidence collected using these proposed alternatives has been accepted.
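
A hypothetical sketch of such an API-based quick-fix collection, using the third-party Telethon library for Telegram (credentials, chat identifier and record fields are placeholders, not from the paper; hashing each record illustrates one way to support integrity claims in court):

```python
import hashlib, json, asyncio
from telethon import TelegramClient  # third-party: pip install telethon

API_ID, API_HASH = 12345, "0123456789abcdef"   # placeholder credentials
CHAT = "target_chat"                            # hypothetical chat identifier

async def collect(chat):
    async with TelegramClient("forensics_session", API_ID, API_HASH) as client:
        records = []
        async for msg in client.iter_messages(chat):
            record = {"id": msg.id, "date": str(msg.date), "text": msg.text}
            # hash each record so later tampering is detectable
            record["sha256"] = hashlib.sha256(
                json.dumps(record, sort_keys=True).encode()).hexdigest()
            records.append(record)
        return records

print(asyncio.run(collect(CHAT)))
```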

Keywords: extraction, digital evidence, laws, investigation

Procedia PDF Downloads 45
346 Selection of Variogram Model for Environmental Variables

Authors: Sheikh Samsuzzhan Alam

Abstract:

The present study investigates the selection of a variogram model when analyzing the spatial variation of environmental variables with a trend. Sometimes the auto-fitted theoretical variogram does not really capture the true nature of the empirical semivariogram, so proper exploration and analysis are needed to select the best variogram model. For this study, open-source data collected from the California Soil Resource Lab are used to illustrate the problems that arise when fitting a theoretical variogram. The five most commonly used variogram models, Linear, Gaussian, Exponential, Matern, and Spherical, were fitted to the experimental semivariogram. Ordinary kriging was used to evaluate the accuracy of the selected variograms through cross-validation. This study is beneficial for selecting an appropriate theoretical variogram model for environmental variables.
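
A minimal sketch of such a comparison, using the PyKrige library and leave-one-out cross-validation on synthetic data (PyKrige does not ship a Matern model, so only four of the five models are compared; data and settings are illustrative):

```python
import numpy as np
from pykrige.ok import OrdinaryKriging  # pip install pykrige

rng = np.random.default_rng(0)
x, y = rng.uniform(0, 100, 60), rng.uniform(0, 100, 60)
z = np.sin(x / 15.0) + 0.1 * rng.standard_normal(60)  # synthetic variable

for model in ["linear", "gaussian", "exponential", "spherical"]:
    errors = []
    for i in range(len(z)):           # leave-one-out cross-validation
        mask = np.arange(len(z)) != i
        ok = OrdinaryKriging(x[mask], y[mask], z[mask], variogram_model=model)
        zhat, _ = ok.execute("points", np.array([x[i]]), np.array([y[i]]))
        errors.append((zhat[0] - z[i]) ** 2)
    print(f"{model:12s} LOOCV RMSE = {np.sqrt(np.mean(errors)):.4f}")
```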

Keywords: anisotropy, cross-validation, environmental variables, kriging, variogram models

Procedia PDF Downloads 309
345 A Bayesian Model with Improved Prior in Extreme Value Problems

Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro

Abstract:

In Extreme Value Theory, the parameters of the distribution are estimated from only a small part of the observed values. When block maxima are taken, much of the data is discarded. We developed a new Bayesian inference model that uses all the information provided by the data, introducing informative priors and exploiting the relations between the baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations where the data do not follow pure distributions because of perturbations (noise).
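
A small simulation of the baseline-limit relation that such priors can exploit, for an exponential baseline (rate and block size are assumed): the block maxima of an Exp(λ) sample approach a Gumbel law with location ln(n)/λ and scale 1/λ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lam, n_block, n_max = 2.0, 365, 2000    # assumed baseline rate and block size

# block maxima of an Exp(lam) baseline
maxima = rng.exponential(1 / lam, size=(n_max, n_block)).max(axis=1)

# limit relations for an exponential baseline:
#   location mu_n = ln(n)/lam,  scale beta = 1/lam
mu_theory, beta_theory = np.log(n_block) / lam, 1 / lam
mu_fit, beta_fit = stats.gumbel_r.fit(maxima)

print(f"location: theory {mu_theory:.3f} vs fit {mu_fit:.3f}")
print(f"scale   : theory {beta_theory:.3f} vs fit {beta_fit:.3f}")
# In a Bayesian model, such relations can centre informative priors
# for the Gumbel parameters on functions of the baseline parameters.
```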

Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior

Procedia PDF Downloads 172
344 Protocol for Dynamic Load Distributed Low Latency Web-Based Augmented Reality and Virtual Reality

Authors: Rohit T. P., Sahil Athrij, Sasi Gopalan

Abstract:

Currently, the content entertainment industry is dominated by mobile devices. As trends slowly shift towards Augmented/Virtual Reality applications, the computational demands on these devices are increasing exponentially, and we are already reaching the limits of hardware optimization. This paper proposes a software solution to this problem. By leveraging the capabilities of cloud computing, we can offload work from mobile devices to dedicated rendering servers that are far more powerful. But this introduces the problem of latency. This paper introduces a protocol that can achieve a high-performance, low-latency Augmented/Virtual Reality experience. There are two parts to the protocol.

1) In-flight compression. The main cause of latency in the system is the time required to transmit the camera frame from client to server. The round-trip time is directly proportional to the amount of data transmitted, so it can be reduced by compressing the frames before sending. Standard compression algorithms like JPEG yield only minor size reductions. Since the images to be compressed are consecutive camera frames, there will not be many changes between two consecutive images, so inter-frame compression is preferred. Inter-frame compression can be implemented efficiently using WebGL, but WebGL implementations limit the precision of floating-point numbers to 16 bits on most devices. This can introduce noise into the image due to rounding errors, which accumulate over time. This can be solved using an improved inter-frame compression algorithm. The algorithm detects changes between frames and reuses unchanged pixels from the previous frame, eliminating the need for floating-point subtraction and thereby cutting down on noise. Change detection is also improved drastically by taking the weighted average difference of pixels instead of the absolute difference; the kernel weights for this comparison can be fine-tuned to match the type of image being compressed.

2) Dynamic load distribution. Conventional cloud computing architectures work by offloading as much work as possible to the servers, but this approach takes a toll on bandwidth and server costs. The optimal solution is obtained when the device utilizes 100% of its resources and the rest is done by the server. The protocol balances the load between the server and the client by doing a fraction of the computing on the device, depending on the power of the device and the network conditions, and is responsible for dynamically partitioning the tasks. Special flags are used to communicate the workload fraction between the client and the server and are updated at a constant interval of time (or frames). The whole protocol is designed to be client agnostic. Flags are available to the client for resetting the frame, indicating latency, switching mode, etc. The server can react to client-side changes on the fly and adapt accordingly by switching to different pipelines. The server is designed to spread the load effectively and thereby scale horizontally; this is achieved by isolating client connections into different processes.
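
A minimal sketch of the weighted-average change detection described above (kernel weights and threshold are assumed values, to be tuned per image type):

```python
import numpy as np

def changed_mask(prev, curr, threshold=8.0):
    """Weighted-average frame difference over a 3x3 neighbourhood;
    pixels below the threshold are treated as unchanged and can be
    reused from the previous frame."""
    kernel = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]], float)
    kernel /= kernel.sum()
    diff = np.abs(curr.astype(float) - prev.astype(float))
    padded = np.pad(diff, 1, mode="edge")
    h, w = diff.shape
    # correlate the difference image with the 3x3 weight kernel
    wavg = sum(kernel[i, j] * padded[i:i + h, j:j + w]
               for i in range(3) for j in range(3))
    return wavg > threshold

prev = np.zeros((4, 6), np.uint8)
curr = prev.copy(); curr[1:3, 2:4] = 200     # a small moving patch
print(changed_mask(prev, curr).astype(int))
```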

Keywords: 2D kernelling, augmented reality, cloud computing, dynamic load distribution, immersive experience, mobile computing, motion tracking, protocols, real-time systems, web-based augmented reality application

Procedia PDF Downloads 54
343 Mobility and Effective Regulatory Policies in the 21st Century Transport Sector

Authors: Pedro Paulino

Abstract:

The majority of the world's population already lives in urban areas, and the urban population is expected to keep increasing in the coming decades. This exponential increase in urban population carries with it obvious mobility problems. Not only is a new paradigm in the transport sector needed to address these problems; effective regulatory policies are needed as well, to ensure the quality of services, passenger rights, competition between operators and the consistency of the entire mobility ecosystem. The purpose of this paper is to present the problems the world faces in this sector and to contribute to their solution. Indeed, our study concludes that only through active supervision of the markets and monitoring of the various operators will it be possible to develop a sustainable and efficient transport system that meets the needs of a changing world.

Keywords: mobility, regulation policies, sanctioning powers, sustainable transport

Procedia PDF Downloads 283
342 A Simulation Study of Direct Injection Compressed Natural Gas Spark Ignition Engine Performance Utilizing Turbulent Jet Ignition with Controlled Air Charge

Authors: Siyamak Ziyaei, Siti Khalijah Mazlan, Petros Lappas

Abstract:

Compressed Natural Gas (CNG) consists mainly of methane (CH₄) and has a low carbon-to-hydrogen ratio relative to other hydrocarbons. As a result, it has the potential to reduce CO₂ emissions by more than 20% relative to conventional fuels like diesel or gasoline. Although Natural Gas (NG) has environmental advantages compared to other hydrocarbon fuels, whether gaseous or liquid, its main component, CH₄, burns at a slower rate than conventional fuels. A higher pressure and a leaner cylinder environment accentuate the slow-burn characteristic of CH₄. Lean combustion and high compression ratios are well-known methods for increasing the efficiency of internal combustion engines. In order to achieve successful CNG lean combustion in Spark Ignition (SI) engines, a strong ignition system is essential to avoid engine misfires, especially in ultra-lean conditions. Turbulent Jet Ignition (TJI) is an ignition system that employs a pre-combustion chamber to ignite the lean fuel mixture in the main combustion chamber using a fraction of the total fuel per cycle. TJI enables ultra-lean combustion by providing distributed ignition sites through orifices. The fast burn rate provided by TJI makes the ordinary SI engine comparable to other combustion systems, such as Homogeneous Charge Compression Ignition (HCCI) or Controlled Auto-Ignition (CAI), in terms of thermal efficiency, through increased levels of dilution without the need for sophisticated control systems. Due to the physical geometry of TJIs, which contain small orifices connecting the pre-chamber to the main chamber, scavenging is one of the main factors that reduce TJI performance. Specifically, providing the right mixture of fuel and air has been identified as a key challenge; the reason is the insufficient amount of air pushed into the pre-chamber during each compression stroke. There is also the problem that residual combustion gases such as CO₂, CO and NOx from the previous combustion cycle dilute the pre-chamber fuel-air mixture, preventing rapid combustion in the pre-chamber. An air-controlled active TJI is presented in this paper in order to address these issues. By supplying air to the pre-chamber at sufficient pressure, residual gases are exhausted and the air-fuel ratio is controlled within the pre-chamber, thereby improving the quality of combustion. This paper investigates the 3D-simulated combustion characteristics of a Direct Injection (DI-CNG) fuelled SI engine with a pre-chamber equipped with an air channel, using AVL FIRE software. Experiments and simulations were performed at the Worldwide Mapping Point (WWMP) of 1500 Revolutions Per Minute (RPM) and 3.3 bar Indicated Mean Effective Pressure (IMEP), using only conventional spark plugs as the baseline. After validating the simulation data, baseline engine conditions were set for all simulation scenarios at λ=1. Following that, pre-chambers with and without an auxiliary fuel supply were simulated. In the simulated DI-CNG SI engine, active TJI was observed to perform better than passive TJI and the spark plug. In conclusion, the active pre-chamber with an air channel demonstrated improved thermal efficiency (ηth) over its counterparts and conventional spark ignition systems.

Keywords: turbulent jet ignition, active air control turbulent jet ignition, pre-chamber ignition system, active and passive pre-chamber, thermal efficiency, methane combustion, internal combustion engine combustion emissions

Procedia PDF Downloads 70
341 Learning Curve Effect on Materials Procurement Schedule of Multiple Sister Ships

Authors: Vijaya Dixit, Aasheesh Dixit

Abstract:

The shipbuilding industry operates in an Engineer-Procure-Construct (EPC) context. The product mix of a shipyard comprises various types of ships, such as bulk carriers, tankers, barges, coast guard vessels and submarines. Each order is unique, based on the type of ship and customized requirements, which are engineered into the product right from the design stage. Thus, to execute every new project, a shipyard needs to upgrade its production expertise. As a result, over the long run, holistic learning occurs across different types of projects, which contributes to the knowledge base of the shipyard. Simultaneously, in the short term, during execution of a project comprising multiple sister ships, repetition of similar tasks leads to learning at the activity level. This research aims to capture both kinds of learning and incorporate the learning curve effect in project scheduling and materials procurement to improve project performance. Extant literature supports the existence of such learning in organizations. In shipbuilding, there are sequences of similar activities that can be expected to exhibit learning curve behavior, for example, the nearly identical structural sub-blocks that are successively fabricated, erected, and outfitted with piping and electrical systems. A learning curve representation can model not only a decrease in the mean completion time of an activity, but also a decrease in the uncertainty of the activity duration. Sister ships have similar material requirements, and the same supplier base supplies materials for all the sister ships within a project. On one hand, this provides an opportunity to reduce transportation cost by batching the order quantities of multiple ships. On the other hand, it increases the inventory holding cost at the shipyard and the risk of obsolescence. Further, due to the learning curve effect, the production schedule of each subsequent ship gets compressed, so the material requirement schedule of every next ship differs from that of its predecessor. As more and more ships get constructed, compressed production schedules increase the possibility of batching the orders of sister ships. This work integrates materials management with the project scheduling of long-duration projects for the manufacture of multiple sister ships. It incorporates the learning curve effect on progressively compressed material requirement schedules and addresses the above trade-off between transportation cost and inventory holding and shortage costs, while satisfying the budget constraints of the various stages of the project. The activity durations and item lead times are not crisp, but are available as probability distributions. A Stochastic Mixed Integer Programming (SMIP) model is formulated and solved using an evolutionary algorithm. Its output provides ordering dates and the degree of order batching for all types of items. A sensitivity analysis determines the threshold number of sister ships required in a project to leverage the learning curve effect in materials management decisions. This analysis will help materials managers gain insight into when, and to what degree, it is beneficial to treat a multiple-ship project as an integrated one by batching order quantities, and when to practice distinct procurement for individual ships.
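
A toy sketch of how a learning curve compresses successive ships' schedules and pulls material order dates earlier (Wright's power-law curve with an assumed 90% learning rate and an assumed lead time; not the paper's SMIP model):

```python
import numpy as np

def activity_durations(t_first, n_ships, learning_rate=0.9):
    """Wright's learning curve: the k-th repetition takes
    t_first * k**b days, with b = log2(learning_rate)."""
    k = np.arange(1, n_ships + 1)
    return t_first * k ** np.log2(learning_rate)

durations = activity_durations(t_first=120.0, n_ships=5)   # days
starts = np.concatenate([[0.0], np.cumsum(durations)[:-1]])
lead_time = 30.0                 # assumed procurement lead time, days
order_dates = starts - lead_time # materials must be ordered before each start
for k, (d, o) in enumerate(zip(durations, order_dates), 1):
    print(f"ship {k}: duration {d:6.1f} d, order materials at day {o:7.1f}")
```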

Keywords: learning curve, materials management, shipbuilding, sister ships

Procedia PDF Downloads 482
340 Modelling of Hydric Behaviour of Textiles

Authors: A. Marolleau, F. Salaun, D. Dupont, H. Gidik, S. Ducept.

Abstract:

The goal of this study is to analyze the hydric behaviour of textiles, which can significantly impact the comfort of the wearer. Indeed, fabrics can be adapted to different climates if their hydric and thermal behaviours are known. In this study, fabrics are submitted to hydric variations only. Sorption and desorption isotherms obtained from a dynamic vapour sorption (DVS) apparatus are fitted with the parallel exponential kinetics (PEK), Hailwood-Horrobin (HH) and Brunauer-Emmett-Teller (BET) models. One of the major findings is the relationship existing between the PEK and HH models: during the slow and fast processes, the sorption of water molecules on the polymer can take monolayer and multilayer forms. According to the BET model, moisture regain, a physical property of textiles, shows a linear correlation with the total amount of water taken up in the monolayer. This study provides information on the potential end uses of these fabrics according to the selected activity level.
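
A short sketch of fitting the PEK model, which sums a fast and a slow first-order exponential process, to synthetic DVS-style uptake data (all parameter values here are illustrative):

```python
import numpy as np
from scipy.optimize import curve_fit

def pek(t, mc_fast, t_fast, mc_slow, t_slow):
    """Parallel exponential kinetics: a fast and a slow first-order
    sorption process acting in parallel."""
    return (mc_fast * (1 - np.exp(-t / t_fast))
            + mc_slow * (1 - np.exp(-t / t_slow)))

# synthetic moisture-uptake data (% mass gain vs minutes)
t = np.linspace(0, 120, 40)
rng = np.random.default_rng(2)
y = pek(t, 2.0, 4.0, 1.2, 45.0) + 0.02 * rng.standard_normal(t.size)

popt, _ = curve_fit(pek, t, y, p0=[1.0, 5.0, 1.0, 30.0])
print("mc_fast, t_fast, mc_slow, t_slow =", np.round(popt, 2))
```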

Keywords: comfort, hydric properties, modelling, underwears

Procedia PDF Downloads 129
339 The Convolution Recurrent Network of Using Residual LSTM to Process the Output of the Downsampling for Monaural Speech Enhancement

Authors: Shibo Wei, Ting Jiang

Abstract:

Convolutional recurrent neural networks (CRN) have recently achieved much success in the speech enhancement field. The common approach is to use convolution layers to compress the feature space by successive downsampling and then model the compressed features with LSTM layers; finally, the enhanced speech is obtained by deconvolution operations that integrate the global information of the speech sequence. However, the feature space compression process may cause a loss of information, so we propose to model the output of each downsampling step with a residual LSTM layer, then join it with the output of the corresponding deconvolution layer and feed both into the next deconvolution layer. In this way, we aim to integrate the global information of the speech sequence better. The experimental results show that the proposed network model (RES-CRN) achieves better performance than the original CRN, both without residual connections and with plain stacked LSTMs, in terms of scale-invariant signal-to-noise ratio (SI-SNR), speech quality (PESQ), and intelligibility (STOI).
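
A minimal PyTorch sketch of the skip path, as a stand-in for the authors' exact architecture: a downsampling output passes through an LSTM with a residual connection before being concatenated with the matching deconvolution output. Tensor shapes are assumed.

```python
import torch
import torch.nn as nn

class ResidualLSTMSkip(nn.Module):
    """Skip path: LSTM over the encoder output plus a residual add."""
    def __init__(self, feat):
        super().__init__()
        self.lstm = nn.LSTM(feat, feat, batch_first=True)

    def forward(self, x):            # x: (batch, time, feat)
        out, _ = self.lstm(x)
        return x + out               # residual connection

skip = ResidualLSTMSkip(64)
enc_out = torch.randn(2, 100, 64)    # output of one downsampling step
dec_out = torch.randn(2, 100, 64)    # output of the matching deconvolution
joined = torch.cat([skip(enc_out), dec_out], dim=-1)  # fed to next layer
print(joined.shape)                  # torch.Size([2, 100, 128])
```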

Keywords: convolutional-recurrent neural networks, speech enhancement, residual LSTM, SI-SNR

Procedia PDF Downloads 175
338 Analysis and Modeling of Stresses and Creeps Resulting from Soil Mechanics in Southern Plains of Kerman Province

Authors: Kourosh Nazarian

Abstract:

Many engineering materials, such as metals, exhibit at least some range of linear behavior: if the stresses are doubled, the deformations are also doubled. In other words, these materials have linear elastic properties. Soils do not follow this law; for example, when compressed, soils become gradually stiffer. At the ground surface, sand can easily be deformed with a finger, but under high compressive stresses it gains considerable hardness and strength, mainly due to the increase in the forces between the separate particles. Creep also deforms soils under a constant load over time; clay and peat soils show creep behavior. As a result of this phenomenon, structures constructed on such soils continue to settle over time. In this paper, the researchers analyzed and modeled the stresses and creep in the southern plains of Kerman province in Iran through library-documentary, quantitative and software techniques, together with a field survey. The modeling results showed that these plains have experienced severe stresses, with a collapse of about 26 cm over the last 15 years, and creep evidence was also discovered in an area with a gradient of 3-6 degrees.

Keywords: stress, creep, Faryab, surface runoff

Procedia PDF Downloads 162
337 Electron Beam Processing of Ethylene-Propylene-Terpolymer-Based Rubber Mixtures

Authors: M. D. Stelescu, E. Manaila, G. Craciun, D. Ighigeanu

Abstract:

The goal of this paper is to present results on the influence of irradiation dose and the amount of the multifunctional monomer trimethylolpropane trimethacrylate (TMPT) on ethylene-propylene-diene terpolymer rubber (EPDM) mixtures irradiated by electron beam. Blends, molded on an electrically heated laboratory roller mill and compressed in an electrically heated hydraulic press, were irradiated using the ALID 7 linear accelerator of 5.5 MeV, in the dose range of 22.6 kGy to 56.5 kGy, under atmospheric conditions at a room temperature of 25 °C. The shares of the cross-linking and degradation reactions were evaluated by means of sol-gel analysis, cross-linking density measurements, FTIR studies and calculation of the Charlesby-Pinner parameter (p0/q0). The blends containing different concentrations of TMPT (3 phr and 9 phr) and irradiated with doses in this range showed increasing gel content and cross-linking density. Modified and new bands appeared in the FTIR spectra because of both cross-linking and chain scission reactions.
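
The Charlesby-Pinner parameter comes from the relation s + √s = p0/q0 + 1/(q0·u1·D), where s is the sol fraction and D the dose, so a linear fit of s + √s against 1/D yields p0/q0 as the intercept. A sketch with illustrative sol-fraction values (not the paper's data):

```python
import numpy as np

# illustrative sol-fraction data vs absorbed dose (kGy)
dose = np.array([22.6, 33.9, 45.2, 56.5])
sol = np.array([0.28, 0.21, 0.17, 0.145])

# Charlesby-Pinner: s + sqrt(s) = p0/q0 + 1/(q0 * u1 * D)
y = sol + np.sqrt(sol)
slope, intercept = np.polyfit(1.0 / dose, y, 1)
print(f"p0/q0 (scission/cross-linking ratio) ~ {intercept:.3f}")
```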

Keywords: electron beam irradiation, EPDM rubber, crosslinking density, gel fraction

Procedia PDF Downloads 134
336 Protecting Privacy and Data Security in Online Business

Authors: Bilquis Ferdousi

Abstract:

With the exponential growth of online business, the threat to consumers' privacy and data security has become a serious challenge. This literature-review-based study focuses on better understanding those threats and the legislative measures that have been taken to address them. Research shows that people are increasingly involved in online business using different digital devices and platforms, although this practice varies across age groups. The threat to consumers' privacy and data security is a serious hindrance to developing trust among consumers in online business. Some legislative measures have been taken at the federal and state levels to protect consumers' privacy and data security. The study is based on an extensive review of the current literature on protecting consumers' privacy and data security and on the legislative measures that have been taken.

Keywords: privacy, data security, legislation, online business

Procedia PDF Downloads 81
335 Stabilizing Effect of Magnetic Field in a Thermally Modulated Porous Layer

Authors: M. Meenasaranya, S. Saravanan

Abstract:

A nonlinear stability analysis is carried out to determine the effect of surface temperature modulation in an infinite horizontal porous layer heated from below. The layer is saturated by an electrically conducting, viscous, incompressible Newtonian fluid. The Brinkman model is used for the momentum equation, and the Boussinesq approximation is invoked. The system is assumed to be bounded by rigid boundaries. The energy method is employed to find the global exponential stability region of the system. The results are analysed for arbitrary values of the modulation frequency and amplitude. The existence of a subcritical instability region is confirmed by comparing the obtained result with the known linear result. The vertical magnetic field is found to stabilize the system.

Keywords: Brinkman model, energy method, magnetic field, surface temperature modulation

Procedia PDF Downloads 372
334 Enhanced Bit Error Rate in Visible Light Communication: A New LED Hexagonal Array Distribution

Authors: Karim Matter, Heba Fayed, Ahmed Abd-Elaziz, Moustafa Hussein

Abstract:

Due to the exponential growth of mobile devices and wireless services, demand for radio frequency spectrum has increased enormously. The presence of several frequencies causes interference between cells, which must be minimized to obtain a lower Bit Error Rate (BER). For this reason, visible light communication (VLC) is of great interest. This paper proposes a VLC system that decreases the BER by applying a new LED distribution with a hexagonal shape, using a Frequency Reuse (FR) concept to mitigate the interference between reused frequencies inside the hexagonal cell. The BER is measured in two scenarios, Line of Sight (LoS) and Non-Line of Sight (NLoS), for each technique used. In the proposed model with Soft Frequency Reuse (SFR), in the LoS case, the obtained BER values at 4, 8, and 10 dB signal-to-noise ratio (SNR) are 3.6×10⁻⁶, 6.03×10⁻¹³, and 2.66×10⁻¹⁸, respectively.
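
For orientation, the generic OOK-over-AWGN relation BER = Q(√SNR) can be evaluated at the same SNR points; the paper's figures come from its own hexagonal-cell channel model, so the numbers differ from this simplified relation:

```python
import numpy as np
from scipy.special import erfc

def q_func(x):
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / np.sqrt(2))

for snr_db in (4, 8, 10):
    snr = 10 ** (snr_db / 10)            # dB -> linear
    print(f"SNR {snr_db:2d} dB -> BER = {q_func(np.sqrt(snr)):.3e}")
```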

Keywords: visible light communication (VLC), field of view (FoV), hexagonal array, frequency reuse

Procedia PDF Downloads 140
333 Improving the Performance of Requisition Document Online System for Royal Thai Army by Using Time Series Model

Authors: D. Prangchumpol

Abstract:

This research presents a method of forecasting requisition document demand for military units, using exponential smoothing methods to analyze the data. The data used in the forecast are actual requisition document data from The Adjutant General Department. The forecasting results show that the Holt-Winters trend and seasonality method with α=0.1, β=0, γ=0 is appropriate for document requisitions. In addition, the researcher has developed an online requisition system to improve the performance of requisition document processing in The Adjutant General Department, while also ensuring that operations can be audited.
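
A minimal sketch of such a Holt-Winters forecast with the smoothing constants fixed at the reported values, using statsmodels on synthetic monthly data (the real requisition data are not public):

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# synthetic monthly requisition counts with trend and yearly seasonality
rng = np.random.default_rng(3)
months = 48
demand = (100 + 2 * np.arange(months)
          + 15 * np.sin(2 * np.pi * np.arange(months) / 12)
          + rng.normal(0, 5, months))

model = ExponentialSmoothing(demand, trend="add", seasonal="add",
                             seasonal_periods=12)
# smoothing constants fixed at the paper's values: alpha=0.1, beta=0, gamma=0
fit = model.fit(smoothing_level=0.1, smoothing_trend=0.0,
                smoothing_seasonal=0.0, optimized=False)
print(fit.forecast(6))
```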

Keywords: requisition, Holt-Winters, time series, Royal Thai Army

Procedia PDF Downloads 286
332 An Efficient Encryption Scheme Using DWT and Arnold Transforms

Authors: Ali Abdrhman M. Ukasha

Abstract:

Data security is needed in data transmission, storage, and communication. The color image is decomposed into red, green, and blue channels. The blue and green channels are compressed using a 3-level discrete wavelet transform. The Arnold transform is used to change the locations of the red channel pixels as an image scrambling step. All of these channels are then encrypted separately using a key image of the same size as the original, generated using private keys and modulo operations. XOR and modulo operations are performed between the encrypted channel images in order to change the image pixel values. The contours of the recovered color image can be extracted with an acceptable level of distortion using the Canny edge detector. Experiments have demonstrated that the proposed algorithm can fully encrypt a 2D color image and completely reconstruct it without any distortion. It is shown that the color image can be protected at a higher security level. The presented method is easy to implement in hardware and is suitable for multimedia protection in real-time applications such as wireless networks and mobile phone services.
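
A sketch of the Arnold scrambling step on a toy red channel. The map (x, y) → (x + y, x + 2y) mod N is a bijection on an N×N grid, so it is invertible; the iteration count plays a key-like role:

```python
import numpy as np

def arnold_scramble(channel, iterations=5):
    """Apply the Arnold cat map (x, y) -> (x + y, x + 2y) mod N
    to a square N x N image channel, `iterations` times."""
    n = channel.shape[0]
    assert channel.shape == (n, n), "square channel required"
    xs, ys = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = channel
    for _ in range(iterations):
        scrambled = np.empty_like(out)
        scrambled[(xs + ys) % n, (xs + 2 * ys) % n] = out
        out = scrambled
    return out

red = np.arange(64, dtype=np.uint8).reshape(8, 8)  # toy red channel
print(arnold_scramble(red, iterations=3))
```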

Keywords: color image, wavelet transform, edge detector, Arnold transform, lossy image encryption

Procedia PDF Downloads 457
331 Queueing Modeling of M/G/1 Fault Tolerant System with Threshold Recovery and Imperfect Coverage

Authors: Madhu Jain, Rakesh Kumar Meena

Abstract:

This paper investigates a finite M/G/1 fault-tolerant multi-component machining system. The system incorporates features such as standby support, threshold recovery and imperfect coverage, which make the study closer to real-time systems. The performance prediction of the M/G/1 fault-tolerant system is carried out using a recursive approach, treating the remaining service time as a supplementary variable. Numerical results are presented to illustrate the computational tractability of the analytical results for three different service time distributions: exponential, 3-stage Erlang and deterministic. Moreover, a cost function is constructed to determine the optimal choice of system descriptors for upgrading the system.
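
For intuition on how the service-time distribution enters such results, the classical (infinite-source) Pollaczek-Khinchine formula Wq = λE[S²]/(2(1−ρ)) can be evaluated for the same three distributions; the paper's finite-population recursive model is more involved, and the rates below are assumed:

```python
# Pollaczek-Khinchine mean waiting time for M/G/1:
#   Wq = lam * E[S^2] / (2 * (1 - rho)),  rho = lam / mu
lam, mu = 0.8, 1.0          # assumed arrival and service rates
rho = lam / mu

# second moments E[S^2] for mean service time 1/mu
second_moments = {
    "exponential":   2.0 / mu**2,              # Var = 1/mu^2
    "Erlang-3":      (3 + 1) / (3 * mu**2),    # Var = 1/(3 mu^2)
    "deterministic": 1.0 / mu**2,              # Var = 0
}
for name, es2 in second_moments.items():
    wq = lam * es2 / (2 * (1 - rho))
    print(f"{name:13s} mean wait Wq = {wq:.3f}")
```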

Keywords: fault tolerant, machine repair, threshold recovery policy, imperfect coverage, supplementary variable technique

Procedia PDF Downloads 269
330 Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Recovering Signal

Authors: Israa Sh. Tawfic, Sema Koc Kayhan

Abstract:

Given a large sparse signal, the goal is to reconstruct the signal precisely and accurately from as few measurements as possible. Although this is possible in theory, the difficulty lies in building an algorithm that achieves both accuracy and efficiency in the reconstruction. This paper proposes a new, proven method to reconstruct sparse signals: the Least Support Orthogonal Matching Pursuit (LS-OMP) method is merged with the theory of Partially Known Support (PKS), giving a new method called Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP). The method relies on a greedy algorithm to compute the support, which depends on the number of iterations; to make it faster, PKLS-OMP adds the idea of partially known support to the algorithm. It recovers the original signal efficiently, simply and accurately provided the sampling matrix satisfies the Restricted Isometry Property (RIP). Simulation results also show that it outperforms many algorithms, especially for compressible signals.
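
A sketch of OMP seeded with a partially known support (the authors' least-support selection rule is not reproduced; the matrix, signal and known index are illustrative):

```python
import numpy as np

def pkls_omp_sketch(A, y, sparsity, known_support=()):
    """Orthogonal Matching Pursuit initialized with a partially
    known support, then extended greedily until `sparsity` atoms."""
    support = list(known_support)
    if support:
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    else:
        residual = y.copy()
    while len(support) < sparsity:
        corr = np.abs(A.T @ residual)
        corr[support] = 0.0              # never re-select chosen atoms
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(4)
A = rng.standard_normal((40, 100)) / np.sqrt(40)   # RIP-friendly Gaussian
x_true = np.zeros(100); x_true[[5, 17, 60]] = [1.5, -2.0, 0.7]
y = A @ x_true
x_hat = pkls_omp_sketch(A, y, sparsity=3, known_support=[5])  # one index known
print(np.flatnonzero(np.round(x_hat, 6)))   # -> [ 5 17 60]
```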

Keywords: compressed sensing, lest support orthogonal matching pursuit, partial knowing support, restricted isometry property, signal reconstruction

Procedia PDF Downloads 222
329 Object Trajectory Extraction by Using Mean of Motion Vectors Form Compressed Video Bitstream

Authors: Ching-Ting Hsu, Wei-Hua Ho, Yi-Chun Chang

Abstract:

Video object tracking is one of the popular research topics in the computer graphics area. Trajectories can be applied in security, traffic control, and even sports training, where they can be used to analyze an athlete's performance without traditional sensors. Many related works utilize the mean shift algorithm with background subtraction; such schemes must select a kernel function, which may affect accuracy and performance. In this paper, we instead exploit the motion information in the pre-coded bitstream. The proposed algorithm extracts the trajectory by composing the motion vectors from the pre-coded bitstream: we gather the motion vectors from the area overlapping the object and calculate the mean of the overlapped motion vectors. We implement and simulate the proposed algorithm in the H.264 video codec. The performance is better than that of related works while the accuracy of the object trajectory is maintained. The experimental results show that the proposed method extracts the trajectory from the pre-coded bitstream with high accuracy and achieves higher performance than other related works.
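
A toy sketch of the core update: the object position is moved by the mean of the motion vectors of the macroblocks its bounding box overlaps (16×16 blocks assumed, as in baseline H.264; the field values are illustrative):

```python
import numpy as np

def update_position(position, mv_field, box, block=16):
    """Shift `position` by the mean motion vector of the macroblocks
    overlapped by `box` = (x0, y0, x1, y1) in pixel coordinates."""
    x0, y0, x1, y1 = box
    bx0, by0 = x0 // block, y0 // block
    bx1, by1 = -(-x1 // block), -(-y1 // block)   # ceiling division
    overlap = mv_field[by0:by1, bx0:bx1]          # (rows, cols, 2) vectors
    mean_mv = overlap.reshape(-1, 2).mean(axis=0)
    return position + mean_mv

# toy 4x6 macroblock motion-vector field, all blocks moving right and down
mv = np.zeros((4, 6, 2)); mv[..., 0] = 3.0; mv[..., 1] = 1.5
pos = np.array([40.0, 20.0])
print(update_position(pos, mv, box=(32, 16, 64, 48)))  # -> [43.  21.5]
```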

Keywords: H.264, video bitstream, video object tracking, sports training

Procedia PDF Downloads 410
328 Comparison of Prognostic Models in Different Scenarios of Shoreline Position on Ponta Negra Beach in Northeastern Brazil

Authors: Débora V. Busman, Venerando E. Amaro, Mattheus da C. Prudêncio

Abstract:

Prognostic studies of the shoreline are of utmost importance for Ponta Negra Beach, located in Natal, Northeastern Brazil, where the infrastructure recently built along the shoreline is severely affected by flooding and erosion. This study compares shoreline predictions using three linear regression methods (LMS, LRR and WLR) and tries to discern the best method for different shoreline position scenarios. The methods indicated erosion on the beach in each of the scenarios tested, even under less intense dynamic conditions. The WLR model with a 95% confidence interval was the best-adjusted model and calculated a retreat of -1.25 m/yr to -2.0 m/yr in hot spot areas. The change of the shoreline at Ponta Negra Beach can be described by a negative exponential curve. Analysis of these methods showed a correlation with the morphodynamic stage of the beach.

Keywords: coastal erosion, prognostic model, DSAS, environmental safety

Procedia PDF Downloads 312
327 Constructability Driven Engineering in Oil and Gas Projects

Authors: Srikanth Nagarajan, P. Parthasarathy, Frits Lagers

Abstract:

Lower crude oil prices have increased the pressure on oil and gas projects. Being competitive has become very important and critical for success in any industry, and an increase in project size multiplies the magnitude of the issue. Timely completion of projects within budget and schedule is very important for any project to succeed, and a simple idea can make a large impact on the total cost of a plant. In today's demanding environment, the phases of engineering, right from technology licensing and FEED through the different phases of detail engineering to procurement and construction, have been compressed so much that they overlap with each other. Hence, constructability techniques have become very important. This paper focuses on how these techniques can be implemented to reduce cost, with the help of a case study. Constructability is a process driven by the need to influence the construction phase of a project, resulting in improved project delivery, cost and schedule. In the construction phase of one of our fast-track mega projects, it was noticed that there was an opportunity to reduce a significant amount of cost and schedule by implementing constructability study processes. In this case study, the methodology actually adopted during engineering and construction, and how to improve on it by implementing constructability techniques with collaborative engineering efforts, are explained.

Keywords: being competitive, collaborative engineering, constructability, cost reduction

Procedia PDF Downloads 392
326 Response of Vibration and Damping System of UV Irradiated Renewable Biopolymer

Authors: Anika Zafiah M. Rus, Nik Normunira Mat Hassan

Abstract:

Biopolymers made from renewable materials are one of the most important groups of polymers because of their versatility: they can be manufactured in a wide range of densities and stiffnesses. In this project, a biopolymer based on waste vegetable oil was synthesized and crosslinked with commercial polymethane polyphenyl isocyanate (known as BF). The BF was compressed using a hot compression moulding technique at 90 °C, based on the evaporation of volatile matter, giving compressed biopolymer (CB). The density, vibration and damping characteristics of CB were determined after UV irradiation. Treatment with titanium dioxide (TiO₂) was found to affect the physical properties of the compressed biopolymer composite (CBC). The density of the CBC samples increased steadily with increasing UV irradiation time and TiO₂ loading; the highest density, 1.1088 g/cm³, occurred at 10% TiO₂ loading, owing to the amount of filler. The vibration and damping characteristics of the CBC samples were measured at displacements of 1 mm and 1.5 mm and accelerations of 0.1 G and 0.15 G base excitation according to ASTM D3580-9. It was revealed that the vibration and damping characteristics of the CBC samples increase significantly with increasing UV irradiation time, lower thickness and TiO₂ loading, in the frequency range of 15-25 Hz. Therefore, this study indicates that the damping property of CBC can be improved by prolonged exposure to UV irradiation.

Keywords: biopolymer flexible foam, TGA, UV irradiation, vibration and damping

Procedia PDF Downloads 445
325 Pyrolysis and Combustion Kinetics of Palm Kernel Shell Using Thermogravimetric Analysis

Authors: Kanit Manatura

Abstract:

The combustion and pyrolysis behavior of palm kernel shell (PKS) was investigated in a thermogravimetric analyzer. A 10 mg sample was heated from 30 °C to 800 °C at four heating rates (5, 10, 15 and 30 °C/min) in nitrogen or dry air flowing at 20 ml/min, for the pyrolysis and combustion processes respectively. During pyrolysis, thermal decomposition occurred in three stages, dehydration, hemicellulose-cellulose decomposition and lignin decomposition, each over its own temperature range. The TG/DTG curves show the degradation behavior and the pyrolysis/combustion characteristics of the PKS samples. The kinetic factors, namely the activation energy and pre-exponential factor, were determined by the Coats-Redfern method. The obtained kinetic factors were used to simulate the thermal decomposition and compared with the experimental data. Raising the heating rate shifts the mass loss towards higher temperatures.
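
A sketch of the Coats-Redfern extraction for a first-order reaction model, ln[−ln(1−α)/T²] = ln[AR/(βE)] − E/(RT), so a linear fit against 1/T gives E from the slope and A from the intercept (the conversion data below are synthetic, generated with illustrative E and A):

```python
import numpy as np

R = 8.314          # J/(mol K)
beta = 10.0 / 60   # heating rate, K/s (10 °C/min)

# synthetic conversion data generated from the Coats-Redfern form
T = np.linspace(500, 650, 20)                    # K
E_true, A_true = 80e3, 1e5                       # illustrative values
alpha = 1 - np.exp(-(A_true * R * T**2 / (beta * E_true))
                   * np.exp(-E_true / (R * T)))

# Coats-Redfern (first order): ln(-ln(1-a)/T^2) = ln(AR/(beta*E)) - E/(R*T)
yv = np.log(-np.log(1 - alpha) / T**2)
slope, intercept = np.polyfit(1 / T, yv, 1)
E = -slope * R
A = beta * E / R * np.exp(intercept)
print(f"E ~ {E/1e3:.1f} kJ/mol, A ~ {A:.2e} 1/s")
```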

Keywords: combustion, palm kernel shell, pyrolysis, thermogravimetric analyzer

Procedia PDF Downloads 203
324 Flange/Web Distortional Buckling of Cold-Formed Steel Beams with Web Holes under Pure Bending

Authors: Nan-Ting Yu, Boksun Kim, Long-Yuan Li

Abstract:

Cold-formed steel beams with web holes are widely used as load-carrying members in structural engineering. The perforations free up space in the building and allow pipes to pass through. However, perforated cold-formed steel (PCFS) beams may fail by distortional buckling more easily than beams with a plain web, because the rotational stiffness provided by the web decreases. It is well known that distortional buckling can be described as the buckling of the compressed flange-lip system. In fact, near ultimate failure, the flange/web corner moves laterally, which indicates that the bending of the web should be taken into account. The purpose of this study is to give a specific solution for the critical stress of flange/web distortional buckling of PCFS beams. The new model is derived from the classical energy method, with the deflection of the web represented by the shape function of a plane beam element. Finite element analyses were performed to validate the accuracy of the proposed model. The comparison of the critical stresses calculated from Hancock's model, the FEA, and the present model shows that the present model provides an excellent prediction of the flange/web distortional buckling of PCFS beams.

Keywords: cold-formed steel, beams, perforations, flange-web distortional buckling, finite element analysis

Procedia PDF Downloads 108
323 Towards a Rigorous Analysis for a Supercritical Particulate Process

Authors: Yousef Bakhbakhi

Abstract:

Crystallization with supercritical fluids (SCFs) is a proven technology for producing particles of micron and sub-micron size with a narrow size distribution, and has gained appreciable importance as an environmentally friendly technology. Particle synthesis using SCFs can be achieved through a number of special processes involving solvent and antisolvent mechanisms. In this study, the compressed antisolvent (PCA) process is used as a model to analyze the theoretical complexity of crystallization with supercritical fluids. The population balance approach has proven to be an effective technique for simulating and predicting particle size and size distribution. The nucleation and growth mechanisms of particle formation in the PCA process are investigated using the population balance equation, which describes the evolution of the particle population through coalescence and breakup over time. The mathematical population balance model consists of a set of partial differential equations with algebraic constraints, which demands a rigorous numerical approach. A combined collocation and Galerkin finite element method is proposed as a high-resolution technique to solve the dynamics of the PCA process.
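
In generic form (assuming an aggregation kernel β, a breakage rate S and a daughter distribution b; the paper's model additionally involves nucleation and growth terms and algebraic constraints), a coalescence-breakup population balance for the number density n(v, t) of particles of volume v reads:

```latex
\frac{\partial n(v,t)}{\partial t} =
\underbrace{\tfrac{1}{2}\int_0^{v}\beta(v-v',v')\,n(v-v',t)\,n(v',t)\,dv'
 - n(v,t)\int_0^{\infty}\beta(v,v')\,n(v',t)\,dv'}_{\text{coalescence}}
+\underbrace{\int_v^{\infty} b(v\,|\,v')\,S(v')\,n(v',t)\,dv'
 - S(v)\,n(v,t)}_{\text{breakup}}
```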

Keywords: particle formation, particle size and size distribution, PCA, supercritical carbon dioxide

Procedia PDF Downloads 175
322 The Next Generation of Mucoadhesive Polymer

Authors: Flavia Laffleur, Andreas Bernkop-Schnürch

Abstract:

Purpose: This study aimed to investigate the mucoadhesive potential of preactivated thiomers. Methods: Chitosan-thioglycolic-mercaptonicotinamide conjugates (chitosan-TGA-MNA) were synthesized by the oxidative S-S coupling of chitosan-thioglycolic acid (chitosan-TGA) with 6-mercaptonicotinamide (MNA). Unmodified chitosan, chitosan-TGA (thiomer) and chitosan-TGA-MNA conjugates were compressed into test discs for cohesion studies, cytotoxicity assays and mucoadhesion studies. Results: Owing to the immobilization of MNA, the chitosan-TGA-MNA conjugates exhibited higher swelling and cohesive properties than unmodified chitosan. On the rotating cylinder, discs based on chitosan-TGA-MNA conjugates displayed a 3.1-fold longer mucoadhesion time than thiolated polymers. The tensile study results were in good agreement with the rotating cylinder results. Moreover, the preactivated thiomers showed higher stability. All polymers were found to be non-toxic to Caco-2 cells. Conclusion: On the basis of the achieved results, preactivated thiomers seem to represent a promising generation of mucoadhesive polymers that are safe to use and offer a prolonged residence time at the target mucosa.

Keywords: biomedical application, drug delivery, polymer, thiomer

Procedia PDF Downloads 416
321 Identifying the Factors Affecting the Success of Energy Usage Saving in the Municipality of Tehran

Authors: Rojin Bana Derakhshan, Abbas Toloie

Abstract:

For the purpose of optimizing and developing energy efficiency in a building, the key elements of success in the optimization of energy consumption must be recognized before any action is taken. Principal component analysis, one of the most valuable results of linear algebra, is used for this purpose, since simple non-parametric methods become confusing. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this study, data mining is used to determine the key elements influencing energy saving in buildings. The approach is based on statistical data mining techniques using a feature selection method and fuzzy logic, converting the data from massive to compressed form to improve the feature selection. In addition, the share of each energy-consuming element in the overall energy dissipation is quantified, in percent, as a separate norm, using the results of the energy audit and the measurements of all energy-consuming parameters and identified variables. Accordingly, the energy-saving solutions are divided into three categories: low-, medium- and high-cost solutions.

Keywords: energy saving, key elements of success, optimization of energy consumption, data mining

Procedia PDF Downloads 445