Search results for: rice processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4238

1598 Expanding Trading Strategies By Studying Sentiment Correlation With Data Mining Techniques

Authors: Ved Kulkarni, Karthik Kini

Abstract:

This experiment aims to understand how the media affects the power markets in the mainland United States and to measure the reaction time between news updates and actual price movements. We considered electric utility companies trading on the NYSE and excluded companies that are more politically involved and move with higher sensitivity to politics. The scraper checks for any news related to predefined keywords stored for each specific company. Based on this, the classifier allocates the effect into five categories: positive, negative, highly positive, highly negative, or neutral. The effect on the respective price movement is then studied to determine the response time. Based on the observed response time, neural networks are trained to understand and react to changing market conditions, seeking the best strategy in each market. The stock trader would be day trading in the first phase and making option strategy predictions based on the Black-Scholes model. The expected result is an AI-based system that adjusts trading strategies within the market's response time to each price movement.
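The five-category effect classification described above can be sketched with a minimal keyword-based scorer. The keyword lists, thresholds, and scoring rule here are illustrative assumptions, not the authors' trained classifier.

```python
# Hypothetical keyword lists -- placeholders, not the paper's stored keywords.
POSITIVE = {"gain", "growth", "approval", "expansion"}
NEGATIVE = {"loss", "outage", "fine", "lawsuit"}

def classify_headline(headline: str) -> str:
    """Score a headline by keyword hits and bucket it into five categories."""
    words = set(headline.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score >= 2:
        return "highly positive"
    if score == 1:
        return "positive"
    if score == 0:
        return "neutral"
    if score == -1:
        return "negative"
    return "highly negative"
```

A real system would replace the keyword sets with a trained sentiment model, but the five-way bucketing stays the same.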

Keywords: data mining, language processing, artificial neural networks, sentiment analysis

Procedia PDF Downloads 17
1597 An Image Enhancement Method Based on Curvelet Transform for CBCT-Images

Authors: Shahriar Farzam, Maryam Rastgarpour

Abstract:

Image denoising plays an extremely important role in digital image processing, and curvelet-based enhancement of clinical images has developed rapidly in recent years. In this paper, we present a contrast enhancement method for cone beam CT (CBCT) images based on the fast discrete curvelet transform (FDCT) computed via the Unequally Spaced Fast Fourier Transform (USFFT). The transform returns a table of curvelet coefficients indexed by a scale parameter, an orientation, and a spatial location. Accordingly, the coefficients obtained from FDCT-USFFT can be modified to enhance contrast in an image. Our proposed method first applies this two-dimensional transform to the input image and then thresholds the curvelet coefficients to enhance the CBCT images. The unequally spaced fast Fourier transform also leads to an accurate, high-resolution reconstruction of the image. The experimental results indicate that the proposed method is superior to existing ones in terms of Peak Signal to Noise Ratio (PSNR) and Effective Measure of Enhancement (EME).
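Curvelet toolboxes are not standard library material, but the coefficient-thresholding step the method relies on can be illustrated with an ordinary 2-D FFT standing in for the curvelet transform. This is a simplified sketch of transform-domain hard thresholding, not the FDCT-USFFT itself.

```python
import numpy as np

def threshold_transform_denoise(image: np.ndarray, keep_fraction: float = 0.1) -> np.ndarray:
    """Keep only the largest-magnitude transform coefficients, zero the rest,
    and reconstruct -- a stand-in for curvelet coefficient thresholding."""
    coeffs = np.fft.fft2(image)
    mags = np.abs(coeffs)
    cutoff = np.quantile(mags, 1.0 - keep_fraction)  # threshold level
    coeffs[mags < cutoff] = 0.0                      # hard thresholding
    return np.real(np.fft.ifft2(coeffs))
```

In the paper's setting the FFT is replaced by FDCT-USFFT, whose directional coefficients make the thresholding far more selective for edges.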

Keywords: curvelet transform, CBCT, image enhancement, image denoising

Procedia PDF Downloads 300
1596 Hybrid Algorithm for Non-Negative Matrix Factorization Based on Symmetric Kullback-Leibler Divergence for Signal Dependent Noise: A Case Study

Authors: Ana Serafimovic, Karthik Devarajan

Abstract:

Non-negative matrix factorization approximates a high-dimensional non-negative matrix V as the product of two non-negative matrices, W and H, and allows only additive linear combinations of data, enabling it to learn parts-based representations. It has been successfully applied in the analysis and interpretation of high-dimensional data arising in neuroscience, computational biology, and natural language processing, to name a few. The objective of this paper is to assess a hybrid algorithm for non-negative matrix factorization with multiplicative updates. The method minimizes the symmetric version of the Kullback-Leibler divergence, known as intrinsic information, and assumes that the noise is signal-dependent and originates from an arbitrary distribution in the exponential family. It is a generalization of currently available algorithms for Gaussian, Poisson, gamma, and inverse Gaussian noise. We demonstrate the potential usefulness of the new generalized algorithm by comparing its performance to baseline methods that also minimize symmetric divergence measures.
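As a point of reference for the baselines mentioned, the classical Lee-Seung multiplicative updates that minimize the one-sided Kullback-Leibler divergence D(V‖WH) can be sketched as follows; the symmetric-divergence algorithm assessed in the paper generalizes this scheme, which is not reproduced here.

```python
import numpy as np

def nmf_kl(V, rank, n_iter=200, seed=0):
    """Lee-Seung multiplicative updates minimizing the one-sided
    Kullback-Leibler divergence D(V || WH)."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + 0.1   # strictly positive start
    H = rng.random((rank, m)) + 0.1
    eps = 1e-12                       # guards against division by zero
    for _ in range(n_iter):
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0)[:, None] + eps)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1)[None, :] + eps)
    return W, H
```

Because the updates are multiplicative, W and H stay non-negative throughout, which is what enforces the additive, parts-based structure.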

Keywords: non-negative matrix factorization, dimension reduction, clustering, intrinsic information, symmetric information divergence, signal-dependent noise, exponential family, generalized Kullback-Leibler divergence, dual divergence

Procedia PDF Downloads 246
1595 Enhance Biogas Production by Enzymatic Pre-Treatment from Palm Oil Mill Effluent (POME)

Authors: M. S. Tajul Islam, Md. Zahangir Alam

Abstract:

To enhance biogas production through anaerobic digestion, the application of various types of pre-treatment has some limitations in terms of sustainable environmental management. Many studies on pretreatments, especially chemical and physical processes, have evaluated anaerobic digestion for enhanced biogas production. Among these methods, acid and alkali pre-treatments have gained the most attention. Previous studies have shown that although acid and alkali pretreatment significantly improves the degradation of biomass, these methods have a negative environmental impact owing to their hazardous nature, whereas enzymatic pre-treatment is environmentally friendly. One constraint on the use of enzymes in pretreatment for biogas production is their high cost, and current work therefore focuses on reducing cost through fermentation of waste-based media. Palm oil mill effluent (POME), an abundant by-product generated during palm oil processing at the mill, is being used as a potential fermentation medium for enzyme production. This low-cost enzyme could be an alternative for the biogas pretreatment process. This review focuses on the direct application of enzymatic pre-treatment to POME for enhanced biogas production.

Keywords: POME, enzymatic pre-treatment, biogas, lignocellulosic biomass, anaerobic digestion

Procedia PDF Downloads 550
1594 Increasing Redness and Microbial Stability of Low Nitrite Chicken Sausage by Encapsulated Tomato Pomace Extract

Authors: Bung-Orn Hemung, Nachayut Chanshotigul, Koo Bok Chin

Abstract:

Tomato pomace (TP) is a waste product of tomato processing plants, and its utilization as a food ingredient may support a more sustainable industry by reducing waste. TP was extracted with ethanol using a microwave-assisted method at 180 W for 90 s. The ethanol was evaporated off, and the extract was encapsulated with maltodextrin (1:10) by spray drying to obtain an encapsulated TP extract (ETPE). The redness (a value) of the ETPE powder was 6.5±0.05, and it was used as a natural ingredient in low-nitrite chicken sausage. Chicken emulsion sausage prepared with 25 mg/kg of nitrite served as the control. The effect of ETPE (1.0%) was evaluated against a reference (150 mg/kg of nitrite without ETPE). The redness (a value) of sausage with ETPE was 6.8±0.03, higher than those of the reference and control, which were 4.8±0.22 and 5.1±0.15, respectively. However, hardness, expressible moisture content, and cooking yield were slightly reduced. During storage at 10 °C under air-packed conditions for 1 week, changes in color, pH, redness, and thiobarbituric acid reactive substances value were not significantly different. However, the total microbial count of sausage samples with ETPE was lower than the control by about 1 log cycle, suggesting improved microbial stability. Therefore, the addition of ETPE could be an alternative strategy to utilize TP as a natural colorant and antimicrobial agent to extend the shelf life of low-nitrite chicken sausage.

Keywords: antimicrobial ingredient, chicken sausage, ethanolic extract, low-nitrite sausage, tomato pomace

Procedia PDF Downloads 208
1593 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities, together with a ghost-target removal process, to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, applied to a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the error tolerance of the system. We consider the scenario of multiple-target environments. The ghost-target removal process, which is based on the power after Doppler processing, is proposed to mitigate ghost detections and enhance the performance of ground-based radars using a medium PRF schedule in multiple-target environments. Simulation results on a ground-based pulse-Doppler radar model are presented to show the effectiveness of the proposed approach.
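The coincidence idea is easy to illustrate in one dimension (range only): each PRF folds the true range into its unambiguous interval, and the true range is the candidate that agrees with every PRF. The intervals, tolerance, and maximum range below are illustrative values, not the paper's parameters, and the real algorithm works on the full 2D range-velocity matrix.

```python
def resolve_range(ambiguous, unambiguous, max_range=200.0, tol=0.5):
    """ambiguous[i] is the folded range seen with PRF i, whose unambiguous
    interval is unambiguous[i]. Return the candidate consistent with all PRFs,
    or None if no coincidence is found below max_range."""
    base_amb, base_unamb = ambiguous[0], unambiguous[0]
    candidate = base_amb
    while candidate <= max_range:
        # Check coincidence of this candidate against every other PRF.
        if all(abs((candidate % u) - a) < tol
               for a, u in zip(ambiguous[1:], unambiguous[1:])):
            return candidate
        candidate += base_unamb  # next unfolding of PRF 0's measurement
    return None
```

With three PRFs whose unambiguous ranges are pairwise coprime-like, the coincidence is unique well beyond any single PRF's interval, which is exactly why medium-PRF schedules use several PRFs.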

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 151
1592 Inversion of Electrical Resistivity Data: A Review

Authors: Shrey Sharma, Gunjan Kumar Verma

Abstract:

High-density electrical prospecting has been widely used in groundwater investigation, civil engineering, and environmental surveys. For efficient inversion, the forward modeling routine, sensitivity calculation, and inversion algorithm must all be efficient. This paper attempts to provide a brief summary of the past and ongoing developments of the method. It includes reviews of the procedures used for data acquisition, processing, and inversion of electrical resistivity data, based on a compilation of the academic literature. In recent times there has been a significant evolution in field survey designs and data inversion techniques for the resistivity method. In general, 2-D inversion of resistivity data is carried out using the linearized least-squares method with a local optimization technique. Multi-electrode and multi-channel systems have made it possible to conduct large 2-D, 3-D, and even 4-D surveys efficiently to resolve complex geological structures that were not possible with traditional 1-D surveys. 3-D surveys play an increasingly important role in very complex areas where 2-D models suffer from artifacts due to off-line structures. Continued developments in computation technology, as well as fast data inversion techniques and software, have made it possible to use optimization techniques to obtain model parameters to a higher accuracy. A brief discussion of the limitations of the electrical resistivity method is also presented.
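The linearized least-squares step mentioned above reduces, in its simplest damped (Marquardt-style) form, to one regularized normal-equations solve per iteration. A minimal sketch, with an assumed sensitivity (Jacobian) matrix J and data residual r standing in for the forward-modeling machinery:

```python
import numpy as np

def damped_lstsq_step(J, residual, damping=0.1):
    """One Gauss-Newton model update with Marquardt damping:
    solve (J^T J + lambda^2 I) dm = J^T r for the model perturbation dm."""
    JtJ = J.T @ J
    rhs = J.T @ residual
    return np.linalg.solve(JtJ + damping**2 * np.eye(JtJ.shape[0]), rhs)
```

The damping term stabilizes the inversion when JtJ is ill-conditioned, at the cost of smaller update steps; production resistivity codes add smoothness constraints in place of the plain identity.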

Keywords: inversion, limitations, optimization, resistivity

Procedia PDF Downloads 365
1591 Formulation of Mortars with Marine Sediments

Authors: Nor-Edine Abriak, Mouhamadou Amar, Mahfoud Benzerzour

Abstract:

The transition to a more sustainable economy requires a reduction in the consumption of raw materials for equivalent production. The recovery of byproducts, especially dredged sediment, as a mineral addition in cement matrices represents an alternative that reduces raw material consumption and the construction sector's carbon footprint. However, the efficient use of sediment requires adequate and optimal treatment. Several processing techniques have so far been applied in order to improve some physicochemical properties. Heat treatment by calcination was effective in removing the organic fraction and activating the pozzolanic properties. In this article, the effect of the optimized heat treatment of marine sediments on the physico-mechanical and environmental properties of mortars is shown. A key finding is that the optimal substitution of a portion of cement by sediments treated by calcination at 750 °C helps to maintain or improve the mechanical properties of the cement matrix in comparison with a standard reference mortar. The use of calcined sediment enhances mortar behavior in terms of mechanical strength and durability. From an environmental and life-cycle point of view, mortars formulated with treated sediments are considered inert with respect to the inert waste storage facility reference (ISDI-France).

Keywords: sediment, calcination, cement, reuse

Procedia PDF Downloads 180
1590 Design and Optimization of the Damping System for Optical Scanning Equipment

Authors: Duy Nhat Tran, Van Tien Pham, Quang Trung Trinh, Tien Hai Tran, Van Cong Bui

Abstract:

In recent years, artificial intelligence and the Internet of Things have experienced significant advancements. Collecting image data and performing real-time analysis and processing have become increasingly popular in many aspects of life. Optical scanning devices are widely used to observe and analyze different environments, whether fixed outdoors, mounted on mobile devices, or carried by unmanned aerial vehicles. As a result, the interaction between the physical environment and these devices has become more critical in terms of safety. Two commonly used methods for addressing these challenges are active and passive approaches. Each method has its advantages and disadvantages, but combining both can lead to higher efficiency. One solution is to utilize direct-drive motors for position control, with real-time feedback within the operational range to determine appropriate control parameters with high precision. If the maximum motor torque is smaller than the inertial torque and the rotor reaches the operational limit, a spring system absorbs the impact force. Numerous experiments have been conducted to demonstrate the effectiveness of device protection during operation.

Keywords: optical device, collision safety, collision absorption, precise mechanics

Procedia PDF Downloads 63
1589 Modification Encryption Time and Permutation in Advanced Encryption Standard Algorithm

Authors: Dalal N. Hammod, Ekhlas K. Gbashi

Abstract:

Today, cryptography is used in many applications to achieve high security in data transmission and real-time communications. AES has long enjoyed global acceptance and is used to secure sensitive data in various industries, but it suffers from slow processing and takes a long time to transfer data. This paper suggests a method to enhance the Advanced Encryption Standard (AES) algorithm based on time and permutation. The suggested method (MAES) modifies SubBytes and ShiftRows in the encryption part and InvSubBytes and InvShiftRows in the decryption part. After implementing the proposal and testing the results, the modified AES achieved good results, accomplishing communication with high performance in terms of randomness, encryption time, storage space, and avalanche effect. The proposed method yields ciphertext with good randomness, as it passed the NIST statistical tests against attacks; MAES also reduced the encryption time by 10% compared with the original AES and is therefore faster. The proposed method likewise showed good results in memory utilization, with a value of 54.36 for MAES versus 66.23 for the original AES. Finally, the avalanche effect, used to measure the diffusion property, is 52.08% for the modified AES and 51.82% for the original AES.
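The avalanche-effect percentages quoted above are simply the fraction of ciphertext bits that differ between two encryptions; a small helper shows the computation (illustrative and independent of any particular AES implementation; the ideal value is about 50%):

```python
def avalanche_percent(block_a: bytes, block_b: bytes) -> float:
    """Percentage of bits that differ between two equal-length blocks --
    the diffusion measure quoted in the abstract."""
    assert len(block_a) == len(block_b), "blocks must have equal length"
    # XOR each byte pair and count the set bits of the difference.
    diff_bits = sum(bin(x ^ y).count("1") for x, y in zip(block_a, block_b))
    return 100.0 * diff_bits / (8 * len(block_a))
```

In practice the two blocks would be ciphertexts produced from plaintexts (or keys) differing in a single bit.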

Keywords: modified AES, randomness test, encryption time, avalanche effects

Procedia PDF Downloads 248
1588 Exploratory Analysis of A Review of Nonexistence Polarity in Native Speech

Authors: Deawan Rakin Ahamed Remal, Sinthia Chowdhury, Sharun Akter Khushbu, Sheak Rashed Haider Noori

Abstract:

Native speech-to-text synthesis has its own leverage for the benefit of mankind. Speaking in different accents is common, but communication between people with two different accent types is quite difficult. The motivating problem is the extraction of wrong perceptions of language meaning. Thus, many existing automatic speech recognition systems have been deployed to detect text. Overall, this paper presents a review of NSTTR (Native Speech Text to Text Recognition) synthesis compared with text-to-text recognition. The review has exposed many text-to-text recognition systems that are at a very early stage of complying with native speech recognition. Much discussion has started about the progression of chatbots and linguistic theory; another direction is the rule-based approach. In recent years, deep learning has become an overwhelming chapter of text-to-text learning for detecting the nature of language. To the best of our knowledge, a huge number of people in the subcontinent speak the Bangla language, but they have different accents in different regions; therefore, this study elaborates a contrasting discussion of the achievements of existing works and the findings of future needs for Bangla-language acoustic accents.

Keywords: TTR, NSTTR, text to text recognition, deep learning, natural language processing

Procedia PDF Downloads 132
1587 The Trigger-DAQ System in the Mu2e Experiment

Authors: Antonio Gioiosa, Simone Doanti, Eric Flumerfelt, Luca Morescalchi, Elena Pedreschi, Gianantonio Pezzullo, Ryan A. Rivera, Franco Spinella

Abstract:

The Mu2e experiment at Fermilab aims to measure the charged-lepton flavour-violating neutrinoless conversion of a negative muon into an electron in the field of an aluminum nucleus. With the expected experimental sensitivity, Mu2e will improve on the previous limit by four orders of magnitude. The Mu2e data acquisition (DAQ) system provides hardware and software to collect digitized data from the tracker, calorimeter, cosmic ray veto, and beam monitoring systems. Mu2e's trigger and data acquisition system (TDAQ) uses otsdaq as its solution. Developed at Fermilab, otsdaq uses the artdaq DAQ framework and the art analysis framework under the hood for event transfer, filtering, and processing. otsdaq is an online DAQ software suite with a focus on flexibility and scalability, providing a multi-user, web-based interface accessible through the Chrome or Firefox web browser. The detector readout controllers (ROCs) of the tracker and calorimeter stream zero-suppressed data continuously to the data transfer controller (DTC). Data are then read over the PCIe bus to a software filter algorithm that selects events, which are finally combined with the data flux that comes from the cosmic ray veto system (CRV).

Keywords: trigger, daq, mu2e, Fermilab

Procedia PDF Downloads 155
1586 Grain Size Characteristics and Sediments Distribution in the Eastern Part of Lekki Lagoon

Authors: Mayowa Philips Ibitola, Abe Oluwaseun Banji, Olorunfemi Akinade-Solomon

Abstract:

A total of 20 bottom sediment samples were collected from the Lekki Lagoon during the wet and dry seasons. The study was carried out to determine the textural characteristics, sediment distribution pattern, and energy of transportation within the lagoon system. The sediment grain sizes and depth profile were analyzed using the dry sieving method and a MATLAB algorithm for processing. The granulometric analysis reveals fine-grained sand for both the wet and dry seasons, with average mean values of 2.03 ϕ and -2.88 ϕ, respectively. Sediments were moderately sorted, with average inclusive standard deviations of 0.77 ϕ and -0.82 ϕ. Skewness varied from strongly coarse to near symmetrical, at 0.34 ϕ and 0.09 ϕ. The average kurtosis values were 0.87 ϕ and -1.4 ϕ (platykurtic and leptokurtic). Overall, the bathymetry shows an average depth of 4.0 m; the deepest and shallowest areas have depths of 11.2 m and 0.5 m, respectively. A high concentration of fine sand was observed in deep areas compared to shallow areas during both the wet and dry seasons. The statistical parameters show that the sediments overall are sorted and deposited under low-energy conditions over a long distance. Sediment distribution and sediment transport in the Lekki Lagoon are controlled by a low-energy current, and the down-slope configuration of the bathymetry enhances the sorting and deposition rate in the lagoon.
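The graphic mean, sorting (inclusive standard deviation), skewness, and kurtosis reported above are conventionally computed from phi percentiles with the Folk and Ward (1957) formulas; a minimal sketch, assuming the percentiles have already been read off the cumulative grain-size curve:

```python
def folk_ward(p):
    """Folk & Ward (1957) graphic statistics from a dict of phi percentiles
    {5, 16, 25, 50, 75, 84, 95} -> phi value."""
    mean = (p[16] + p[50] + p[84]) / 3.0
    sorting = (p[84] - p[16]) / 4.0 + (p[95] - p[5]) / 6.6
    skewness = ((p[16] + p[84] - 2 * p[50]) / (2 * (p[84] - p[16]))
                + (p[5] + p[95] - 2 * p[50]) / (2 * (p[95] - p[5])))
    kurtosis = (p[95] - p[5]) / (2.44 * (p[75] - p[25]))
    return mean, sorting, skewness, kurtosis
```

A perfectly symmetric cumulative curve gives zero skewness, and sorting between 0.5 ϕ and 1.0 ϕ corresponds to the "moderately sorted" class used in the abstract.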

Keywords: Lekki Lagoon, Marine sediment, bathymetry, grain size distribution

Procedia PDF Downloads 231
1585 Electrocardiogram-Based Heartbeat Classification Using Convolutional Neural Networks

Authors: Jacqueline Rose T. Alipo-on, Francesca Isabelle F. Escobar, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar Al Dahoul

Abstract:

Electrocardiogram (ECG) signal analysis and processing are crucial in the diagnosis of cardiovascular diseases, which are considered among the leading causes of mortality worldwide. However, the traditional rule-based analysis of large volumes of ECG data is time-consuming, labor-intensive, and prone to human error. With advances in programming paradigms, machine learning algorithms have been increasingly used to analyze ECG signals. In this paper, various deep learning algorithms were adapted to classify five classes of heartbeat types. The dataset used in this work is the synthetic MIT-BIH Arrhythmia dataset produced with generative adversarial networks (GANs). Various deep learning models, such as a ResNet-50 convolutional neural network (CNN), a 1-D CNN, and long short-term memory (LSTM), were evaluated and compared. ResNet-50 was found to outperform the other models in terms of recall and F1 score, with five-fold average scores of 98.88% and 98.87%, respectively. The 1-D CNN, on the other hand, had the highest average precision of 98.93%.
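The recall and F1 figures quoted above reduce to per-class counts of true and false positives and negatives; a minimal sketch for scoring one class (each of the five heartbeat classes would be scored this way and the results averaged):

```python
def recall_f1(y_true, y_pred, positive):
    """Recall and F1 for one target class from paired label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return recall, f1
```

The five-fold averages in the abstract are these per-fold scores averaged over the cross-validation folds.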

Keywords: heartbeat classification, convolutional neural network, electrocardiogram signals, generative adversarial networks, long short-term memory, ResNet-50

Procedia PDF Downloads 128
1584 Thermal Decontamination of Soils Polluted by Polychlorinated Biphenyls and Microplastics

Authors: Roya Biabani, Mentore Vaccari, Piero Ferrari

Abstract:

Microplastics (MPLs) accumulated in soil pose the risk of adsorbing polychlorinated biphenyls (PCBs) and transporting them into the food chain or into living organisms. PCBs belong to a class of man-made hydrophobic organic chemicals (HOCs) that are classified as probable human carcinogens and a hazard to biota. Therefore, to take effective action without aggravating the already recognized problems, knowledge of PCB remediation in the presence of MPLs needs to be completed. Owing to its high efficiency and low secondary pollution, thermal desorption (TD) has been widely used to process a variety of pollutants, especially for removing volatile and semi-volatile organic matter from contaminated solids and sediments. This study investigates the fate of PCB compounds during the thermal remediation method. The PCB-contaminated soil was collected from the earth canal downstream of the Caffaro S.p.A. chemical factory, which produced PCBs and PCB mixtures between 1930 and 1984. For MPL analysis, MPLs were separated by density separation and oxidation of organic matter. An operational range for the key parameters of the thermal desorption process was experimentally evaluated. Moreover, the temperature treatment characteristics of the PCB-contaminated soil under anaerobic and aerobic conditions were studied using thermogravimetric analysis (TGA).

Keywords: contaminated soils, microplastics, polychlorinated biphenyls, thermal desorption

Procedia PDF Downloads 104
1583 Production of Bioethanol from Oil Palm Trunk by Cocktail Carbohydrase Enzymes Produced by Thermophilic Bacteria Isolated from a Hot Spring in West Sumatera, Indonesia

Authors: Yetti Marlida, Syukri Arif, Nadirman Haska

Abstract:

Recently, alcohol fuels have been produced on industrial scales by fermentation of sugars derived from wheat, corn, sugar beets, sugar cane, etc. The enzymatic hydrolysis of cellulosic materials to produce fermentable sugars has enormous potential for meeting global bioenergy demand through the biorefinery concept, since agri-food processes generate millions of tonnes of waste each year (Xeros and Christakopoulos 2009), such as sugar cane bagasse, wheat straw, rice straw, corn cob, and oil palm trunk. Oil palm trunk is in fact one of the most abundant lignocellulosic waste by-products worldwide, coming especially from Malaysia, Indonesia, and Nigeria, and provides an alternative substrate for producing useful chemicals such as bioethanol. The economical life of the oil palm usually runs from 3 to 25 years of age, after which it is cut for replantation. The trunk is usually 15-18 meters in length and 46-60 centimeters in diameter. After cutting, the trunk is an agricultural waste that causes disposal problems, but because it contains about 42% cellulose, 34.4% hemicellulose, 17.1% lignin, and 7.3% other compounds, this agricultural waste could yield value-added products (Pumiput, 2006). This research produced bioethanol from oil palm trunk (OPT) via saccharification by cocktail carbohydrase enzymes. Enzymatic saccharification of acid-treated OPT was carried out in a reaction mixture containing 40 g treated OPT in 200 ml of 0.1 M citrate buffer, pH 4.8, with 500 unit/kg amylase for treatment A; treatment B: treatment A + 500 unit/kg cellulase; treatment C: treatment B + 500 unit/kg xylanase; treatment D: treatment C + 500 unit/kg ligninase; and treatment E: untreated OPT + 500 unit/kg amylase + 500 unit/kg cellulase + 500 unit/kg xylanase + 500 unit/kg ligninase. The reaction mixture was incubated on a water-bath rotary shaker adjusted to 60 °C and 75 rpm. Samples were withdrawn at intervals of 12, 24, 36, 48, 60, and 72 hr.
For bioethanol production in a 5 L biofermentor, the hydrolysis product was inoculated with a loop of Saccharomyces cerevisiae and then incubated at 34 °C under static conditions. Samples were withdrawn after 12, 24, 36, 48, and 72 hr for bioethanol and residual glucose. The results of the enzymatic hydrolysis (Figure 1) showed that treatment B (OPT hydrolyzed with amylase and cellulase) gave the optimum conditions for glucose production, as both enzymes degraded the OPT effectively. Similar results were reported by Primarini et al. (2012), who found the optimum conditions for the hydrolysis of OPT at a concentration of 25% (w/v) with 0.3% (w/v) amylase, 0.6% (w/v) glucoamylase, and 4% (w/v) cellulase. Figure 2 shows that the optimum bioethanol was produced at 48 hr of incubation; with longer incubation, the bioethanol decreased. According to Roukas (1996), a decrease in the concentration of ethanol occurs with excess glucose as substrate and with product inhibition effects. A substrate concentration that is too high reduces the amount of dissolved oxygen; although needed only in very small amounts, oxygen is still required during fermentation by Saccharomyces cerevisiae to sustain life at high cell concentrations (Nowak 2000, Tao et al. 2005). It can be concluded that the optimum enzymatic hydrolysis occurred when the OPT was treated with amylase and cellulase, and that the optimum bioethanol was produced at 48 hr of incubation with Saccharomyces cerevisiae, in which 18.08% bioethanol was produced from glucose conversion. This work was funded by the Directorate General of Higher Education (DGHE), Ministry of Education and Culture, contract no. 245/SP2H/DIT.LimtabMas/II/2013.
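Conversion figures like the 18.08% above can be related to the fermentation stoichiometry C6H12O6 → 2 C2H5OH + 2 CO2, which caps the yield at about 0.511 g ethanol per g glucose. A small helper for expressing a measured yield as a percentage of that theoretical maximum (an illustration, not part of the study's methodology):

```python
THEORETICAL_YIELD = 0.511  # g ethanol per g glucose, from the stoichiometry

def ethanol_yield_percent(ethanol_g: float, glucose_g: float) -> float:
    """Ethanol produced, as a percentage of the theoretical maximum
    for the given mass of glucose consumed."""
    theoretical = THEORETICAL_YIELD * glucose_g
    return 100.0 * ethanol_g / theoretical
```

In practice real fermentations reach 85-95% of the theoretical maximum because some glucose goes to cell mass and by-products.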

Keywords: oil palm trunk, enzymatic hydrolysis, saccharification

Procedia PDF Downloads 514
1582 Optimization of a Four-Lobed Swirl Pipe for Clean-In-Place Procedures

Authors: Guozhen Li, Philip Hall, Nick Miles, Tao Wu

Abstract:

This paper presents a numerical investigation of two horizontally mounted four-lobed swirl pipes in terms of the effectiveness with which they induce swirl in flows passing through them. The swirl flows induced by the two pipes have the potential to improve the efficiency of Clean-In-Place procedures in a closed processing system by locally intensifying the hydrodynamic impact on the internal pipe surface. Pressure losses, swirl development within the two swirl pipes, swirl induction effectiveness, swirl decay, and wall shear stress variation downstream of the two pipes are analyzed and compared. It was found that a shorter swirl-inducing pipe used in conjunction with transition pipes is more effective in swirl induction than a longer one, in that it constrains the induced swirl less and yields slightly higher swirl intensity just downstream while incurring a smaller pressure loss. The wall shear stress downstream of the shorter swirl pipe is also slightly larger than that downstream of the longer pipe, owing to the slightly higher induced swirl intensity. The advantage of the shorter swirl pipe in terms of swirl induction is more significant in flows with a larger Reynolds number.

Keywords: swirl pipe, swirl effectiveness, CFD, wall shear stress, swirl intensity

Procedia PDF Downloads 606
1581 AI Software Algorithms for Driver Monitoring within Vehicle Traffic - SiaMOTO

Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu

Abstract:

Creating a personalized statistic for an individual within a population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, the ability to generate statistics from individual data intercepted over large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that intercepts car events caused by a driver, positioning them in time and space. The device's connection to the vehicle creates a data source whose analysis can build psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we explain the software infrastructure of the SiaMOTO system, designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.

Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO

Procedia PDF Downloads 91
1580 Tool Wear Analysis in 3D Manufactured Ti6Al4V

Authors: David Downey

Abstract:

With the introduction of additive manufacturing (3D printing) to produce titanium (Ti6Al4V) components in the medical/aerospace and automotive industries, intricate geometries can be produced with virtually complete design freedom. However, the consideration of microstructural anisotropy resulting from the additive manufacturing process becomes necessary due to this design flexibility and the need to print a geometric shape that can consist of numerous angles, radii, and swept surfaces. A femoral knee implant serves as an example of a 3D-printed near-net-shaped product. The mechanical properties of the printed components, and consequently, their machinability, are affected by microstructural anisotropy. Currently, finish-machining operations performed on titanium printed parts using selective laser melting (SLM) utilize the same cutting tools employed for processing wrought titanium components. Cutting forces for components manufactured through SLM can be up to 70% higher than those for their wrought counterparts made of Ti6Al4V. Moreover, temperatures at the cutting interface of 3D printed material can surpass those of wrought titanium, leading to significant tool wear. Although the criteria for tool wear may be similar for both 3D printed and wrought materials, the rate of wear during the machining process may differ. The impact of these issues on the choice of cutting tool material and tool lifetimes will be discussed.

Keywords: additive manufacturing, build orientation, microstructural anisotropy, printed titanium Ti6Al4V, tool wear

Procedia PDF Downloads 91
1579 Bioremediation of Sea Food Waste in Solid State Fermentation along with Production of Bioactive Agents

Authors: Rahul Warmoota, Aditya Bhardwaj, Steffy Angural, Monika Rana, Sunena Jassal, Neena Puri, Naveen Gupta

Abstract:

Seafood processing generates large volumes of waste products such as skin, heads, tails, shells, scales, and backbones. Pollution due to conventional methods of seafood waste disposal has negative implications for the environment, aquatic life, and human health. Moreover, these waste products can be used for the production of high-value products, a potential that remains untapped due to inappropriate management. Paenibacillus sp. AD is known to act on chitinolytic and proteinaceous waste and was explored for its potential to degrade various types of seafood waste in solid-state fermentation. Effective degradation of seafood waste from a variety of sources, such as fish scales, crab shells, prawn shells, and mixtures of such wastes, was observed, with 30 to 40 percent degradation achieved in terms of the decrease in mass. Alongside the degradation, chitinolytic and proteolytic enzymes were produced, which can have various biotechnological applications. In addition, value-added products such as chitin oligosaccharides and peptides of various degrees of polymerization were produced, which can be used for various therapeutic purposes. The results indicate that Paenibacillus sp. AD can be used for the development of a process for the in-field degradation of seafood waste.

Keywords: chitin, chitin-oligosaccharides, chitinase, protease, biodegradation, crab shells, prawn shells, fish scales

Procedia PDF Downloads 98
1578 Design and Experimental Studies of a Centrifugal SWIRL Atomizer

Authors: Hemabushan K., Manikandan

Abstract:

In a swirl atomizer, the fluid undergoes a swirling motion as a result of the centrifugal force created by opposed tangential inlets in the swirl chamber. The angular momentum of the fluid continually increases as it approaches the exit orifice, where it forms a hollow sheet that disintegrates into ligaments and then droplets as it flows downstream. Atomizers of this type are used in rocket injectors and oil-burner furnaces. In the present investigation, a swirl atomizer with two opposed tangential inlets was designed. With water as the working fluid, experiments were conducted at fluid injection pressures in the range of 0.033 bar to 0.519 bar. The fluid was pressurized by a 0.5 hp pump and regulated by a pressure-regulator valve, and the injection pressure was measured with a U-tube mercury manometer. The spray pattern and the droplets were captured with a high-resolution camera against a black background, with a high-intensity flash highlighting the fluid. The raw images were processed in the ImageJ software to measure the droplet diameters and their shape characteristics along the downstream direction. Parameters such as mean droplet diameter and its distribution, wave pattern, rupture distance, and spray angle were studied for this atomizer. The results were compared with theoretical predictions and analysed for deviation from the design parameters.
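Once individual droplet diameters have been extracted from the images, a characteristic mean diameter can be computed. As a minimal sketch (the function name and the sample diameters are illustrative, not taken from the study), the Sauter mean diameter D32 commonly reported for sprays is:

```python
def sauter_mean_diameter(diameters):
    """Sauter mean diameter D32 = sum(d^3) / sum(d^2) over all droplets."""
    num = sum(d ** 3 for d in diameters)
    den = sum(d ** 2 for d in diameters)
    return num / den

# Hypothetical droplet diameters (micrometres) measured from processed images.
drops = [50.0, 80.0, 120.0, 60.0]
d32 = sauter_mean_diameter(drops)
```

D32 weights larger droplets more heavily than an arithmetic mean, which is why it is often preferred in atomization and combustion studies.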

Keywords: swirl atomizer, injector, spray

Procedia PDF Downloads 490
1577 A Hybrid Feature Selection and Deep Learning Algorithm for Cancer Disease Classification

Authors: Niousha Bagheri Khulenjani, Mohammad Saniee Abadeh

Abstract:

Learning from very big datasets is a significant problem for most present data mining and machine learning algorithms. MicroRNA (miRNA) data constitute one of the important big genomic, non-coding datasets derived from genome sequences. In this paper, a hybrid method for the classification of miRNA data is proposed. Owing to the variety of cancers and the high number of genes, analyzing miRNA datasets has been a challenging problem for researchers: the number of features is high relative to the number of samples, and the data are imbalanced. A feature selection method is used to select the features most able to distinguish between classes and to eliminate obscure features. Afterward, a Convolutional Neural Network (CNN) classifier is used for the classification of cancer types, employing a Genetic Algorithm to select optimized hyper-parameters of the CNN. To speed up classification by the CNN, a Graphics Processing Unit (GPU) is recommended for carrying out the computations in parallel. The proposed method is tested on a real-world dataset with 8,129 patients, 29 different types of tumors, and 1,046 miRNA biomarkers, taken from The Cancer Genome Atlas (TCGA) database.
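The Genetic-Algorithm-driven hyper-parameter search can be sketched in a few lines. This is an illustrative skeleton, not the authors' implementation: the search space, the toy fitness function, and all names are assumptions; in the paper the fitness would be the CNN's validation accuracy.

```python
import random

# Hypothetical search space for CNN hyper-parameters.
SPACE = {
    "filters": [16, 32, 64],
    "kernel_size": [3, 5, 7],
    "learning_rate": [1e-2, 1e-3, 1e-4],
}

def random_individual(rng):
    return {k: rng.choice(v) for k, v in SPACE.items()}

def mutate(ind, rng):
    # Re-sample one randomly chosen hyper-parameter.
    child = dict(ind)
    key = rng.choice(list(SPACE))
    child[key] = rng.choice(SPACE[key])
    return child

def evolve(fitness, generations=10, pop_size=8, seed=0):
    """Truncation-selection GA: keep the best half, refill with mutants."""
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(rng.choice(parents), rng) for _ in parents]
    return max(pop, key=fitness)

# Stand-in fitness: in the paper this would be CNN validation accuracy.
def toy_fitness(ind):
    return ind["filters"] - 100 * abs(ind["learning_rate"] - 1e-3)

best = evolve(toy_fitness)
```

Because the best individual is always retained between generations, the fitness of the returned configuration is monotonically non-decreasing over the run.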

Keywords: cancer classification, feature selection, deep learning, genetic algorithm

Procedia PDF Downloads 111
1576 Colour Quick Response Code with High Damage Resistance Capability

Authors: Minh Nguyen

Abstract:

Today, QR (Quick Response) Codes are prevalent, and mobile/smart devices can efficiently read and understand them. We therefore see them in many areas, such as storing web page addresses, business phone numbers, app-download redirects, business locations, and social media links. The popularity of the QR Code stems from its many advantages: it can hold a good amount of information, it is small, it is easy to scan and read with a general RGB camera, and it can still work with some damage to its surface. However, some issues remain. For instance, certain areas must be kept untouched for a successful decode (e.g., the “Finder Patterns” and the “Quiet Zone”), the built-in error correction is not robust enough, and the code is not flexible enough for applications such as Augmented Reality (AR). We propose a new Colour Quick Response Code with several advantages over the original: (1) there is no untouchable area, (2) it tolerates damage to up to 40% of the entire code area, (3) it is better suited to Augmented Reality applications, and (4) it is backward-compatible and readable by available QR Code scanners such as Pyzbar. In our experience, the Colour Quick Response Code is significantly more tolerant of damage than the original QR Code. We believe our code is suitable in situations where standard 2D barcodes fail, such as on curved and shiny surfaces, for instance, medical blood-test sample tubes and syringes.
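One way a colour code can pack more data than a monochrome one is to treat the R, G, and B channels as independent binary layers, giving three bits per module. The sketch below illustrates only this channel-packing idea with hypothetical helper names; the actual code design, finder-free layout, and 40% damage tolerance described above involve considerably more.

```python
def encode_module(bits):
    """Map 3 data bits to one RGB module colour, one bit per channel."""
    return tuple(255 if b else 0 for b in bits)

def decode_module(rgb, threshold=128):
    """Recover the 3 bits from a (possibly noisy) RGB sample."""
    return tuple(1 if c >= threshold else 0 for c in rgb)

payload = [(1, 0, 1), (0, 1, 1)]
pixels = [encode_module(b) for b in payload]

# Simulate mild colour shift ("damage") and decode anyway.
noisy = [(max(0, r - 40), min(255, g + 40), b) for r, g, b in pixels]
recovered = [decode_module(p) for p in noisy]
```

Thresholding each channel independently is what lets the decode survive moderate colour distortion; a real scanner would also calibrate against reference colours on the code itself.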

Keywords: QR code, computer vision, image processing, 2D barcode

Procedia PDF Downloads 118
1575 Simulation and Experimental Research on Pocketing Operation for Toolpath Optimization in CNC Milling

Authors: Rakesh Prajapati, Purvik Patel, Avadhoot Rajurkar

Abstract:

Nowadays, manufacturing industries augment their production lines with modern machining centers backed by CAM software, and several attempts are being made to cut down the programming time for machining complex geometries. Special programs have been developed to generate the digital numerical data and to prepare NC programs using suitable post-processors for different machines: after the tools and the manufacturing process are selected, tool paths are applied and the NC program is generated. More and more complex mechanical parts that were earlier cast and assembled or manufactured by other processes are now being machined. The majority of these parts require many pocketing operations and find applications in dies and molds, turbomachinery, aircraft, nuclear, defense, etc. Pocketing operations involve the removal of a large quantity of material from the metal surface. In this work, a warm-cast food-processing part and its clamping were modelled using Pro-E and MasterCAM® software, and the pocketing operation was specifically chosen for toolpath optimization. After applying the pocketing toolpath, the Multi Tool Selection and Reduce Air Time options were used, and the resulting software simulation times were compared with experimental machining times.
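As background to the pocketing strategy, a basic zigzag (lace) toolpath for a rectangular pocket can be generated as follows. This is a schematic sketch, not MasterCAM® output: the function and parameter names are illustrative, and a real CAM system adds entry/exit moves, cutter compensation, and island avoidance on top of this.

```python
def zigzag_pocket(width, height, stepover):
    """Generate zigzag waypoints covering a width x height pocket.

    Alternating pass direction minimises air moves between passes,
    which is the same goal as the 'Reduce Air Time' option.
    """
    path = []
    y = 0.0
    left_to_right = True
    while y <= height + 1e-9:
        if left_to_right:
            path += [(0.0, y), (width, y)]
        else:
            path += [(width, y), (0.0, y)]
        left_to_right = not left_to_right
        y += stepover
    return path

moves = zigzag_pocket(width=40.0, height=20.0, stepover=5.0)
```

The stepover (fraction of tool diameter between passes) trades machining time against scallop height on the pocket floor.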

Keywords: toolpath, part program, optimization, pocket

Procedia PDF Downloads 287
1574 An Application for Risk of Crime Prediction Using Machine Learning

Authors: Luis Fonseca, Filipe Cabral Pinto, Susana Sargento

Abstract:

The increase of the world population, especially in large urban centers, has resulted in new challenges, particularly in the control and optimization of public safety. In the present work, a solution is proposed for the prediction of criminal occurrences in a city based on historical incident data and demographic information. The entire research and implementation will be presented, starting with data collection from its original source, through the treatment and transformations applied to the data and the choice, evaluation, and implementation of the Machine Learning model, up to the application layer. Classification models are implemented to predict criminal risk for a given time interval and location. Machine Learning algorithms such as Random Forest, Neural Networks, K-Nearest Neighbors, and Logistic Regression are used to predict occurrences, and their performance is compared according to the data processing and transformations used. The results show that the use of Machine Learning techniques helps to anticipate criminal occurrences, contributing to the reinforcement of public security. Finally, the models were implemented on a platform that provides an API enabling other entities to request predictions in real time. An application is also presented in which criminal predictions can be shown visually.
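Among the algorithms compared, K-Nearest Neighbors is the simplest to illustrate. The sketch below is a toy stand-in, not the authors' implementation: the feature tuples, labels, and data are hypothetical, and real features would be normalised before distances are taken.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    `train` is a list of ((x, y, hour), label) tuples: a toy stand-in for
    the location/time features and risk labels used in the paper.
    """
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical incidents: (latitude, longitude, hour-of-day) -> risk label.
history = [
    ((0.0, 0.0, 22), "high"), ((0.1, 0.0, 23), "high"), ((0.0, 0.1, 21), "high"),
    ((5.0, 5.0, 10), "low"),  ((5.1, 5.0, 11), "low"),  ((5.0, 5.1, 9),  "low"),
]
risk = knn_predict(history, (0.05, 0.05, 22))
```

The same interface generalises to the other classifiers: each maps a (location, time) feature vector to a risk class, and performance is compared on held-out incidents.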

Keywords: crime prediction, machine learning, public safety, smart city

Procedia PDF Downloads 111
1573 Development of a Shape Based Estimation Technology Using Terrestrial Laser Scanning

Authors: Gichun Cha, Byoungjoon Yu, Jihwan Park, Minsoo Park, Junghyun Im, Sehwan Park, Sujung Sin, Seunghee Park

Abstract:

The goal of this research is to estimate structural shape change using terrestrial laser scanning. The study develops data-reduction and shape-change estimation algorithms for large-capacity scan data. The point cloud of the scan data is converted to voxels and sampled. A shape-estimation technique is studied to detect changes in structures such as skyscrapers, bridges, and tunnels based on large point cloud data. The point cloud analysis applies the octree data structure to speed up the post-processing for change detection. The reduced point cloud provides representative values of the shape information and is used as the model for detecting point cloud changes in the data structure. The aim of the shape-estimation model is a technology that can detect not only gradual but also immediate structural changes in the event of disasters such as earthquakes, typhoons, and fires, thereby preventing major accidents caused by aging and disasters. The study is expected to improve the efficiency of structural health monitoring and maintenance.
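The voxel conversion and sampling step can be sketched as a simple voxel-grid filter: points are bucketed by voxel index and each bucket is replaced by its centroid. This is an illustrative reduction step under assumed names, not the authors' octree-based implementation, which additionally organises the voxels hierarchically for fast change queries.

```python
def voxel_downsample(points, voxel_size):
    """Reduce a point cloud by averaging all points falling in each voxel."""
    voxels = {}
    for p in points:
        # Integer voxel index along each axis.
        key = tuple(int(c // voxel_size) for c in p)
        voxels.setdefault(key, []).append(p)
    # One representative (centroid) per occupied voxel.
    return [
        tuple(sum(c) / len(pts) for c in zip(*pts))
        for pts in voxels.values()
    ]

cloud = [(0.1, 0.1, 0.1), (0.2, 0.2, 0.2), (5.0, 5.0, 5.0)]
reduced = voxel_downsample(cloud, voxel_size=1.0)
```

The centroids then serve as the representative values against which a later scan can be compared voxel by voxel to flag displacement.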

Keywords: terrestrial laser scanning, point cloud, shape information model, displacement measurement

Procedia PDF Downloads 234
1572 Sensor Monitoring of the Concentrations of Different Gases Present in Synthesis of Ammonia Based on Multi-Scale Entropy and Multivariate Statistics

Authors: S. Aouabdi, M. Taibi

Abstract:

The supervision of chemical processes is the subject of increased development because of increasing demands on reliability and safety. An important aspect of the safe operation of a chemical process is detecting process faults or other special events, and locating and removing their causes, earlier than is possible with conventional limit and trend checks. With the aid of process models and estimation and decision methods, it is possible to monitor hundreds of variables in a single operating unit, and these variables may be recorded hundreds or thousands of times per day. In the absence of an appropriate processing method, only limited information can be extracted from these data. Hence, a tool is required that can project the high-dimensional process space into a low-dimensional space amenable to direct visualization, and that can also identify key variables and important features of the data. Our contribution is a new monitoring method based on multi-scale entropy (MSE) to characterize the behaviour of the concentrations of the different gases present in ammonia synthesis; a soft sensor based on Principal Component Analysis (PCA) is applied to estimate these variables.
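The two core steps of multi-scale entropy, coarse-graining the signal at several scales and computing sample entropy at each scale, can be sketched as follows. This is a simplified illustration with assumed names, not the authors' implementation: for instance, the tolerance r is taken as an absolute value rather than a fraction of the signal's standard deviation, and the template counts are not bias-corrected.

```python
import math

def coarse_grain(series, scale):
    """MSE step 1: average non-overlapping windows of length `scale`."""
    n = len(series) // scale
    return [sum(series[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(series, m=2, r=0.2):
    """MSE step 2 (simplified): -ln(A/B), where B counts pairs of matching
    templates of length m and A pairs of length m + 1, matching under a
    Chebyshev-distance tolerance r."""
    def pairs(length):
        t = [series[i:i + length] for i in range(len(series) - length + 1)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(a - b) for a, b in zip(t[i], t[j])) <= r
        )
    b, a = pairs(m), pairs(m + 1)
    return math.log(b / a) if a > 0 and b > 0 else float("inf")

# Entropy profile of a toy concentration signal across scales 1..3.
signal = [0.0, 1.0, 0.2, 0.9, 0.1, 1.1, 0.0, 1.0, 0.3, 0.8, 0.1, 0.9]
profile = [sample_entropy(coarse_grain(signal, s)) for s in (1, 2, 3)]
```

Plotting such a profile against scale is what distinguishes genuinely complex dynamics from uncorrelated noise, whose entropy collapses as the scale grows.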

Keywords: ammonia synthesis, concentrations of different gases, soft sensor, multi-scale entropy, multivariate statistics

Procedia PDF Downloads 336
1571 Biological Activity of Bilberry Pomace

Authors: Gordana S. Ćetković, Vesna T. Tumbas Šaponjac, Sonja M. Djilas, Jasna M. Čanadanović-Brunet, Sladjana M. Stajčić, Jelena J. Vulić

Abstract:

Bilberry is one of the most important dietary sources of phenolic compounds, including anthocyanins, phenolic acids, flavonol glycosides and flavan-3-ols. These phytochemicals have different biological activities and therefore may improve our health condition. Also, anthocyanins are interesting to the food industry as colourants. In the present study, bilberry pomace, a by-product of juice processing, was used as a potential source of bioactive compounds. The contents of total phenolic acids, flavonoids and anthocyanins in bilberry pomace were determined by HPLC/UV-Vis. The biological activities of bilberry pomace were evaluated by reducing power (RP) and α-glucosidase inhibitory potential (α-GIP), and expressed as the RP0.5 value (the effective concentration of bilberry pomace extract assigned at 0.5 value of absorption) and the IC50 value (the concentration of bilberry pomace extract necessary to inhibit 50% of α-glucosidase enzyme activity). Total phenolic acids content was 807.12 ± 25.16 mg/100 g pomace, flavonoids 54.36 ± 1.83 mg/100 g pomace and anthocyanins 3426.18 ± 112.09 mg/100 g pomace. The RP0.5 value of bilberry pomace was 0.38 ± 0.02 mg/ml, while the IC50 value was 1.82 ± 0.11 mg/ml. These results have revealed the potential for valorization of bilberry juice production by-products for further industrial use as a rich source of bioactive compounds and natural colourants (mainly anthocyanins).

Keywords: bilberry pomace, phenolics, antioxidant activity, reducing power, α-glucosidase enzyme activity

Procedia PDF Downloads 599
1570 AI and the Future of Misinformation: Opportunities and Challenges

Authors: Noor Azwa Azreen Binti Abd. Aziz, Muhamad Zaim Bin Mohd Rozi

Abstract:

Moving towards the 4th Industrial Revolution, artificial intelligence (AI) is now more popular than ever. This subject is gaining significance every day and is continually expanding, often merging with other fields. Rather than remaining passive observers, we benefit from understanding modern technology by delving into its inner workings. However, in a world teeming with digital information, the impact of AI on the spread of disinformation has garnered significant attention. The dissemination of inaccurate or misleading information is referred to as misinformation, posing a serious threat to democratic society, public debate, and individual decision-making. This article delves into the connection between AI and the dissemination of false information, exploring its potential, risks, and ethical issues as AI technology advances. The rise of AI has ushered in a new era in the dissemination of misinformation, as AI-driven technologies are increasingly responsible for curating, recommending, and amplifying information on online platforms. While AI holds the potential to enhance the detection and mitigation of misinformation through natural language processing and machine learning, it also raises concerns about the amplification and propagation of false information. AI-powered deepfake technology, for instance, can generate hyper-realistic videos and audio recordings, making it increasingly challenging to discern fact from fiction.

Keywords: artificial intelligence, digital information, disinformation, ethical issues, misinformation

Procedia PDF Downloads 91
1569 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction

Authors: Ning Kang, Marius Doornenbal

Abstract:

Funding is a very important, if not crucial, resource for research projects. Usually, funding organizations publish a description of the funded research to define the scope of the award. Logically, we would expect research outcomes to align with this funding award, so we might be able to predict future research topics based on present funding award data. That said, it remains to be shown if and how future research topics can be predicted using funding information. In this paper, we extract funding project information and the abstracts of the papers those projects generated from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf and cosine similarity between these two sets of features. We show that the cosine similarity in the project versus project-generated-papers group is higher than in the project versus baseline group, and that the two groups of similarities are significantly different. Based on this result, we conclude that funding information correlates with the content of future research output for the funded project at the topical level. How funding really changes the course of science or of scientific careers remains an elusive question.
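The similarity computation can be sketched directly. The documents and noun phrases below are hypothetical stand-ins; in the study the tokens are noun phrases extracted from award texts and paper abstracts.

```python
import math
from collections import Counter

def tf_idf(docs):
    """tf-idf vectors for tokenised documents (lists of phrases)."""
    df = Counter(term for doc in docs for term in set(doc))
    n = len(docs)
    return [
        {t: count * math.log(n / df[t]) for t, count in Counter(doc).items()}
        for doc in docs
    ]

def cosine(u, v):
    """Cosine similarity between two sparse tf-idf vectors (dicts)."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical noun-phrase sets: a funding award, its resulting paper,
# and an unrelated baseline paper.
award = ["solar cells", "conversion efficiency", "perovskite"]
paper = ["solar cells", "conversion efficiency", "degradation"]
baseline = ["protein folding", "molecular dynamics", "simulation"]
vecs = tf_idf([award, paper, baseline])
```

The paper's claim corresponds to the award-to-own-paper similarity exceeding the award-to-baseline similarity, aggregated over many projects.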

Keywords: natural language processing, noun phrase, tf-idf, cosine similarity

Procedia PDF Downloads 245