Search results for: rice processing
1268 Comparison of Polyphenolic Profile of a Berry from Two Different Sources, Using an Optimized Extraction Method
Authors: G. Torabian, A. Fathi, P. Valtchev, F. Dehghani
Abstract:
The high polyphenol content of Sambucus nigra berries gives them strong potential for the production of nutraceutical products. Numerous factors influence the polyphenol content of the final products, including the berries’ source and the subsequent processing steps. The aim of this study is to compare the polyphenol content of berries from two different sources and to optimise the polyphenol extraction process from elderberries. Berries from source B had more desirable physical properties than those from source A; a single source B berry was double the size and weight (both wet and dry) of a source A berry. Despite these favourable physical characteristics, the polyphenolic profile of source B berries was inferior: source A berries had 2.3-fold higher total anthocyanin content and nearly twice the total phenolic and total flavonoid content of source B. Moreover, the results showed that almost 50 percent of the phenolic content of the berries is entrapped within the skin and pulp and potentially cannot be extracted by press juicing. To address this challenge and increase the total polyphenol yield of the extract, we used a cold-shock blade grinding method to break the cell walls. The results showed that using cultivars with higher phenolic content, as well as using the whole fruit including juice, skin and pulp, can increase polyphenol yield significantly and thus may boost the potential of elderberries as therapeutic products.
Keywords: different sources, elderberry, grinding, juicing, polyphenols
Procedia PDF Downloads 294
1267 Women Empowerment in Cassava Production: A Case Study of Southwest Nigeria
Authors: Adepoju A. A., Olapade-Ogunwole F., Ganiyu M. O.
Abstract:
This study examined women's empowerment in cassava production in southwest Nigeria. It assessed the contributions of five domains (decisions about agricultural production, decision-making power over productive resources, control over the use of income, leadership, and time allocation) to women's disempowerment, profiled the women based on their socio-economic features, and determined the factors influencing their disempowerment. Primary data were collected from women farmers and processors through structured questionnaires. Purposive sampling was used to select the LGAs and villages with large numbers of cassava farmers and processors, while cluster sampling was used to select 360 respondents in the study area. Descriptive statistics such as bar charts and percentages, the Women's Empowerment in Agriculture Index (WEAI), and a logit regression model were used to analyze the data collected. The results revealed that 63.88% of the women were disempowered. Lack of decision-making power over productive resources (36.47%) and lack of leadership skills (33.26%) contributed most to the women's disempowerment. About 85% of the married women were disempowered, while 76.92% of the women who participated in social group activities were empowered, more than their disempowered counterparts. The findings showed that women with more years of processing experience are more likely to be disempowered, while those who engage in farming as a primary livelihood activity and participate in social groups, among others, tend to be empowered. In view of this, it is recommended that women be encouraged to farm and to take part in social group activities.
Keywords: cassava, production, empowerment, southwest, Nigeria
Procedia PDF Downloads 58
1266 Increasing Efficiency of Own Used Fuel Gas by “LOTION” Method in Generating Systems PT. Pertamina EP Cepu Donggi Matindok Field in Central Sulawesi Province, Indonesia
Authors: Ridwan Kiay Demak, Firmansyahrullah, Muchammad Sibro Mulis, Eko Tri Wasisto, Nixon Poltak Frederic, Agung Putu Andika, Lapo Ajis Kamamu, Muhammad Sobirin, Kornelius Eppang
Abstract:
PC Prove LSM successfully improved the efficiency of own-used fuel gas with the "LOTION" method in the PT Pertamina EP Cepu Donggi Matindok generating system. The innovation of the "LOTION" (load priority selection) method in the generating system is a model that assigns priority to main and non-main equipment so that gas processing keeps running even with only one GTG in operation. The GTG operating system has been integrated, controlled, and monitored through PC programs and web-based access, in line with Industry 4.0. These improvements helped Donggi Matindok Field production reach 98.77 MMSCFD and made the field a candidate for the PROPER EMAS (Gold) rating in 2022-2023. Additional revenue from the increased efficiency of own-used gas amounts to USD 5.06 million per year, and maintenance cost savings (ABO) from reduced GTG running hours amount to USD 3.26 million per year. Continuity of fuel gas availability for the GTG generation system, at 3.83 MMSCFD, maintains the operational reliability of the plant, and wasted gas emissions to the environment were reduced by 33,810 tons of CO2 eq per year.
Keywords: LOTION method, load priority selection, fuel gas efficiency, gas turbine generator, reduce emissions
Procedia PDF Downloads 59
1265 Measurement of Solids Concentration in Hydrocyclone Using ERT: Validation Against CFD
Authors: Vakamalla Teja Reddy, Narasimha Mangadoddy
Abstract:
Hydrocyclones are used to separate particles into different size fractions in the mineral processing, chemical, and metallurgical industries. High-speed video imaging, laser Doppler anemometry (LDA), and X-ray and gamma-ray tomography have previously been used to measure two-phase flow characteristics in the cyclone. However, investigation of solids flow inside the cyclone is often impeded by the nature of the process: the slurry is opaque and the vessel walls are solid metal. In this work, dual-plane high-speed electrical resistance tomography (ERT) is used to measure hydrocyclone internal flow dynamics in situ. Experiments are carried out in a 3-inch hydrocyclone for feed solids concentrations in the range of 0-50%. ERT data analysis through optimized FEM mesh size and reconstruction algorithms on air-core and solids concentration tomograms is assessed. Results are presented in terms of the air-core diameter and solids volume fraction contours, obtained using Maxwell's equation, for various hydrocyclone operating parameters. ERT confirms that the area occupied by the air core and the wall solids conductivity levels decrease with increasing feed solids concentration. An algebraic slip mixture based multiphase computational fluid dynamics (CFD) model is used to predict the air-core size and the solids concentrations in the hydrocyclone, and the air-core size and mean solids volume fractions measured by ERT are validated against the CFD simulations.
Keywords: air-core, electrical resistance tomography, hydrocyclone, multi-phase CFD
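The conversion from conductivity tomograms to solids fraction can be illustrated with Maxwell's mixture relation; this minimal sketch assumes the common form of the relation for non-conducting spherical particles dispersed in a conducting liquid, which may differ in detail from the exact expression used in the study:

```python
def maxwell_solids_fraction(sigma_mix, sigma_liquid):
    """Dispersed-solids volume fraction from mixture conductivity via
    Maxwell's relation for non-conducting spherical particles:
    alpha = 2*(sigma_l - sigma_m) / (2*sigma_l + sigma_m)."""
    if sigma_liquid <= 0:
        raise ValueError("continuous-phase conductivity must be positive")
    return 2.0 * (sigma_liquid - sigma_mix) / (2.0 * sigma_liquid + sigma_mix)

# A clear liquid gives zero solids; a fully insulating mixture gives unity.
print(maxwell_solids_fraction(1.0, 1.0))   # 0.0
print(maxwell_solids_fraction(0.0, 1.0))   # 1.0
```

Applied pixel by pixel to a reconstructed conductivity tomogram, this yields the solids volume fraction contours referred to above.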
Procedia PDF Downloads 379
1264 A Study to Evaluate Some Physical and Mechanical Properties, Relevant in Estimating Energy Requirements in Grinding the Palm Kernel and Coconut Shells
Authors: Saheed O. Akinwale, Olufemi A. Koya
Abstract:
Based on the need to modify palm kernel shell (PKS) and coconut shell (CNS) for engineering applications, this study evaluated physical characteristics and fracture resistance relevant to estimating the energy required for comminution of the nutshells. The shells, obtained from local processing mills, were washed, sun-dried, and sorted to remove kernels, nuts, and other extraneous materials. Experiments were then conducted to determine the thickness, density, moisture content, and hardness of the shells. Fracture resistance was characterised by the average compressive load, stiffness, and toughness at the bio-yield point of specially prepared sections of the shells under quasi-static compression loading. The densities of the dried PKS at 7.12% and the CNS at 6.47% (wb) moisture content were 1291.20 and 1247.40 kg/m3, respectively, and the corresponding Brinell hardness numbers were 58.40 ± 1.91 and 56.33 ± 4.33. At comparable shell thickness, PKS and CNS exhibited similar physical properties, although CNS is larger in physical dimensions than PKS. The findings further showed that both shell types exhibited higher resistance to compression along the longitudinal axis than along the transverse axis. With compression along the longitudinal axis, the fracture forces were 1.41 ± 0.11 and 3.62 ± 0.09 kN; the bio-stiffnesses, 934.70 ± 67.03 and 1980.74 ± 8.92 kN/m; and the toughnesses, 2.17 ± 0.16 and 6.51 ± 0.15 kN·mm for PKS and CNS, respectively. With the estimated toughness of CNS higher than that of PKS, the study indicates a higher comminution energy requirement for CNS.
Keywords: bio-stiffness, coconut shell, comminution, crushing strength, energy requirement, palm kernel shell, toughness
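Toughness at the bio-yield point is the energy absorbed, i.e. the area under the force-deformation curve. A hedged sketch of that computation (the trapezoidal rule below is a generic numerical approach, not necessarily the exact procedure used in the study):

```python
def toughness_kN_mm(forces_kN, deflections_mm):
    """Energy absorbed up to the bio-yield point, i.e. the area under the
    force-deformation curve, approximated by the trapezoidal rule."""
    area = 0.0
    for i in range(1, len(forces_kN)):
        area += 0.5 * (forces_kN[i] + forces_kN[i - 1]) * \
                (deflections_mm[i] - deflections_mm[i - 1])
    return area

# A linear ramp to 2 kN over 2 mm stores 0.5 * 2 * 2 = 2 kN·mm.
print(toughness_kN_mm([0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))  # 2.0
```

On the same force-deformation record, bio-stiffness corresponds to the slope of the initial linear region.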
Procedia PDF Downloads 232
1263 Glutaraldehyde-Free Processing of Patch for Cardiovascular Repair Is Associated with Improved Outcomes on RVOT Repair, Rat Model
Authors: Parnaz Boodagh, Danila Vella, Antonio Damore, Laura Modica De Mohac, Sang-Ho Ye, Garret Coyan, Gaetano Burriesci, William Wagner, Federica Cosentino
Abstract:
The use of cardiac patches is among the main therapeutic solutions for cardiovascular diseases, a leading cause of mortality worldwide with an increasing trend, responsible for 19 million deaths in 2020. Several classes of biomaterials, of both synthetic origin and biological derivation, serve this purpose, and many bioengineered treatment alternatives have been proposed to satisfy two main requirements: providing structural support and promoting tissue remodeling. The objective of this paper is to compare the mechanical properties and the characterization of four cardiac patches: Adeka, PhotoFix, CorPatch, and CardioCel. In vitro and in vivo tests included biaxial, uniaxial, ball-burst, and suture-retention tests for mechanical characterization; 2D surface topography, 3D volume and microstructure, and histology assessments for structural characterization; in vitro tests to evaluate platelet deposition, calcium deposition, and macrophage polarization; and rat right ventricular outflow tract (RVOT) models at 8- and 16-week time points to characterize the patch-host interaction. Lastly, the four patches were used to produce four stented aortic valve prostheses, subjected to hydrodynamic assessment as well as durability testing to verify compliance with the relevant ISO standard.
Keywords: cardiac patch, cardiovascular disease, cardiac repair, blood contact biomaterial
Procedia PDF Downloads 153
1262 Presence of High Concentrations of Toxic Metals from the Collected Soil Samples Due to Excessive E-Waste Burning in the Various Areas of Moradabad City, U.P India
Authors: Aprajita Singh, Anamika Tripathi, Surya P. Dwivedi
Abstract:
Moradabad is a small town in the northern part of Uttar Pradesh, India, situated on the bank of the river Ramganga, and is known as the 'Brass City of India'. Environmental pollution there has increased steadily owing to the uncontrolled and inappropriate e-waste burning (recycling) activities reported in many areas of Moradabad. This paper analyses the toxic heavy metals, released by e-waste burning and many other recycling processes, that pollute the surrounding environment. All major e-waste burning sites are situated on the banks of the river, where the waste is burned in the open. Soil samples were collected from seven different sites (n = 3), including a control site; after digestion of the soil samples using a tri-acid mixture, the concentrations of toxic metals (Pb, As, Hg, Cd, Cr, Cu, Zn, Fe, and Ni) were determined by ICP-AAS. The study found that the soil of these areas contains relatively high levels of toxic metals, in the order Cu > Fe > Pb > Cd > Cr > Zn > As > Hg. The concentrations of Cd, Pb, Cr, As, and Zn in the majority of samples exceeded the maximum levels set by the WHO. This study thus shows that uncontrolled e-waste processing operations cause serious pollution of local soil, and the release of toxic metals into the environment also adversely affects the health of people living in nearby areas, making them more prone to various harmful diseases.
Keywords: brass city, environment pollution, e-waste, toxic heavy metals
Procedia PDF Downloads 300
1261 Heat Treatment of Additively Manufactured Hybrid Rocket Fuel Grains
Authors: Jim J. Catina, Jackee M. Gwynn, Jin S. Kang
Abstract:
Additive manufacturing (AM) for hybrid rocket engines is becoming increasingly attractive due to its ability to create complex grain configurations with improved regression rates compared to cast grains. However, the presence of microvoids in parts produced through the additive manufacturing method of fused deposition modeling (FDM) results in a lower fuel density and is believed to cause a decrease in regression rate relative to ideal performance. In this experiment, FDM was used to create hybrid rocket fuel grains with a star configuration composed of acrylonitrile butadiene styrene (ABS). Testing was conducted to determine the effect of heat treatment as a post-processing method for improving the combustion performance of FDM-manufactured hybrid rocket fuel grains. As a control, three ABS star-configuration grains were printed using FDM and hot-fired using gaseous oxygen (GOX) as the oxidizer, with parameters such as thrust and mass flow rate measured. Three identical grains were then heat treated to varying degrees and hot-fired under the same conditions as the control grains. This paper quantitatively describes the improvement in engine performance resulting from heat treatment of the AM hybrid fuel grain. Engine performance is measured here by specific impulse, determined from the thrust measurements collected in testing.
Keywords: acrylonitrile butadiene styrene, additive manufacturing, fused deposition modeling, heat treatment
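Specific impulse follows directly from the measured thrust and propellant mass flow rate via the standard relation Isp = F / (m_dot * g0); a minimal sketch of that calculation (the numbers below are illustrative, not test data from the paper):

```python
G0 = 9.80665  # standard gravity, m/s^2

def specific_impulse(thrust_N, mdot_kg_per_s):
    """Specific impulse in seconds from thrust and total mass flow rate."""
    return thrust_N / (mdot_kg_per_s * G0)

# 980.665 N of thrust at a total mass flow of 0.5 kg/s gives an Isp of 200 s.
print(specific_impulse(980.665, 0.5))  # 200.0
```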
Procedia PDF Downloads 117
1260 Large Scale Production of Polyhydroxyalkanoates (PHAs) from Waste Water: A Study of Techno-Economics, Energy Use, and Greenhouse Gas Emissions
Authors: Cora Fernandez Dacosta, John A. Posada, Andrea Ramirez
Abstract:
Polyhydroxyalkanoates, a family of biodegradable polymers, are interesting substitutes for conventional fossil-based plastics. However, the manufacturing costs and environmental impacts associated with their production via intracellular bacterial fermentation depend strongly on the raw material used and on the energy consumed during the extraction process, limiting their potential for commercialization. Industrial wastewater is studied in this paper as a promising alternative feedstock for waste valorization. Based on results from laboratory- and pilot-scale experiments, a conceptual process design, a techno-economic analysis, and a life cycle assessment are developed for the large-scale production of the most common polyhydroxyalkanoate, polyhydroxybutyrate (PHB). Intracellular polyhydroxybutyrate is obtained via fermentation by the microbial community present in industrial wastewater, and the downstream processing is based on chemical digestion with surfactant and hypochlorite. The economic potential and environmental performance results help identify bottlenecks and the best opportunities for scaling up the process prior to industrial implementation. The outcome of this research indicates that the fermentation of wastewater to PHB has advantages over traditional PHA production from sugars because the raw material carries essentially no environmental burden or financial cost in the bioplastic production process. Nevertheless, process optimization is still required to compete with the petrochemical counterparts.
Keywords: circular economy, life cycle assessment, polyhydroxyalkanoates, waste valorization
Procedia PDF Downloads 457
1259 Monitoring the Drying and Grinding Process during Production of Celitement through a NIR-Spectroscopy Based Approach
Authors: Carolin Lutz, Jörg Matthes, Patrick Waibel, Ulrich Precht, Krassimir Garbev, Günter Beuchle, Uwe Schweike, Peter Stemmermann, Hubert B. Keller
Abstract:
Online measurement of product quality is a challenging task in cement production, especially in the production of Celitement, a novel environmentally friendly hydraulic binder. The mineralogy and chemical composition of clinker in ordinary Portland cement production are measured by X-ray diffraction (XRD) and X-ray fluorescence (XRF), where only crystalline constituents can be detected. Only a small part of the Celitement components, however, can be measured via XRD, because most constituents have an amorphous structure. This paper describes the development of algorithms suitable for online monitoring of the final processing step of Celitement based on NIR data. For calibration, intermediate products were dried at different temperatures and ground for variable durations. The products were analyzed using XRD and thermogravimetric analyses together with NIR spectroscopy to investigate the dependency of the NIR signal on the drying and milling processes. As a result, different characteristic parameters have been defined. A short overview of the Celitement process and the challenging tasks of online measurement and evaluation of product quality will be presented. Subsequently, methods for the systematic development of near-infrared calibration models and the determination of the final calibration model will be introduced. The application of the model to experimental data illustrates that NIR spectroscopy allows a quick and sufficiently exact determination of crucial process parameters.
Keywords: calibration model, celitement, cementitious material, NIR spectroscopy
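As an illustration of the calibration idea only (the authors' actual models are built on full NIR spectra, typically with multivariate methods), a univariate linear calibration relating a single spectral feature to a reference value can be fitted by ordinary least squares:

```python
def fit_linear_calibration(x, y):
    """Ordinary least-squares fit of y = a*x + b, e.g. relating one NIR
    feature to a reference value from XRD or thermogravimetric analysis."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Perfectly linear calibration data is recovered exactly.
a, b = fit_linear_calibration([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(a, b)  # 2.0 1.0
```

In practice a full-spectrum method such as partial least squares regression would replace this single-feature fit.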
Procedia PDF Downloads 500
1258 Thermochemical Modelling for Extraction of Lithium from Spodumene and Prediction of Promising Reagents for the Roasting Process
Authors: Allen Yushark Fosu, Ndue Kanari, James Vaughan, Alexandre Changes
Abstract:
Spodumene is a lithium-bearing mineral of great interest due to the increasing demand for lithium in emerging electric and hybrid vehicles. The conventional method of processing the mineral requires the thermal transformation of the α-phase to the β-phase, followed by roasting with suitable reagents to produce lithium salts for downstream processes. The selection of an appropriate roasting reagent is key to the success of the process and to overall lithium recovery. Much research has been conducted to identify good reagents for process efficiency, leading to sulfation, alkaline, chlorination, fluorination, and carbonizing routes for lithium recovery from the mineral. HSC Chemistry is thermochemical software that can be used to model metallurgical process feasibility and predict possible reaction products prior to experimental investigation. The software was employed to investigate and explain the behaviour of the various reagents reported in the literature for spodumene roasting up to 1200°C. The simulation indicated that all reagents used for sulfation and alkaline roasting were feasible in the direction of lithium salt production. Chlorination was feasible only when Cl2 and CaCl2 were used as chlorinating agents, not NaCl or KCl. Depending on the lithium salt formed during carbonizing and fluorination, the process was either spontaneous or non-spontaneous across the temperature range investigated. The HSC software was further used to simulate and predict promising reagents that may be equally effective for roasting the mineral for efficient lithium extraction but have not yet been considered by researchers.
Keywords: thermochemical modelling, HSC chemistry software, lithium, spodumene, roasting
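The spontaneity screening that such thermochemical software performs rests on the Gibbs energy criterion, ΔG = ΔH − TΔS < 0. A toy sketch of that check (the ΔH and ΔS values below are hypothetical, not reaction data from the study):

```python
def gibbs_energy_kJ(dH_kJ, dS_J_per_K, T_K):
    """Gibbs free energy change, Delta G = Delta H - T * Delta S,
    with Delta H in kJ and Delta S in J/K."""
    return dH_kJ - T_K * dS_J_per_K / 1000.0

def is_spontaneous(dH_kJ, dS_J_per_K, T_K):
    return gibbs_energy_kJ(dH_kJ, dS_J_per_K, T_K) < 0.0

# Hypothetical endothermic roasting reaction (dH = +100 kJ, dS = +150 J/K):
# spontaneous only above T = 100000 / 150, i.e. roughly 667 K.
print(is_spontaneous(100.0, 150.0, 500.0))   # False
print(is_spontaneous(100.0, 150.0, 1200.0))  # True
```

Sweeping T over the roasting range and plotting ΔG per candidate reagent reproduces the kind of feasibility comparison described above.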
Procedia PDF Downloads 159
1257 Exploring Subjective Simultaneous Mixed Emotion Experiences in Middle Childhood
Authors: Esther Burkitt
Abstract:
Background: Evidence is mounting that mixed emotions can be experienced simultaneously in different ways across the lifespan. Four types of patterns of simultaneous mixed emotions (sequential, prevalent, highly parallel, and inverse) have been identified in middle childhood and adolescence. Moreover, recognition of these experiences tends to develop first when children consider peers rather than the self. This evidence from children and adolescents is based on examining the presence of experience types specified in adulthood. The present study therefore applied an exhaustive coding scheme to investigate whether children experience types of simultaneous mixed emotional experience not previously identified. Methodology: One hundred and twenty children (60 girls) aged 7 years 1 month to 9 years 2 months (M = 8 years 1 month; SD = 10 months) were recruited from mainstream schools across the UK. Two age groups were formed (youngest, n = 61, 7 years 1 month to 8 years 1 month; oldest, n = 59, 8 years 2 months to 9 years 2 months) and allocated to one of two conditions, hearing vignettes describing happy and sad mixed-emotion events involving either an age- and gender-matched protagonist or themselves. Results: Loglinear analyses identified new types of flexuous, vertical, and other experiences, along with the established sequential, prevalent, highly parallel, and inverse types. Older children recognised more complex experiences in the other condition than in the self condition. Conclusion: Several additional types of simultaneous mixed emotions are recognised in middle childhood. The theoretical relevance of simultaneous mixed-emotion processing in childhood is considered, and the potential utility of the findings for emotion assessments is discussed.
Keywords: emotion, childhood, self, other
Procedia PDF Downloads 78
1256 Graph-Based Semantical Extractive Text Analysis
Authors: Mina Samizadeh
Abstract:
In the past few decades, there has been an explosion in the amount of available data produced from various sources on different topics. The availability of this enormous amount of data requires effective computational tools to explore it, which has led to intense interest in the research community in developing computational methods for processing text data. One line of study focuses on condensing text so that we can reach a higher level of understanding in a shorter time. The two main tasks for this are keyword extraction and text summarization. In keyword extraction, we are interested in finding the key words of a text, which familiarizes us with its general topic. In text summarization, we are interested in producing a short text that includes the important information in the document. The TextRank algorithm, an unsupervised learning method that extends PageRank (the base algorithm of the Google search engine for ranking pages), has shown its efficacy in large-scale text mining, especially for text summarization and keyword extraction. This algorithm can automatically extract the important parts of a text (keywords or sentences) and return them as a result. However, it neglects the semantic similarity between different parts of the text. In this work, we improved the results of the TextRank algorithm by incorporating semantic similarity between parts of the text. Beyond keyword extraction and text summarization, we also developed a topic clustering algorithm based on our framework, which can be used on its own or as part of generating the summary to overcome coverage problems.
Keywords: keyword extraction, n-gram extraction, text summarization, topic clustering, semantic analysis
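The underlying idea can be sketched in a few lines: run PageRank over a sentence graph whose edge weights are pairwise similarities. A plain bag-of-words cosine stands in here for the richer semantic similarity the work incorporates (an embedding-based measure would slot into the same place):

```python
import math
import re
from collections import Counter

def cosine_sim(a, b):
    """Bag-of-words cosine similarity between two token lists."""
    ca, cb = Counter(a), Counter(b)
    num = sum(ca[w] * cb[w] for w in ca)
    den = math.sqrt(sum(v * v for v in ca.values())) * \
          math.sqrt(sum(v * v for v in cb.values()))
    return num / den if den else 0.0

def textrank_order(sentences, d=0.85, iters=50):
    """Rank sentence indices by weighted PageRank over a similarity graph."""
    toks = [re.findall(r"\w+", s.lower()) for s in sentences]
    n = len(sentences)
    w = [[cosine_sim(toks[i], toks[j]) if i != j else 0.0
          for j in range(n)] for i in range(n)]
    out = [sum(row) for row in w]  # total outgoing weight per node
    score = [1.0 / n] * n
    for _ in range(iters):
        score = [(1 - d) / n +
                 d * sum(w[j][i] / out[j] * score[j]
                         for j in range(n) if out[j] > 0)
                 for i in range(n)]
    return sorted(range(n), key=lambda i: -score[i])  # best sentence first
```

Taking the top-ranked sentences yields an extractive summary; swapping cosine_sim for a semantic measure gives the improved variant described above.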
Procedia PDF Downloads 71
1255 Investor Sentiment and Satisfaction in Automated Investment: A Sentiment Analysis of Robo-Advisor Platforms
Authors: Vertika Goswami, Gargi Sharma
Abstract:
The rapid evolution of fintech has led to the rise of robo-advisor platforms that use artificial intelligence (AI) and machine learning to offer personalized investment solutions efficiently and cost-effectively. This research paper conducts a comprehensive sentiment analysis of investor experiences with these platforms, employing natural language processing (NLP) and sentiment classification techniques. The study investigates investor perceptions, engagement, and satisfaction, identifying key drivers of positive sentiment such as clear communication, low fees, consistent returns, and robust security. Conversely, negative sentiment is linked to issues such as inconsistent performance, hidden fees, poor customer support, and a lack of transparency. The analysis reveals that addressing these pain points, through improved transparency, enhanced customer service, and ongoing technological advancement, can significantly boost investor trust and satisfaction. This paper contributes valuable insights to the fields of behavioral finance and fintech innovation, offering actionable recommendations for stakeholders, practitioners, and policymakers. Future research should explore the long-term impact of these factors on investor loyalty, the role of emerging technologies, and the effects of ethical investment choices and regulatory compliance on investor sentiment.
Keywords: artificial intelligence in finance, automated investment, financial technology, investor satisfaction, investor sentiment, robo-advisors, sentiment analysis
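As a toy illustration of the sentiment-classification step only (a bare lexicon count, far simpler than the NLP pipeline the study applies; the word lists are invented for the example):

```python
import re

# Illustrative lexicons only; real robo-advisor reviews need a trained model.
POSITIVE = {"clear", "low", "consistent", "robust", "secure", "transparent"}
NEGATIVE = {"hidden", "inconsistent", "poor", "opaque", "confusing"}

def sentiment_score(review):
    """Positive minus negative lexicon hits; >0 positive, <0 negative."""
    words = re.findall(r"\w+", review.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

print(sentiment_score("Clear communication and low fees"))       # 2
print(sentiment_score("Hidden fees and poor customer support"))  # -2
```

Aggregating such scores over many reviews, per platform and per topic, yields the kind of driver analysis summarized above.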
Procedia PDF Downloads 171254 Artificial Intelligence in Art and Other Sectors: Selected Aspects of Mutual Impact
Authors: Justyna Minkiewicz
Abstract:
Artificial intelligence (AI) applied in the arts may influence the development of AI knowledge in other sectors and, in turn, mutual collaboration with the artistic environment; such collaboration may in turn shape the development of art projects. The paper presents qualitative research outcomes based on in-depth interviews (IDI) within the marketing sector in Poland and on desk research. Art is a reflection of the spirit of our times, and we are now experiencing a significant acceleration in the development of technologies and their use across sectors. The leading technologies contributing to the development of the economy, including the creative sector, are artificial intelligence, blockchain, extended reality, voice processing, and virtual beings. Artificial intelligence is one of the leading technologies, developed over several decades, that is currently attracting a high level of interest and use in various sectors. However, the research shows that awareness of artificial intelligence and its wide application in various sectors remains low. The study will show how artists use artificial intelligence in their art projects and how this can be translated into business practice, while also raising awareness of the need for businesses to draw inspiration from the artistic environment. The research proved that there is still a need to popularize knowledge about this technology, which is crucial for many sectors. Art projects are tools for developing the knowledge and awareness of society as well as of various sectors, and artists may benefit from such collaboration. The paper covers selected aspects of these mutual relations, areas of possible inspiration, and possible transfers of technological solutions, including AI applications in creative industries such as advertising and film, image recognition in art, and projects from different sectors.
Keywords: artificial intelligence, business, art, creative industry, technology
Procedia PDF Downloads 1051253 Starchy Wastewater as Raw Material for Biohydrogen Production by Dark Fermentation: A Review
Authors: Tami A. Ulhiza, Noor I. M. Puad, Azlin S. Azmi, Mohd. I. A. Malek
Abstract:
The high chemical oxygen demand (COD) of starchy waste can be harmful to the environment. In common practice, starch processing wastewater is discharged into rivers without proper treatment, yet starchy waste still contains complex sugars and organic acids. With the right pretreatment method, the complex sugars can be hydrolyzed into more readily digestible sugars, which can then be converted into more valuable products. At the same time, global energy demand keeps growing, and continued reliance on fossil fuels as the main source of energy can lead to energy scarcity. Hydrogen is a renewable form of energy that can serve as an alternative in the future; moreover, hydrogen is clean and carries the highest energy content of any fuel. Biohydrogen produced from waste has significant advantages over chemical production methods. One of the major problems in biohydrogen production is the cost of the raw material. Carbohydrate-rich starchy wastes such as tapioca, maize, wheat, potato, and sago wastes are promising candidates for use as substrates in biohydrogen production, providing cheap energy generation with simultaneous waste treatment. This paper therefore reviews the variety of starchy wastes that have been widely used to synthesize biohydrogen. The scope includes the source of the waste, the hydrogen yield performance, the pretreatment method, and the type of culture suitable for starchy waste.
Keywords: biohydrogen, dark fermentation, renewable energy, starchy waste
Procedia PDF Downloads 223
1252 Comparative Study on Sensory Profiles of Liquor from Different Dried Cocoa Beans
Authors: Khairul Bariah Sulaiman, Tajul Aris Yang
Abstract:
Malaysian dried cocoa beans have been reported to have low flavour quality and are often sold at discounted prices. Various efforts have been made to improve the quality of Malaysian beans, among them the introduction of the shallow-box fermentation technique and pulp preconditioning through pod storage. However, after nearly four decades of effort, Malaysian cocoa farmers still receive lower prices for their beans. This study was therefore carried out to assess the flavour quality of dried cocoa beans produced by shallow-box fermentation, alone and in combination with pod storage, compared with dried cocoa beans obtained from Ghana. A total of eight samples of dried cocoa beans was used: one sample of Ghanaian beans (coded no. 8), while the rest were Malaysian cocoa beans with different post-harvest processing (coded nos. 1-7). Cocoa liquor was prepared from all samples using the prescribed techniques, and sensory evaluation was carried out using the quantitative descriptive analysis (QDA) method on a 0-10 scale by trained Malaysian Cocoa Board panelists. Sensory evaluation showed that the cocoa attribute of the liquors ranged from 3.5 to 5.3, bitterness from 3.4 to 4.6, and astringency from 3.9 to 5.5, while all cocoa liquors had acid or sourness attributes ranging from 1.6 to 3.6. In general, the cocoa liquor prepared from sample no. 4 had an almost identical flavour profile, with no significant difference at p < 0.05 from the Ghanaian sample in most flavour attributes, compared with the other six samples.
Keywords: cocoa beans, flavour, fermentation, shallow box, pods storage
Procedia PDF Downloads 394
1251 Automated User Story Driven Approach for Web-Based Functional Testing
Authors: Mahawish Masud, Muhammad Iqbal, M. U. Khan, Farooque Azam
Abstract:
Manually writing test cases from functional requirements is a time-consuming task; such test cases are not only difficult to write but also challenging to maintain. Test cases can be drawn from functional requirements expressed in natural language, but manual test case generation is inefficient and error-prone. In this paper, we present a systematic procedure that automatically derives test cases from user stories. The user stories are specified in a restricted natural language using a well-defined template, and we present a detailed methodology for writing these test-ready user stories. Our tool, "Test-o-Matic", automatically generates the test cases by processing the restricted user stories, and the generated test cases are executed using the open-source Selenium IDE. We evaluate our approach on a case study, an open-source web-based application. The effectiveness of the approach is evaluated by seeding faults into the case study using known mutation operators. Results show that test case generation from restricted user stories is a viable approach for the automated testing of web applications.
Keywords: automated testing, natural language, restricted user story modeling, software engineering, software testing, test case specification, transformation and automation, user story, web application testing
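The restricted-template idea can be sketched as follows. The "As a <role>, I want to <action>, so that <benefit>" template and the generated test-name convention below are illustrative assumptions, not the actual Test-o-Matic grammar:

```python
import re

# Hypothetical restricted template; the paper's actual template may differ.
STORY_RE = re.compile(
    r"As an? (?P<role>.+?), I want to (?P<action>.+?), so that (?P<benefit>.+)",
    re.IGNORECASE,
)

def story_to_test(story):
    """Derive a test-case skeleton from a restricted user story."""
    m = STORY_RE.match(story.strip())
    if m is None:
        raise ValueError("story does not follow the restricted template")
    name = "test_" + re.sub(r"\W+", "_", m.group("action").lower()).strip("_")
    return {"name": name,
            "actor": m.group("role"),
            "expected_outcome": m.group("benefit")}

story = "As a registered user, I want to reset my password, so that I regain access"
print(story_to_test(story)["name"])  # test_reset_my_password
```

Because the language is restricted, the parse is deterministic; the resulting skeleton can then be filled with the web steps that a tool like Selenium IDE executes.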
Procedia PDF Downloads 3871250 Functional Yoghurt Enriched with Microencapsulated Olive Leaves Extract Powder Using Polycaprolactone via Double Emulsion/Solvent Evaporation Technique
Authors: Tamer El-Messery, Teresa Sanchez-Moya, Ruben Lopez-Nicolas, Gaspar Ros, Esmat Aly
Abstract:
Olive leaves (OLs), the main by-product of the olive oil industry, contain a considerable amount of phenolic compounds, and the exploitation of these compounds is a current trend in food processing. In this study, OLs polyphenols were microencapsulated with polycaprolactone (PCL) and utilized in formulating a novel functional yoghurt. The PCL microcapsules were characterized by scanning electron microscopy and Fourier transform infrared spectrometry. Their total phenolic content (TPC), total flavonoid content (TFC), antioxidant activities (DPPH, FRAP, ABTS), and polyphenol bioaccessibility were measured after the oral, gastric, and intestinal steps of in vitro digestion. Four yoghurt formulations (containing 0, 25, 50, and 75 mg of PCL microspheres/100 g yoghurt) were evaluated for pH, acidity, syneresis, viscosity, and color during storage. In vitro digestion significantly affected the phenolic composition of the non-encapsulated extract while having a lower impact on the encapsulated phenolics. Encapsulation provided greater protection for the OLs extract, and the greatest release was observed at the intestinal phase. Yoghurt with PCL microspheres had lower viscosity, syneresis, and color parameters compared to the control yoghurt. Thus, OLs represent a valuable and cheap source of polyphenols which can be successfully applied, in microencapsulated form, to formulate functional yoghurt.Keywords: yoghurt quality attributes, olive leaves, phenolic and flavonoids compounds, antioxidant activity, polycaprolactone as microencapsulant
Procedia PDF Downloads 1421249 Light-Weight Network for Real-Time Pose Estimation
Authors: Jianghao Hu, Hongyu Wang
Abstract:
An effective and efficient human pose estimation algorithm is essential for real-time pose estimation on mobile devices. This paper proposes a light-weight human keypoint detection algorithm, the Light-Weight Network for Real-Time Pose Estimation (LWPE). LWPE uses a light-weight backbone network and depthwise separable convolutions to reduce parameters and lower latency, and a feature pyramid network (FPN) to fuse high-resolution, semantically weak features with low-resolution, semantically strong features. With multi-scale prediction, the result predicted from each lower-resolution feature map is stacked onto the adjacent higher-resolution feature map, providing intermediate supervision and continuously refining the results; the keypoint coordinates predicted at the highest resolution are used as the final output of the network. For keypoints that are difficult to predict, LWPE adopts an online hard keypoint mining strategy to focus training on them. The proposed algorithm achieves excellent performance on the single-person dataset selected from the AI (artificial intelligence) challenge dataset. The algorithm maintains high-precision performance even though the model contains only 3.9M parameters, and it can run at 225 frames per second (FPS) on a generic graphics processing unit (GPU).Keywords: depthwise separable convolutions, feature pyramid network, human pose estimation, light-weight backbone
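The parameter saving from depthwise separable convolutions, one of the techniques LWPE relies on, can be checked with a quick count (the channel and kernel sizes below are arbitrary examples, not LWPE's actual layer sizes):

```python
def conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameter count of a standard k x k convolution (no bias)."""
    return k * k * c_in * c_out

def depthwise_separable_params(c_in: int, c_out: int, k: int) -> int:
    """Depthwise k x k convolution (one filter per input channel)
    followed by a 1 x 1 pointwise convolution (no bias)."""
    return k * k * c_in + c_in * c_out

std = conv_params(128, 128, 3)                  # 147456 parameters
dws = depthwise_separable_params(128, 128, 3)   # 1152 + 16384 = 17536
print(std, dws, round(std / dws, 1))            # roughly an 8x reduction
```

This roughly k*k-fold reduction per layer is what lets a network like LWPE stay under a few million parameters.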
Procedia PDF Downloads 1541248 Business Continuity Risk Review for a Large Petrochemical Complex
Authors: Michel A. Thomet
Abstract:
A discrete-event simulation model was used to perform a Reliability-Availability-Maintainability (RAM) study of a large petrochemical complex which included sixteen process units and seven feeds and intermediate streams. All the feeds and intermediate streams have associated storage tanks, so that if a processing unit fails and shuts down, the downstream units can keep producing their outputs. This also helps the upstream units, which do not have to reduce their outputs but can store their excess production until the failed unit restarts. Each process unit and each pipe section carrying the feeds and intermediate streams has a probability of failure with an associated distribution and a Mean Time Between Failures (MTBF), as well as a distribution of the time to restore and a Mean Time To Restore (MTTR). The utilities supporting the process units can also fail and have their own distributions with specific MTBF and MTTR. The model runs cover ten years or more and are repeated several times to obtain statistically relevant results. One of the main results is the On-Stream Factor (OSF) of each process unit (the percentage of hours in a year when the unit is running in nominal conditions). One objective of the study was to investigate whether the storage capacity for each of the feeds and intermediate streams was adequate. This was done by increasing the storage capacities in several steps and running the simulation to see whether, and by how much, the OSFs improved. Other objectives were to determine whether utility failures were an important factor in the overall OSF, and what could be done to reduce their failure rates through redundant equipment.Keywords: business continuity, on-stream factor, petrochemical, RAM study, simulation, MTBF
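The alternating failure/restore behaviour behind an On-Stream Factor can be illustrated with a toy single-unit renewal simulation; the study's actual model covers sixteen units, storage tanks, and utilities, and the MTBF/MTTR values below are placeholders:

```python
import random

def simulate_osf(mtbf_h: float, mttr_h: float, years: int = 10, seed: int = 1) -> float:
    """Toy renewal simulation of one process unit: alternate
    exponentially distributed up-times (mean MTBF) and down-times
    (mean MTTR). OSF = fraction of the horizon the unit is running.
    Illustrative only; not the paper's full multi-unit model."""
    rng = random.Random(seed)
    horizon = years * 8760.0  # hours
    t, up_time = 0.0, 0.0
    while t < horizon:
        up = rng.expovariate(1.0 / mtbf_h)
        up_time += min(up, horizon - t)
        t += up
        if t >= horizon:
            break
        t += rng.expovariate(1.0 / mttr_h)  # unit down for repair
    return up_time / horizon

# Theoretical steady-state availability is MTBF / (MTBF + MTTR).
print(f"simulated OSF over 10 years: {simulate_osf(2000, 48):.3f}")
```

Repeating such runs many times, as the study does, narrows the confidence interval around the theoretical availability.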
Procedia PDF Downloads 2191247 Space Time Adaptive Algorithm in Bi-Static Passive Radar Systems for Clutter Mitigation
Authors: D. Venu, N. V. Koteswara Rao
Abstract:
Space-time adaptive processing (STAP) is an effective tool for detecting a moving target in spaceborne or airborne radar systems. Airborne passive radar systems utilize broadcast, navigation, and communication signals to perform various surveillance tasks and have attracted significant interest for some time, since they are cost-effective compared to conventional active radar systems. Moreover, the small number of secondary samples required for effective clutter suppression in bi-static passive radar leaves abundant illuminator resources for passive surveillance radar systems. This paper presents a framework for incorporating knowledge sources directly into the space-time beamformer of airborne adaptive radars. The proposed STAP algorithm for clutter mitigation in passive bi-static radar better quantifies the reduction in sample size by amalgamating an earlier data bank with existing radar data sets. We also propose a novel method to estimate the clutter covariance matrix and perform STAP for efficient clutter suppression based on a small sample size. Furthermore, the effectiveness of the proposed algorithm is verified using MATLAB simulations in order to validate the STAP algorithm for passive bi-static radar. In conclusion, this study highlights the importance, for various applications, of augmenting traditional active radars using cost-effective measures.Keywords: bistatic radar, clutter, covariance matrix, passive radar, STAP
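The core STAP computation, estimating a clutter-plus-noise covariance matrix from secondary samples and solving for adaptive beamformer weights, can be sketched as follows; the dimensions, diagonal loading, and synthetic Gaussian data are illustrative assumptions, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8    # spatio-temporal degrees of freedom (illustrative)
K = 32   # number of secondary (training) samples

# Secondary data: clutter-plus-noise snapshots, complex Gaussian here
# purely for illustration; real passive-radar clutter is structured.
X = (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K))) / np.sqrt(2)

# Sample covariance matrix with diagonal loading, a common remedy
# when only a small number of secondary samples is available.
R_hat = X @ X.conj().T / K + 0.1 * np.eye(N)

# Space-time steering vector at the target's assumed angle-Doppler cell.
s = np.exp(1j * np.pi * np.arange(N) * 0.25)

# Adaptive weights: w proportional to R^-1 s (minimum-variance filter),
# normalized for a unit (distortionless) response toward the target.
w = np.linalg.solve(R_hat, s)
w = w / (w.conj() @ s)

print(round(abs(w.conj() @ s), 6))  # -> 1.0, distortionless response
```

With fewer secondary samples K, the quality of R_hat degrades, which is why sample-size reduction is a central concern in passive bi-static STAP.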
Procedia PDF Downloads 2951246 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This burgeoning field calls for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. Exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, "CombinedScore", as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using a harmonic mean. Furthermore, prediction bias was explored through an equality-of-opportunity violation measurement, and model performance was further improved through calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
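The entropy-plus-harmonic-mean idea behind a CombinedScore-style criterion can be sketched as follows; the abstract does not give the exact formula, so the form below (normalized confidence from OOD entropy, blended with source accuracy) is an assumption:

```python
import numpy as np

def entropy(probs: np.ndarray) -> np.ndarray:
    """Shannon entropy of predicted class probabilities, a common
    randomness score for unlabeled OOD data."""
    p = np.clip(probs, 1e-12, 1.0)
    return -(p * np.log(p)).sum(axis=-1)

def combined_score(ood_probs: np.ndarray, source_accuracy: float) -> float:
    """Assumed CombinedScore-style criterion: turn mean OOD entropy
    into a [0, 1] confidence (lower entropy -> higher confidence) and
    blend it with labeled source accuracy via a harmonic mean."""
    max_h = np.log(ood_probs.shape[-1])  # entropy of the uniform distribution
    confidence = 1.0 - entropy(ood_probs).mean() / max_h
    return 2 * source_accuracy * confidence / (source_accuracy + confidence)

# Two OOD predictions over three classes, plus a source-domain accuracy.
probs = np.array([[0.7, 0.2, 0.1], [0.5, 0.3, 0.2]])
print(round(combined_score(probs, source_accuracy=0.9), 3))
```

The harmonic mean penalizes models that score well on only one of the two signals, which matches the abstract's motivation for blending them.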
Procedia PDF Downloads 1241245 Bottleneck Modeling in Information Technology Service Management
Authors: Abhinay Puvvala, Veerendra Kumar Rai
Abstract:
A bottleneck situation arises when the outflow is less than the inflow in a pipe-like setup. A more practical interpretation of bottlenecks emphasizes the realization of Service Level Objectives (SLOs) at given workloads. Our approach detects two key aspects of bottlenecks: when and where. To identify ‘when’, we continuously poll certain key metrics such as resource utilization, processing time, request backlog, and throughput at the system level. As the workload is gradually increased in discrete steps, a bottleneck situation arises when the slope of the expected sojourn time at a workload is greater than ‘K’ times the slope of the expected sojourn time at the previous step. ‘K’ defines the threshold condition and is computed based on the system’s service level objectives. The second aspect of our approach is to identify the location of the bottleneck. In multi-tier systems with a complex network of layers, locating the bottleneck that affects overall system performance is a challenging problem. We stage the system by varying the workload incrementally to draw a correlation between load increase and system performance up to the point where Service Level Objectives are violated. During the staging process, multiple metrics are monitored at the hardware and application levels. Correlations are drawn between the metrics and overall system performance; these correlations, along with the Service Level Objectives, are used to arrive at threshold conditions for each metric. Subsequently, the same method used to identify when a bottleneck occurs is applied to the metrics data with these threshold conditions to locate bottlenecks.Keywords: bottleneck, workload, service level objectives (SLOs), throughput, system performance
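The 'when' criterion, the slope of expected sojourn time exceeding K times the previous step's slope, can be sketched directly (the workload and sojourn-time values below are made-up illustrations):

```python
def detect_bottleneck_step(workloads, sojourn_times, k):
    """Return the index of the first workload step at which the slope
    of expected sojourn time exceeds k times the slope at the previous
    step, per the 'when' criterion; None if no such step exists."""
    for i in range(2, len(workloads)):
        prev_slope = (sojourn_times[i - 1] - sojourn_times[i - 2]) / \
                     (workloads[i - 1] - workloads[i - 2])
        slope = (sojourn_times[i] - sojourn_times[i - 1]) / \
                (workloads[i] - workloads[i - 1])
        if prev_slope > 0 and slope > k * prev_slope:
            return i
    return None

# Illustrative staging data: sojourn time grows slowly, then sharply.
loads = [100, 200, 300, 400, 500]
sojourn = [1.0, 1.1, 1.2, 2.5, 6.0]
print(detect_bottleneck_step(loads, sojourn, k=3))  # -> 3 (the 400 step)
```

Running the same check per-metric with metric-specific thresholds is what localizes the bottleneck to a tier.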
Procedia PDF Downloads 2361244 Visual Inspection of Road Conditions Using Deep Convolutional Neural Networks
Authors: Christos Theoharatos, Dimitris Tsourounis, Spiros Oikonomou, Andreas Makedonas
Abstract:
This paper focuses on the problem of visually inspecting and recognizing the road conditions in front of moving vehicles, targeting automotive scenarios. The goal of road inspection is to identify whether the road is slippery or not, as well as to detect possible anomalies on the road surface like potholes or body bumps/humps. Our work is based on an artificial intelligence methodology for real-time monitoring of road conditions in autonomous driving scenarios, using state-of-the-art deep convolutional neural network (CNN) techniques. Initially, the road and ego lane are segmented within the field of view of a camera integrated into the front part of the vehicle. A novel classification CNN is utilized to discriminate between plain and slippery road textures (e.g., wet, snow, etc.). Simultaneously, a robust detection CNN identifies severe surface anomalies within the ego lane, such as potholes and speed bumps/humps, at a distance of 5 to 25 meters. The overall methodology is illustrated as an integrated application that can be incorporated into complete Advanced Driver-Assistance Systems (ADAS) providing a full range of functionalities. The proposed techniques achieve state-of-the-art detection and classification results and real-time performance running on AI accelerator devices like Intel’s Myriad 2/X Vision Processing Unit (VPU).Keywords: deep learning, convolutional neural networks, road condition classification, embedded systems
Procedia PDF Downloads 1341243 Development of Latent Fingerprints on Non-Porous Surfaces Recovered from Fresh and Sea Water
Authors: A. Somaya Madkour, B. Abeer Sheta, C. Fatma Badr El Dine, D. Yasser Elwakeel, E. Nermine AbdAllah
Abstract:
Criminal offenders have a fundamental goal of leaving no traces at the crime scene. Some may suppose that items recovered underwater have no forensic value and therefore try to destroy traces by throwing items into water, where the traces are subjected to destructive environmental effects. This can represent a challenge for forensic experts investigating finger marks. Accordingly, the present study was conducted to determine the optimal method for developing latent fingerprints on non-porous surfaces submerged in aquatic environments for different time intervals. The two factors analyzed in this study were the nature of the aquatic environment and the length of submersion time. In addition, the quality of the developed finger marks, depending on the method used, was also assessed. Latent fingerprints were deposited on metallic, plastic, and glass objects and submerged in fresh or sea water for one, two, and ten days. After recovery, the items were processed by cyanoacrylate fuming, black powder, and small particle reagent, and the prints were examined. Each print was evaluated according to a fingerprint quality assessment scale. The present study demonstrated that the duration of submersion affects the quality of finger marks: the longer the duration, the worse the quality. The best visualization results were achieved using cyanoacrylate, in both fresh and sea water. This study also revealed that exposure to sea water had a more destructive influence on the quality of the detected finger marks.Keywords: fingerprints, fresh water, sea, non-porous
Procedia PDF Downloads 4551242 Review of Microstructure, Mechanical and Corrosion Behavior of Aluminum Matrix Composite Reinforced with Agro/Industrial Waste Fabricated by Stir Casting Process
Authors: Mehari Kahsay, Krishna Murthy Kyathegowda, Temesgen Berhanu
Abstract:
Aluminum matrix composites have been a focus of research and industrial use for the last few decades, especially in applications not requiring extreme loading or thermal conditions. Their relatively low cost, simple processing, and attractive properties are the reasons for their widespread use in the manufacturing of automobiles, aircraft, military equipment, and sports goods. In this article, the microstructure, mechanical, and corrosion behaviors of aluminum matrix composites are reviewed, focusing on the stir-casting fabrication process and the use of agro/industrial waste reinforcement particles. The reviewed results show that mechanical properties like tensile strength, ultimate tensile strength, hardness, percentage elongation, impact strength, and fracture toughness are highly dependent on the amount, kind, and size of the reinforcing particles. Additionally, uniform distribution and wettability of the reinforcement particles and the porosity level of the resulting composite also affect the mechanical and corrosion behaviors of aluminum matrix composites. The two-step stir-casting process resulted in better wetting characteristics, a lower porosity level, and a uniform distribution of particles when the process parameters were handled properly. On the other hand, the inconsistent and contradictory results on the corrosion behavior of monolithic and hybrid aluminum matrix composites need further study.Keywords: microstructure, mechanical behavior, corrosion, aluminum matrix composite
Procedia PDF Downloads 731241 Artificial Neural Network in Ultra-High Precision Grinding of Borosilicate-Crown Glass
Authors: Goodness Onwuka, Khaled Abou-El-Hossein
Abstract:
Borosilicate-crown (BK7) glass has found broad application in the optics and automotive industries, and the growing demand for nanometric surface finishes makes it paramount to optimize the parameters influencing the surface roughness of this precision lens material. The research was carried out on a 4-axis Nanoform 250 precision lathe with an ultra-high precision grinding spindle. The experiment varied the machining parameters of feed rate, wheel speed, and depth of cut at three levels in different combinations using a Box-Behnken design of experiments, and the resulting surface roughness values were measured using a Taylor Hobson Dimension XL optical profiler. Acoustic emission monitoring was applied at a high sampling rate to monitor the machining process, while further signal processing and feature extraction methods were implemented to generate the input to a neural network algorithm. This paper highlights the training and development of a back-propagation neural network prediction algorithm through careful selection of parameters, and the results show better prediction accuracy compared to a previously developed response surface model with very similar machining parameters. Hence, artificial neural network algorithms provide better surface roughness prediction accuracy in the ultra-high precision grinding of BK7 glass.Keywords: acoustic emission technique, artificial neural network, surface roughness, ultra-high precision grinding
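A back-propagation network for this kind of roughness prediction can be sketched in plain NumPy; the data below are synthetic stand-ins (normalized feed rate, wheel speed, and depth of cut mapped to a made-up roughness response), not the study's Box-Behnken measurements or acoustic-emission features:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in data: 30 parameter combinations, 3 inputs each.
X = rng.uniform(0, 1, (30, 3))
y = (0.4 * X[:, 0] - 0.2 * X[:, 1] + 0.3 * X[:, 2]
     + 0.05 * rng.standard_normal(30)).reshape(-1, 1)

# One-hidden-layer network trained by plain back-propagation.
W1, b1 = rng.standard_normal((3, 8)) * 0.5, np.zeros(8)
W2, b2 = rng.standard_normal((8, 1)) * 0.5, np.zeros(1)
lr = 0.05

for epoch in range(3000):
    h = np.tanh(X @ W1 + b1)        # forward pass, hidden activations
    pred = h @ W2 + b2
    err = pred - y
    # backward pass: gradients of mean-squared error
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)  # tanh derivative
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = float((err ** 2).mean())
print(f"final training MSE: {mse:.4f}")
```

In the study's setting, the inputs would be features extracted from the acoustic-emission signal alongside the machining parameters.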
Procedia PDF Downloads 3051240 An Automated Approach to the Nozzle Configuration of Polycrystalline Diamond Compact Drill Bits for Effective Cuttings Removal
Authors: R. Suresh, Pavan Kumar Nimmagadda, Ming Zo Tan, Shane Hart, Sharp Ugwuocha
Abstract:
Polycrystalline diamond compact (PDC) drill bits are extensively used in the oil and gas industry as well as the mining industry. Industry engineers continually improve upon PDC drill bit designs and hydraulic conditions, and optimized injection nozzles play a key role in improving the drilling performance and efficiency of these ever-changing PDC drill bits. In the first part of this study, computational fluid dynamics (CFD) modelling is performed to investigate the hydrodynamic characteristics of drilling fluid flow around the PDC drill bit. The open-source CFD software OpenFOAM simulates the flow around the drill bit based on field input data. A specifically developed console application integrates the entire CFD process, including domain extraction, meshing, solving the governing equations, and post-processing. The results from the OpenFOAM solver are then compared with those of the ANSYS Fluent software, and the data from both programs agree. The second part of the paper describes a parametric study of the PDC drill bit nozzle to determine the effect of parameters such as the number of nozzles, nozzle velocity, and nozzle radial position and orientation on the flow field characteristics and bit washing patterns. After analyzing a series of nozzle configurations, the best configuration is identified and recommendations are made for modifying the PDC bit design.Keywords: ANSYS Fluent, computational fluid dynamics, nozzle configuration, OpenFOAM, PDC drill bit
Procedia PDF Downloads 4201239 A Recommender System for Job Seekers to Show up Companies Based on Their Psychometric Preferences and Company Sentiment Scores
Authors: A. Ashraff
Abstract:
The increasing importance of the web as a medium for electronic and business transactions has been a driving force for the introduction and implementation of recommender systems. Recommender systems play a major role in processing and analyzing thousands of data rows or reviews and help humans make a purchase decision about a product or service. They can also predict whether a particular user would rate a product or service, based on the user’s behavioral profile. At present, recommender systems are used extensively in almost every domain and are said to be ubiquitous; in the field of recruitment, however, they are not being utilized extensively. Recent statistics show an increase in staff turnover, which has negatively impacted both organizations and employees, with reasons including company culture, working flexibility (work-from-home opportunities), lack of learning advancement, and pay scale. Further investigation revealed a lack of guidance or support to help a job seeker find the company that will suit him or her best; although information about companies is available, job seekers cannot read all the reviews themselves and reach an analytical decision. In this paper, we propose an approach that scores the available review data on IT companies based on user review sentiments, gathers information on job seekers including their psychometric evaluations, and then presents the job seeker with an indication of which company is most suitable. The theoretical approach, the algorithmic approach, and the importance of such a system are discussed in this paper.Keywords: psychometric tests, recommender systems, sentiment analysis, hybrid recommender systems
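The proposed matching idea, blending company review-sentiment scores with a psychometric fit, might be sketched as follows; the company names, data shapes, trait vectors, and blending weight are all assumptions for illustration:

```python
# Assumed data shapes: each company has an aggregate review-sentiment
# score in [0, 1] and a profile vector on the same traits that the
# seeker's psychometric test measures.
companies = {
    "AcmeSoft":  {"sentiment": 0.82, "traits": [0.9, 0.3, 0.7]},
    "ByteWorks": {"sentiment": 0.64, "traits": [0.4, 0.8, 0.5]},
    "CodeCo":    {"sentiment": 0.91, "traits": [0.6, 0.6, 0.9]},
}
seeker_traits = [0.8, 0.4, 0.8]  # hypothetical psychometric results

def match_score(seeker, company, alpha=0.5):
    """Blend trait similarity (1 minus mean absolute difference) with
    the company's sentiment score; alpha weights the two signals."""
    sim = 1 - sum(abs(s - c) for s, c in
                  zip(seeker, company["traits"])) / len(seeker)
    return alpha * sim + (1 - alpha) * company["sentiment"]

ranked = sorted(companies,
                key=lambda name: match_score(seeker_traits, companies[name]),
                reverse=True)
print(ranked[0])  # high sentiment plus close trait match ranks first
```

A hybrid system like the one proposed would replace the toy sentiment numbers with scores from a sentiment-analysis model over real company reviews.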
Procedia PDF Downloads 106