Search results for: rRNA processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3864

1074 Simulation and Fabrication of Plasmonic Lens for Bacteria Detection

Authors: Sangwoo Oh, Jaewoo Kim, Dongmin Seo, Jaewon Park, Yongha Hwang, Sungkyu Seo

Abstract:

Plasmonics has been regarded as one of the most powerful bio-sensing modalities for evaluating bio-molecular interactions in real time. However, most plasmonic sensing methods rely on labeling with metallic nanoparticles, e.g., gold or silver, as optical modulation markers, which are non-recyclable and expensive. Plasmonic modulation can usually be achieved through various nanostructures, e.g., nano-hole arrays. Among these structures, the plasmonic lens has been regarded as unique due to its light-focusing characteristics. In this study, we introduce a custom-designed plasmonic lens array for bio-sensing, which was simulated by the finite-difference time-domain (FDTD) method and fabricated by a top-down approach. We performed FDTD simulations of various plasmonic lens designs for bacteria sensing, i.e., Salmonella and Hominis, and optimized the design parameters of the lens, i.e., radius, shape, and material. The simulation results showed a change in the peak intensity value with the introduction of each bacterium and antibody, e.g., a peak intensity of 1.8711 a.u. with the introduction of an antibody layer 15 nm thick. For Salmonella, the peak intensity changed from 1.8711 a.u. to 2.3654 a.u., and for Hominis, from 1.8711 a.u. to 3.2355 a.u. This significant shift in intensity due to the interaction between bacteria and antibody demonstrates the promising sensing capability of the plasmonic lens. With batch processing and bulk production of this nanoscale design, the cost of biological sensing can be significantly reduced, holding great promise for clinical diagnostics and bio-defense.
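As a rough illustration of the numerical method named above: FDTD discretizes Maxwell's equations on a staggered (Yee) grid and leapfrogs the E and H fields in time. The following one-dimensional vacuum sketch is a toy, not the authors' 3-D plasmonic solver; the grid size, source position, and Courant number are arbitrary choices.

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=250):
    """Leapfrog Yee updates for Ez/Hy in 1-D vacuum, Courant number 0.5."""
    ez = np.zeros(n_cells)
    hy = np.zeros(n_cells)
    for t in range(n_steps):
        # H update uses the spatial difference of E (half-step offset grid)
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
        # E update uses the spatial difference of H
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])
        # soft Gaussian source injected at one cell
        ez[n_cells // 4] += np.exp(-((t - 30) / 10.0) ** 2)
    return ez
```

A full plasmonic-lens simulation would extend this to three dimensions, add the metal's dispersive permittivity, and record the field intensity at the focal plane to obtain peak-intensity values like those quoted above.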

Keywords: plasmonic lens, FDTD, fabrication, bacteria sensor, salmonella, hominis

Procedia PDF Downloads 270
1073 Integrated Two Stage Processing of Biomass Conversion to Hydroxymethylfurfural Esters Using Ionic Liquid as Green Solvent and Catalyst: Synthesis of Mono Esters

Authors: Komal Kumar, Sreedevi Upadhyayula

Abstract:

In this study, a two-stage process was established for the synthesis of HMF esters using an ionic liquid acid catalyst. Ionic liquid catalysts with different strengths of Brønsted acidity were prepared in the laboratory and characterized using ¹H NMR, FT-IR, and ¹³C NMR spectroscopy. A solid acid catalyst was prepared from the ionic liquid catalyst using an immobilization method. The acidity of the synthesized catalysts was measured using the Hammett function and titration methods. Catalytic performance was evaluated for the conversion of biomass to 5-hydroxymethylfurfural (5-HMF) and levulinic acid (LA) in a methyl isobutyl ketone (MIBK)-water biphasic system. Good yields of 5-HMF and LA were found at different MIBK:water compositions; at a MIBK:water ratio of 10:1, a good yield of 5-HMF was observed at 150 ˚C. Upgrading of 5-HMF into monoesters by reaction with biomass-derived monoacids was then performed. The ionic liquid catalyst with the -SO₃H functional group was found to be more efficient than the solid acid catalyst for both the esterification reaction and the biomass conversion. A good yield of 5-HMF esters with high 5-HMF conversion was obtained at 105 ˚C using the most active catalyst. In this process, stage A was the hydrothermal conversion of cellulose and monomer into 5-HMF and LA using the acid catalyst, and stage B was the subsequent esterification using a similar acid catalyst. All the 5-HMF monoesters synthesized here can be used in the chemical and pharmaceutical industries and as cross-linkers for adhesives or coatings. A theoretical density functional theory (DFT) study was performed using the Gaussian 09 program to optimize the ionic liquid structure and find the minimum-energy configuration of the catalyst.

Keywords: biomass conversion, 5-HMF, Ionic liquid, HMF ester

Procedia PDF Downloads 251
1072 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy images are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and in AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python language and the OpenCV library, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a mitochondrial fluorescence dataset. Ground truth labels generated using Labkit were also used to evaluate the performance of the detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
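The three-stage pipeline can be sketched as follows. This is a minimal illustration in plain NumPy/SciPy rather than the authors' OpenCV implementation: Otsu's method stands in for the binarization stage, and connected-component labeling with an area filter stands in for coarse-to-fine segmentation (the CLAHE pre-processing step is omitted).

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(gray):
    """Binarization stage: the intensity maximizing between-class variance."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = gray.size
    mu_total = np.dot(np.arange(256), hist) / total
    best_t, best_var = 0, 0.0
    cum_w = cum_mu = 0.0
    for t in range(256):
        cum_w += hist[t] / total
        cum_mu += t * hist[t] / total
        if cum_w <= 0.0 or cum_w >= 1.0:
            continue
        mu0 = cum_mu / cum_w                       # background class mean
        mu1 = (mu_total - cum_mu) / (1.0 - cum_w)  # foreground class mean
        var = cum_w * (1.0 - cum_w) * (mu0 - mu1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def detect_mitochondria(gray, min_area=5):
    """Coarse-to-fine stage: label connected blobs, keep those above min_area."""
    binary = gray > otsu_threshold(gray)
    labels, n = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=range(1, n + 1))
    return [i + 1 for i, a in enumerate(areas) if a >= min_area]
```

In practice the area filter would be one of several shape descriptors (eccentricity, solidity) used to separate mitochondria from background debris.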

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 357
1071 Influence of Travel Time Reliability on Elderly Drivers' Crash Severity

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

Although older drivers (defined as those aged 65 and above) are less involved in speeding, alcohol use, and night driving, they are more vulnerable to severe crashes, with frailty and medical complications being the major contributing factors. Several studies have evaluated the factors contributing to crash severity. However, few studies have established the impact of travel time reliability (TTR) on road safety; in particular, the impact of TTR on senior adults, who face several challenges including hearing difficulties, declining processing skills, and cognitive problems while driving, is not well established. Therefore, this study focuses on determining the possible impacts of TTR on traffic safety with a focus on elderly drivers. Historical travel speed data from freeway links in the study area were used to calculate travel time and the associated TTR metrics, that is, the planning time index, the buffer index, the standard deviation of travel time, and the probability of congestion. Four years of data on crashes occurring on these freeway links were acquired. A binary logit model, estimated using the Markov chain Monte Carlo (MCMC) sampling technique, was used to evaluate variables that could influence elderly crash severity. Preliminary results of the analysis suggest that TTR is statistically significant in affecting the severity of a crash involving an elderly driver: a one-unit increase in the probability of congestion reduces the likelihood of a severe elderly crash by nearly 22%. These findings will enhance the understanding of TTR and its impact on elderly crash severity.
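The estimation approach can be sketched with a generic random-walk Metropolis sampler for a Bayesian binary logit. This is an illustrative sketch, not the authors' exact specification: the N(0, 10²) prior, step size, and chain length are assumptions.

```python
import numpy as np

def log_posterior(beta, X, y, prior_sd=10.0):
    """Binary logit log-likelihood plus an independent N(0, prior_sd^2) prior."""
    logits = X @ beta
    loglik = np.sum(y * logits - np.logaddexp(0.0, logits))  # stable log(1+e^x)
    return loglik - np.sum(beta ** 2) / (2.0 * prior_sd ** 2)

def metropolis_logit(X, y, n_iter=5000, step=0.1, seed=0):
    """Random-walk Metropolis: propose beta' ~ N(beta, step^2 I), accept by ratio."""
    rng = np.random.default_rng(seed)
    beta = np.zeros(X.shape[1])
    lp = log_posterior(beta, X, y)
    chain = np.empty((n_iter, beta.size))
    for i in range(n_iter):
        proposal = beta + step * rng.standard_normal(beta.size)
        lp_prop = log_posterior(proposal, X, y)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance rule
            beta, lp = proposal, lp_prop
        chain[i] = beta
    return chain
```

The sign of a covariate's posterior mean (e.g., for the probability of congestion) indicates whether it raises or lowers the odds of a severe crash; a marginal-effect transform of the coefficient then yields percentage statements like the 22% figure above.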

Keywords: highway safety, travel time reliability, elderly drivers, traffic modeling

Procedia PDF Downloads 493
1070 Application of New Sprouted Wheat Brine for Delicatessen Products From Horse Meat, Beef and Pork

Authors: Gulmira Kenenbay, Urishbay Chomanov, Aruzhan Shoman, Rabiga Kassimbek

Abstract:

The main task of the meat-processing industry is the production of meat products, the principal source of animal protein sustaining the human body, in the required volumes, at high quality, and in a diverse assortment. Providing the population with high-quality food products that are biologically complete, balanced in the composition of basic nutrients, and enriched with targeted physiologically active components is one of the highest-priority scientific and technical problems to be solved. In this regard, the formulation of a new sprouted wheat brine for meat delicacies from horse meat, beef, and pork has been developed. The new brine contains flavor-aromatic ingredients, sprouted wheat juice, and vegetable juice. The viscosity of horse meat, beef, and pork was studied during massaging. Thermodynamic indices, water activity, and the moisture binding energy of horse meat, beef, and pork treated with the new brine were investigated. A recipe for meat products with vegetable additives was developed, organoleptic evaluation of the meat products was carried out, and their physicochemical parameters were determined. Analysis of the obtained data shows that the values of the water activity index (aw) and the moisture binding energy in the experimental samples of meat products are higher than in the control samples. The investigations established that, with increasing water activity and moisture binding energy, the tenderness of the ready meat delicacies increases with the use of the new brine.

Keywords: compounding, functional products, delicatessen products, brine, vegetable additives

Procedia PDF Downloads 178
1069 Mapping of Geological Structures Using Aerial Photography

Authors: Ankit Sharma, Mudit Sachan, Anurag Prakash

Abstract:

Rapid growth in data acquisition technologies through drones has led to advances and interest in collecting high-resolution images of geological fields. While such platforms are advantageous in capturing high volumes of data in short flights, a number of challenges have to be overcome for efficient analysis of these data, especially during data acquisition, image interpretation, and processing. We introduce a method that allows effective mapping of geological fields using photogrammetric data of surfaces, drainage areas, water bodies, etc., captured by airborne vehicles such as UAVs. Satellite images are not used because of their inadequate resolution, acquisition dates that may be a year old or more, limited availability, the difficulty of capturing the exact scene required, and the lack of night vision. The method combines advanced automated image interpretation technology with human data interaction to model structures. First, geological structures are detected from the primary photographic dataset, and the equivalent three-dimensional structures are then identified using a digital elevation model (DEM); dip and dip direction can be calculated from this information. The structural map is generated by a specified methodology: choosing the appropriate camera, the camera's mounting system, and the UAV design (based on the area and application); addressing challenges in airborne systems such as errors in image orientation, payload limits, mosaicking, and georeferencing; registering the different images; and finally applying the DEM. The paper shows the potential of using our method for accurate and efficient modeling of geological structures, particularly for remote, inaccessible, and hazardous sites.
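The dip calculation mentioned above is the classical three-point problem: fit a plane through three DEM points and read the dip angle and downdip azimuth off its normal. A minimal sketch, assuming (east, north, up) coordinates:

```python
import numpy as np

def dip_and_direction(p1, p2, p3):
    """Dip angle (deg) and dip direction (azimuth in deg from north)
    of the plane through three (east, north, up) points."""
    n = np.cross(np.subtract(p2, p1), np.subtract(p3, p1)).astype(float)
    if n[2] < 0:
        n = -n  # make the normal point upward
    dip = np.degrees(np.arctan2(np.hypot(n[0], n[1]), n[2]))
    # the horizontal component of the upward normal points downdip
    azimuth = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, azimuth
```

For a plane dropping 1 m eastward per metre, e.g. the points (0, 0, 0), (0, 1, 0), and (1, 0, -1), this gives a 45° dip toward azimuth 090°.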

Keywords: digital elevation model, mapping, photogrammetric data analysis, geological structures

Procedia PDF Downloads 686
1068 The Weavability of Waste Plants and Their Application in Fashion and Textile Design

Authors: Jichi Wu

Abstract:

The dwindling of resources requires more sustainable design. New technology could bring new materials and processing techniques to the fashion industry and push it toward a more sustainable future. This paper therefore explores cutting-edge research on the life cycle of closed-loop products and aims to find innovative ways to recycle and upcycle. To this end, the author investigated how low-utilization plants and leftover fiber could be turned into ecological textiles for fashion. By examining the physical and chemical properties (cellulose content, fiber form) of ecological textiles to explore their wearability, this paper analyzes the prospects of bio-fabrics (weavable plants) in body-oriented fashion design and their potential in sustainable fashion and textile design. By extracting cellulose from nine different types or sections of plants, the author intends to find an appropriate method (such as ion solution extraction) to maximally increase the weavability of plants, so that raw materials can be changed into fabrics more effectively. All first-hand experimental data were carefully collected and then analyzed under the guidance of related theories, and the results of the analysis were recorded in detail and presented in an understandable way. Various research methods were adopted throughout this project, including field trips and experiments to make comparisons and recycle materials; cross-discipline cooperation was also conducted to draw on related knowledge and theories. On this basis, experimental data were collected, analyzed, and interpreted into descriptive and visualized results. From these conclusions, it is possible to apply weavable plant fibres to develop new textiles and fashion.

Keywords: wearable bio-textile, sustainability, economy, ecology, technology, weavability, fashion design

Procedia PDF Downloads 148
1067 The Effect of Main Factors on Forces during FSJ Processing of AA2024 Aluminum

Authors: Dunwen Zuo, Yongfang Deng, Bo Song

Abstract:

An attempt is made here to measure the forces in three directions, under conditions of different feed speeds, different tool tilt angles, and with or without a pin on the tool, by using an octagonal-ring dynamometer in the AA2024 aluminum FSJ (friction stir joining) process, and to investigate how four main factors influence forces in the FSJ process. It is found that a high feed speed leads to a small feed force and a small lateral force, but to a large feed force in the stable joining stage of the process. As the rotational speed increases, the time required for the axial force to drop from its maximum to its minimum increases in the push-up process. In the stable joining stage, the rotational speed has little effect on the feed force, while a large rotational speed leads to small lateral and axial forces. The maximum axial force increases as the tool tilt angle increases in the downward movement stage, and at the moment feeding starts, the amplitude of the axial force increase grows with the tilt angle. In the stable joining stage, with an increasing tool tilt angle, the axial force increases, the lateral force decreases, and the feed force remains almost unchanged. A tool with a pin decreases the axial force in the downward movement stage; in the stable joining stage, using the tool with a pin rather than the tool without one increases the feed force and lateral force but reduces the axial force.

Keywords: FSJ, force factor, AA2024 aluminum, friction stir joining

Procedia PDF Downloads 491
1066 Memory Based Reinforcement Learning with Transformers for Long Horizon Timescales and Continuous Action Spaces

Authors: Shweta Singh, Sudaman Katti

Abstract:

The most well-known sequence models make use of complex recurrent neural networks in an encoder-decoder configuration. The model used in this research instead makes use of a transformer, which is based purely on a self-attention mechanism, without relying on recurrence at all. More specifically, encoders and decoders that use self-attention and operate on a memory are employed. In this research work, results were obtained for various 3D visual and non-visual reinforcement learning tasks designed in the Unity software. Convolutional neural networks, more specifically the Nature CNN architecture, are used for input processing in the visual tasks, and a comparison with the standard long short-term memory (LSTM) architecture is performed both for visual tasks based on CNNs and for non-visual tasks based on coordinate inputs. This work combines the transformer architecture with the proximal policy optimization technique, used widely in reinforcement learning for stability and better policy updates during training, especially for the continuous action spaces used here. Certain tasks in this paper are long-horizon tasks that carry on for a long duration and require extensive use of memory-based functionalities such as the storage of experiences and the choice of appropriate actions based on recall. The transformer, which makes use of memory and a self-attention mechanism in an encoder-decoder configuration, proved to have better performance than the LSTM in terms of exploration and rewards achieved. Such memory-based architectures can be used extensively in the fields of cognitive robotics and reinforcement learning.
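At the core of the transformer memory described above is scaled dot-product attention: the current observation (query) is compared against stored keys to form a weighted read of stored values. A minimal NumPy sketch of that single operation (not the full encoder-decoder PPO agent):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V — a weighted read over the memory V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # numerically stable softmax over each query's scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights
```

Here K and V would hold encoded past experiences and Q the current state embedding; the attention output is what the policy conditions on, in place of an LSTM hidden state.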

Keywords: convolutional neural networks, reinforcement learning, self-attention, transformers, unity

Procedia PDF Downloads 136
1065 Research on Level Adjusting Mechanism System of Large Space Environment Simulator

Authors: Han Xiao, Zhang Lei, Huang Hai, Lv Shizeng

Abstract:

A space environment simulator is a device for spacecraft testing. The KM8 large space environment simulator built in Tianjin Space City is the largest as well as the most advanced space environment simulator in China. A large deviation in spacecraft level will lead to abnormal operation of the spacecraft's thermal control devices during the thermal vacuum test. To avoid thermal vacuum test failure, a level adjusting mechanism system was developed in the KM8 simulator as one of its most important subsystems. According to the level adjusting requirements of spacecraft thermal vacuum tests, a four-fulcrum adjusting model was established. Using data collected from level instruments and displacement sensors, stepping motors controlled by a PLC drive the four supporting legs in simultaneous movement. In addition, a PID algorithm is used to control the temperature of the supporting legs and level instruments, which operate for long periods in the cold, dark vacuum environment of the KM8 simulator during thermal vacuum tests. Based on these methods, the data acquisition and processing, analysis and calculation, real-time adjustment, and fault alarming of the level adjusting mechanism system were implemented. The level adjusting accuracy reaches 1 mm/m, and the carrying capacity is 20 tons. Debugging showed that the level adjusting mechanism system of the KM8 simulator can meet the thermal vacuum test requirements of the new generation of spacecraft. The performance and technical indicators of this system, which provides important support for spacecraft development in China, are ahead of similar equipment elsewhere in the world.
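The temperature loop mentioned above follows the textbook discrete PID form. A minimal sketch, in which the gains, sample period, and plant are illustrative assumptions rather than the simulator's actual tuning:

```python
class PID:
    """Discrete PID controller with a fixed sample period dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt                  # accumulated error
        derivative = (error - self.prev_error) / self.dt  # error slope
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Driving a simple first-order thermal plant with this loop brings the measured temperature to the setpoint; the integral term removes the steady-state offset that a proportional-only controller would leave.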

Keywords: space environment simulator, thermal vacuum test, level adjusting, spacecraft, parallel mechanism

Procedia PDF Downloads 247
1064 Potential Use of Leaching Gravel as a Raw Material in the Preparation of Geo Polymeric Material as an Alternative to Conventional Cement Materials

Authors: Arturo Reyes Roman, Daniza Castillo Godoy, Francisca Balarezo Olivares, Francisco Arriagada Castro, Miguel Maulen Tapia

Abstract:

Mining waste-based geopolymers are a sustainable alternative to conventional cement materials due to their contribution to the valorization of mining wastes and to new construction materials with reduced environmental footprints. The objective of this study was to determine the potential of leaching gravel (LG) from hydrometallurgical copper processing to be used as a raw material in the manufacture of geopolymer. NaOH, Na2SiO3 (modulus 1.5), and LG were mixed, wetted with an appropriate amount of tap water, and stirred until a homogeneous paste was obtained; a liquid/solid ratio of 0.3 was used for preparing the mixtures. The paste was then cast in 50 mm cubic moulds for the determination of compressive strength. The samples were left to dry for 24 h at room temperature and unmoulded before analysis after 28 days of curing. The compressive test was conducted in a compression machine (15/300 kN). According to laser diffraction spectroscopy (LDS) analysis, 90% of the LG particles were below 500 μm. X-ray diffraction (XRD) analysis identified crystalline phases of albite (30%), quartz (16%), anorthite (16%), and phillipsite (14%). X-ray fluorescence (XRF) determinations showed mainly 55% SiO2, 13% Al2O3, and 9% CaO. ICP-OES concentrations of Fe, Ca, Cu, Al, As, V, Zn, Mo, and Ni were 49.545; 24.735; 6.172; 14.152; 239.5; 129.6; 41.1; 15.1; and 13.1 mg kg⁻¹, respectively. The geopolymer samples showed compressive strengths ranging between 2 and 10 MPa. Compared with the raw material, the amorphous fraction of the geopolymer was 35%, whereas the crystalline percentages of the main mineral phases decreased. Further studies are needed to find the optimal combination of materials to produce a more resistant and environmentally safe geopolymer; in particular, a compressive strength higher than 15 MPa is necessary for use as a construction unit such as bricks.

Keywords: mining waste, geopolymer, construction material, alkaline activation

Procedia PDF Downloads 94
1063 Experimental Monitoring of the Parameters of the Ionosphere in the Local Area Using the Results of Multifrequency GNSS-Measurements

Authors: Andrey Kupriyanov

Abstract:

In recent years, much attention has been paid worldwide to the problems of ionospheric disturbances and their influence on the signals of global navigation satellite systems (GNSS). This is due to the increase in solar activity, the expansion of the scope of GNSS, the emergence of new satellite systems, the introduction of new frequencies, and many other factors. The influence of the Earth's ionosphere on the propagation of radio signals is an important factor in many applied fields of science and technology. The paper considers the application of the transionospheric sounding method, using measurements of signals from global navigation satellite systems, to determine the TEC distribution and the scintillations of the ionospheric layers. To calculate these parameters, the International Reference Ionosphere (IRI) model, refined for the local area, is used. The organization of operational monitoring of ionospheric parameters is analyzed using several NovAtel GPStation6 base stations, which allow primary processing of GNSS measurement data, calculation of TEC and detection of scintillation events, modeling of the ionosphere using the obtained data, data storage, and ionospheric correction of measurements. As a result of the study, it was shown that the transionospheric sounding method can reconstruct the altitude distribution of electron concentration over different altitude ranges and provide operational information about the ionosphere, which is necessary for solving a number of practical problems. The use of multi-frequency, multi-system GNSS equipment and special software makes it possible to achieve the specified accuracy and volume of measurements.
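The TEC calculation rests on the standard dual-frequency relation: the ionospheric group delay scales as 40.3·TEC/f², so the pseudorange difference between two carriers isolates TEC. A sketch, assuming the GPS L1/L2 frequencies; real processing would also handle inter-frequency biases and cycle slips:

```python
def slant_tec(p1, p2, f1=1575.42e6, f2=1227.60e6):
    """Slant TEC in TECU from pseudoranges p1, p2 (metres) on frequencies f1, f2.
    From P_i = rho + 40.3*TEC/f_i^2:
        TEC = (P2 - P1) * f1^2 * f2^2 / (40.3 * (f1^2 - f2^2))."""
    tec = (f1**2 * f2**2) / (40.3 * (f1**2 - f2**2)) * (p2 - p1)
    return tec / 1e16  # 1 TECU = 1e16 electrons per square metre
```

For scale, an L1/L2 range difference of roughly 5.25 m corresponds to about 50 TECU.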

Keywords: global navigation satellite systems (GNSS), GPstation6, international reference ionosphere (IRI), ionosphere, scintillations, total electron content (TEC)

Procedia PDF Downloads 181
1062 Towards Binder-Free and Self Supporting Flexible Supercapacitor from Carbon Nano-Onions and Their Composite with CuO Nanoparticles

Authors: Debananda Mohapatra, Subramanya Badrayyana, Smrutiranjan Parida

Abstract:

Recognizing the upcoming era of carbon nanostructures and their revolutionary applications, we investigated the formation and supercapacitor application of highly pure, hydrophilic carbon nano-onions (CNOs) produced by an economical one-step flame-synthesis procedure. The facile and scalable method uses an easily available organic carbon source, clarified butter, and requires no catalyst, sophisticated instrumentation, high vacuum, or post-processing purification. The active material was conformally coated onto a locally available cotton wipe by a "sonicating and drying" process to obtain novel, lightweight, inexpensive, flexible, binder-free electrodes with strong adhesion between the nanoparticles and the porous wipe. This electrode, with CNOs as the active material, delivers a specific capacitance of 102.16 F/g, an energy density of 14.18 Wh/kg, and a power density of 2448 W/kg, which are the highest values reported so far in a symmetrical two-electrode cell configuration with 1 M Na2SO4 as the electrolyte. Incorporation of CuO nanoparticles into these functionalized CNOs by a one-step hydrothermal method raises the specific capacitance to 420 F/g, with deliverable energy and power densities of 58.33 Wh/kg and 4228 W/kg, respectively. Both the free-standing CNO and the CNO-CuO composite electrodes showed excellent cyclic performance and stability, retaining 95% and 90% of the initial capacitance, respectively, even after 5000 charge-discharge cycles at a current density of 5 A/g. This work presents a new platform for high-performance supercapacitors for the next generation of wearable electronic devices.
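The reported figures follow the standard galvanostatic discharge relations; a small consistency sketch (the ~1.0 V voltage window assumed here is typical of aqueous Na2SO4 but is not stated in the abstract):

```python
def specific_capacitance(current_a, dt_s, dv_v, mass_g):
    """C_sp = I * dt / (m * dV), in F/g, from a galvanostatic discharge."""
    return current_a * dt_s / (mass_g * dv_v)

def energy_density_wh_per_kg(c_sp, dv_v):
    """E = (1/2) C V^2, converted from J/g to Wh/kg (x1000 g/kg, /3600 s/h)."""
    return 0.5 * c_sp * dv_v**2 * 1000.0 / 3600.0

def power_density_w_per_kg(e_wh_per_kg, dt_s):
    """Average power over the discharge: P = E / t."""
    return e_wh_per_kg * 3600.0 / dt_s
```

With the reported 102.16 F/g and an assumed 1.0 V window, the energy density evaluates to about 14.2 Wh/kg, consistent with the 14.18 Wh/kg quoted above.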

Keywords: binder-free, flame synthesis, flexible, carbon nano-onion

Procedia PDF Downloads 197
1061 To Design an Architectural Model for On-Shore Oil Monitoring Using Wireless Sensor Network System

Authors: Saurabh Shukla, G. N. Pandey

Abstract:

In recent times, oil exploration and monitoring in on-shore areas have gained much importance, considering that oil accounts for 62 percent of India's total imports. Thus, an architectural model such as a wireless sensor network to monitor on-shore deep oil wells is being developed to obtain better estimates of oil prospects. The problem faced nowadays is that very few untapped oil areas are left; countries like India do not have large areas and resources for oil, and the increase in oil prices has further aggravated the problem. The relative simplicity, small size, and affordable cost of wireless sensor nodes permit dense deployment in on-shore places for monitoring oil wells, and deployment of a wireless sensor network over large areas is very cost-effective. The objective of this system is to send real-time oil monitoring information to the regulatory and welfare authorities so that suitable action can be taken. The system architecture is composed of a sensor network, a processing/transmission unit, and a server, and can remotely monitor real-time oil exploration and monitoring conditions in the identified areas. Such systems are wireless, have scarce power, operate in real time, and use sensors and actuators as interfaces; their sets of resources change dynamically, their aggregate behaviour is important, and location is critical. In this system, communication takes place between the server and the remotely placed sensors, and the server provides the real-time oil exploration and monitoring conditions to the welfare authorities.

Keywords: sensor, wireless sensor network, oil, on-shore level

Procedia PDF Downloads 446
1060 A Review on Valorisation of Chicken Feathers: Current Status and Future Prospects

Authors: Tamrat Tesfaye, Bruce Sithole, Deresh Ramjugernath

Abstract:

Worldwide, the poultry-processing industry generates large quantities of feather by-products amounting to 40 billion kilograms annually. The feathers are considered waste, although small amounts are often processed into valuable products such as feather meal and fertilizers; the remainder is disposed of by incineration or by burial in controlled landfills. Improper disposal of these biological wastes contributes to environmental damage and the transmission of diseases. Economic pressures, environmental pressures, increasing interest in using renewable and sustainable raw materials, and the need to decrease reliance on non-renewable petroleum resources behove the industry to find better ways of dealing with waste feathers. A closer look at the structure and composition of feathers shows that the whole chicken feather (rachis and barbs) can be used as a source of a pure structural protein called keratin, which can be exploited for conversion into a number of high-value bioproducts. Additionally, a number of technologies can be used to convert the other biological components of feathers into high-value-added products. Thus, conversion of the waste into valuable products can make feathers an attractive raw material for the production of bioproducts. In this review, possible applications of chicken feathers in a variety of technologies and products are discussed. Using waste feathers as a valuable resource can help the poultry industry dispose of them in an environmentally sustainable manner that also generates extra income, and their valorisation can result in their sustainable conversion into high-value materials and products, provided that cost-effective technologies for converting this waste into useful products exist or are developed.

Keywords: biodegradable product, keratin, poultry waste, feathers, valorisation

Procedia PDF Downloads 296
1059 Some Analytical Characteristics of Red Raspberry Jams

Authors: Cristina Damian, Eduard Malcek, Ana Leahu, Sorina Ropciuc, Andrei Lobiuc

Abstract:

Given today's intense market competition, the food sector must offer an attractive product that at the same time has good quality and is safe for consumers from a health standpoint. Known for their high content of antioxidant compounds, especially anthocyanins, which have proven human health benefits, berries from plants of the Rosaceae family have significantly high levels of phytochemicals: phenolic flavonoids such as anthocyanins, ellagic acid (a tannin), quercetin, gallic acid, cyanidin, pelargonidin, catechins, kaempferol, and salicylic acid. Colour and bioactive compounds, such as vitamin C and anthocyanins, are important for the attractiveness of berries and their preserved products. The levels of bioactive compounds and the sensory properties of the product as it reaches the consumer depend on the raw material, i.e., the berries used, the processing, and the storage conditions. In this study, four varieties of raspberry jam were analyzed, three of which were purchased commercially at reasonable prices, precisely to cover as large a sample of the consumer population as possible. The fourth was made at home according to a traditional recipe without added sweeteners or preservatives. The homemade red raspberry jam had a sugar concentration of 64.9% and was the most appreciated of all the assortments, owing to the taste and aroma of the product. The SCHWARTAU assortment was ranked second by the participants in the sensory analysis. The quality/price relationship also held here, the study finding that a high-quality product carries a higher purchase price. The study thus presents the preferences of the participating sample by age category.

Keywords: red raspberry, jam, antioxidant, colour, sensory analysis

Procedia PDF Downloads 10
1058 Fight against Money Laundering with Optical Character Recognition

Authors: Saikiran Subbagari, Avinash Malladhi

Abstract:

Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess, and mitigate the risks associated with money laundering and to report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, and neural network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching, and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and we discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
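As a minimal illustration of the watchlist-screening step, the following Python sketch fuzzy-matches OCR-extracted names against a watchlist using the standard library's `difflib`; the function name, similarity threshold, and example names are illustrative assumptions, not details taken from the paper.

```python
from difflib import SequenceMatcher

def screen_against_watchlist(extracted_names, watchlist, threshold=0.85):
    """Flag OCR-extracted names whose similarity to any watchlist entry
    meets the threshold, tolerating OCR character errors."""
    hits = []
    for name in extracted_names:
        for entry in watchlist:
            score = SequenceMatcher(None, name.lower(), entry.lower()).ratio()
            if score >= threshold:
                hits.append((name, entry, round(score, 2)))
    return hits

# "0" misread for "o" by the OCR engine still matches closely enough.
watchlist = ["John Doe", "Acme Trading Ltd"]
extracted = ["J0hn Doe", "Beta Corp"]
hits = screen_against_watchlist(extracted, watchlist)
```

Fuzzy matching rather than exact string comparison is what makes the screening robust to the character-level errors that OCR typically introduces.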

Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition

Procedia PDF Downloads 144
1057 Status of Participative Governance Practices in Higher Education: Implications for Stakeholders' Transformative Role-Assumption

Authors: Endalew Fufa Kufi

Abstract:

The research investigated the roles of stakeholders such as students, teachers, and administrators in the practice of good governance in higher education by examining the specific contributions of top officials, teachers, and students to workable ties and productive interchange at Adama Science and Technology University. Attention was given to participation, fairness, and exemplariness as key indicators of good governance. The target university was chosen because the researcher's familiarity with it afforded dependable data, access to respondents, and manageable data processing. A descriptive survey design was used to describe the stakeholders' roles in university governance and to reflect on the nature of their participation. The research centred on the administration, where supportive groups such as central administrators and frontline service providers played a part, and on academia, where teachers and students were the target. In total, 60 teachers, 40 students, and 15 administrative officers served as respondents. Data were collected as self-reports through open-ended questionnaires. The findings indicated that, while vertical interchanges of academic and administrative routines flowed normally on a top-down basis, planned involvement of stakeholders in decision-making, and reasonable communication of roles and changes in decisions by top officials, were not efficiently practiced. Moreover, good modelling was not witnessed to the fullest extent; rather, a very wide gap between the academic and administrative staff was observed, as was also the case between teachers and students. The implication is that, owing to the shortage of a participative atmosphere and the waning of fairness in governance, routine practices have persisted as vicious circles of governance.

Keywords: governance, participative, stakeholders, transformative, role-assumption

Procedia PDF Downloads 398
1056 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks

Authors: Alaa Allakany, Koji Okamura

Abstract:

Multicast is an efficient and scalable technology for data distribution that optimizes network resources. In IP networks, however, responsibility for managing multicast groups is distributed among network routers, which causes limitations such as delays in processing group events, high bandwidth consumption, and redundant tree calculation. Software-Defined Networking (SDN), represented by OpenFlow, has been presented as a solution to many of these problems: the control plane and data plane are separated by shifting control and management to a remote centralized controller, and the routers act as forwarders only. In this paper, we propose a fast switching mechanism for handling link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modified OpenFlow switch functions that switch quickly to a backup subtree rather than reporting to the controller. In this work, we implement a multicasting OpenFlow controller; this centralized controller is the core of our multicasting approach and is responsible for (1) constructing the multicast tree and (2) handling multicast group events and maintaining multicast state. Finally, OpenFlow switch functions are modified for fast switching to backup paths. Forwarders forward multicast packets based on multicast routing entries generated by the centralized controller. Tabu Search is used as the heuristic algorithm for constructing a near-optimal multicast tree and for keeping the tree near-optimal when members join or leave the multicast group (group events).
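The local switchover idea, failing over to a precomputed backup subtree without a round trip to the controller, can be sketched in a few lines; `MulticastSwitch`, its group names, and port numbers are hypothetical illustrations of the concept, not the actual OpenFlow switch modifications.

```python
# Hedged sketch: each switch stores, per multicast group, a primary
# next-hop port and a precomputed backup port (the backup subtree).
class MulticastSwitch:
    def __init__(self, primary, backup):
        self.primary = primary        # group id -> primary output port
        self.backup = backup          # group id -> precomputed backup port
        self.failed_ports = set()

    def link_down(self, port):
        # A port failure is handled locally; no controller round trip.
        self.failed_ports.add(port)

    def forward_port(self, group):
        port = self.primary[group]
        if port in self.failed_ports:
            return self.backup[group]  # fast local switchover
        return port

sw = MulticastSwitch(primary={"g1": 2}, backup={"g1": 5})
assert sw.forward_port("g1") == 2      # normal operation
sw.link_down(2)
assert sw.forward_port("g1") == 5      # after failure: backup subtree
```

In real OpenFlow deployments a comparable effect is achieved with fast-failover group-table entries, so the controller only needs to be consulted afterwards to recompute a near-optimal tree.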

Keywords: multicast tree, software define networks, tabu search, OpenFlow

Procedia PDF Downloads 263
1055 Beneficiation of Pulp and Paper Mill Sludge for the Generation of Single Cell Protein for Fish Farming

Authors: Lucretia Ramnath

Abstract:

Fishmeal is extensively used for fish farming but is an expensive feed ingredient. A cheaper alternative to fishmeal is single cell protein (SCP), which can be cultivated on fermentable sugars recovered from organic waste streams such as pulp and paper mill sludge (PPMS). PPMS has a high cellulose content and is thus suitable for glucose recovery through enzymatic hydrolysis, although this is hampered by lignin and ash. To render PPMS amenable to enzymatic hydrolysis, the PPMS was pre-treated to produce a glucose-rich hydrolysate, which served as a feedstock for the production of fungal SCP. The PPMS used in this study had the following composition: 72.77% carbohydrates, 8.6% lignin, and 18.63% ash. The pre-treatments had no significant effect on lignin composition but had a substantial effect on carbohydrate and ash content. Enzymatic hydrolysis of screened PPMS was previously optimized through response surface methodology (RSM) and a 2-factorial design. The optimized protocol resulted in a hydrolysate containing 46.1 g/L of glucose, of which 86% was recovered after downstream processing by passing through a 100-mesh sieve (38 µm pore size). Vogel's medium supplemented with 10 g/L hydrolysate successfully supported the growth of Fusarium venenatum under standard growth conditions (pH 6, 200 rpm, 2.88 g/L ammonium phosphate, 25°C). A maximum F. venenatum biomass of 45 g/L was produced, with a yield coefficient of 4.67. Pulp and paper mill sludge hydrolysate contained approximately five times more glucose than was needed for SCP production and served as a suitable carbon source. We have shown that PPMS can be successfully beneficiated for SCP production.

Keywords: pulp and paper waste, fungi, single cell protein, hydrolysate

Procedia PDF Downloads 207
1054 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search

Authors: Wenbo Wang, Yi-Fang Brook Wu

Abstract:

The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means of addressing this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context from general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; and 3) focusing on the parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that 1) augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, improves fact-checking performance, and 2) SAC with auto-selected references outperforms existing fact-checking approaches with manually selected references. Future directions of this study include i) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, and ii) exploring semantic relations in claims and references to further enhance fact-checking.

Keywords: fact checking, claim verification, deep learning, natural language processing

Procedia PDF Downloads 62
1053 Experimental Investigation of the Effect of Glass Granulated Blast Furnace Slag on Pavement Quality Concrete Pavement Made of Recycled Asphalt Pavement Material

Authors: Imran Altaf Wasil, Dinesh Ganvir

Abstract:

Due to a scarcity of virgin aggregates, the use of reclaimed asphalt pavement (RAP) as a substitute for natural aggregates has gained popularity. Although RAP is recycled into asphalt pavement, excess RAP remains, and its use in concrete pavements has expanded in recent years. According to a survey, 98 percent of India's pavements are flexible; as a result, the maintenance and reconstruction of such pavements generate RAP, which can be reused in concrete pavements as well as in the surface course, base course, and sub-base of flexible pavements. Various studies on the properties of reclaimed asphalt pavement and its optimal requirements for use in concrete have been conducted over the years. In this study, a total of four different mixes were prepared by partially replacing natural aggregates with RAP in different proportions. It was found that as the replacement level of natural aggregates with RAP increased, the mechanical and durability properties decreased. To increase the mechanical strength of the mixes, 40% Glass Granulated Blast Furnace Slag (GGBS) was used, and it was found that with the replacement of cement by 40% GGBS, the mechanical and durability properties of the RAP-inclusive PQC mixes improved. The improvement in properties is attributed to the processing technique used to remove the contaminant layers present on the coarse RAP aggregates. The replacement of natural aggregate with RAP was carried out in proportions of 20%, 40%, and 60%, along with the partial replacement of cement by 40% GGBS. All the mixes surpassed the design target values of 40 MPa in compression and 4.5 MPa in flexure, making the approach considerably more economical and feasible.

Keywords: reclaimed asphalt pavement, pavement quality concrete, glass granulated blast furnace slag, mechanical and durability properties

Procedia PDF Downloads 116
1052 Effect of Marketing Strategy on the Performance of Small and Medium Enterprises in Nigeria

Authors: Kadiri Kayode Ibrahim, Kadiri Omowunmi

Abstract:

The research study evaluated the effect of marketing strategy on the performance of SMEs in Abuja. This was achieved, specifically, through an examination of the effect of the disaggregated components of marketing strategy (product, price, promotion, placement, and process) on sales volume (as a proxy for performance). The study design was causal in nature, using quantitative methods with a cross-sectional survey carried out through the administration of a structured questionnaire. A multistage sample of 398 respondents provided the primary data used in the study, and path analysis was employed to process the data and test the formulated hypotheses. Findings indicated that all modelled components of marketing strategy were positive and statistically significant determinants of performance among the SMEs studied. It was therefore recommended that SMEs invest in continuous product innovation and development in line with the needs and preferences of the target market, and adopt a dynamic pricing strategy that considers both cost factors and market conditions. It is also crucial that these businesses adopt marketing communication measures that stimulate brand awareness and increase engagement, including the use of social media platforms and content marketing. Additionally, owner-managers should ensure that their products are readily available to their target customers through an emphasis on availability and accessibility measures. Furthermore, a commitment to consistent optimization of internal operations is crucial for improved productivity, reduced costs, and enhanced customer satisfaction, which in turn will positively impact overall performance.

Keywords: product, price, promotion, placement

Procedia PDF Downloads 43
1051 StockTwits Sentiment Analysis on Stock Price Prediction

Authors: Min Chen, Rubi Gupta

Abstract:

Understanding and predicting stock market movements is a challenging problem. It is believed that stock markets are partially driven by public sentiment, which has led to numerous research efforts to predict stock market trends using public sentiment expressed on social media such as Twitter, but with limited success. Recently, the microblogging website StockTwits has become increasingly popular for users to share their discussions of and sentiments about stocks and the financial market. In this project, we analyze the text content of StockTwits tweets and extract financial sentiment using text featurization and machine learning algorithms. StockTwits tweets are first pre-processed using techniques including stopword removal, special character removal, and case normalization to remove noise. Features are extracted from these preprocessed tweets through a text featurization process using bags of words, N-gram models, TF-IDF (term frequency-inverse document frequency), and latent semantic analysis. Machine learning models are then trained to classify each tweet's sentiment as positive (bullish) or negative (bearish). The correlation between the aggregated daily sentiment and daily stock price movement is then investigated using Pearson's correlation coefficient. Finally, the sentiment information is applied together with time series stock data to predict stock price movement. The experiments on five companies (Apple, Amazon, General Electric, Microsoft, and Target) over a nine-month period demonstrate the effectiveness of our approach in improving prediction accuracy.
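The correlation step of the pipeline, Pearson's coefficient between aggregated daily sentiment and daily price movement, can be sketched in plain Python; the toy sentiment and return series below are invented for illustration and are not the study's data.

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length series, e.g. daily
    aggregate sentiment vs. daily price movement."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy series: fraction of bullish tweets per day vs. daily return (%).
sentiment = [0.6, 0.7, 0.4, 0.8, 0.5]
returns   = [0.5, 1.1, -0.8, 1.4, 0.1]
r = pearson(sentiment, returns)   # close to +1 for these toy series
```

A coefficient near +1 would indicate that bullish days tend to coincide with positive price moves, which is exactly the relationship the study probes before feeding sentiment into the price-prediction model.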

Keywords: machine learning, sentiment analysis, stock price prediction, tweet processing

Procedia PDF Downloads 156
1050 Accessibility Assessment of School Facilities Using Geospatial Technologies: A Case Study of District Sheikhupura

Authors: Hira Jabbar

Abstract:

Education is vital for the inclusive growth of an economy and a critical contributor to investment in human capital. Like other developing countries, Pakistan faces enormous challenges regarding the provision of public facilities: improper infrastructure planning, an accelerating population growth rate, and poor accessibility. Rapid advancements and innovations in GIS and RS techniques have proved to be useful tools for better planning and decision-making to meet these challenges. The present study therefore incorporates GIS and RS techniques to investigate the spatial distribution of school facilities, identify settlements with served and unserved populations, find potential areas for new schools based on population, and develop an accessibility index to evaluate access to schools. For this purpose, high-resolution WorldView imagery was used to map the road network, settlements, and school facilities and to generate school accessibility for each level. Landsat 8 imagery was utilized to extract built-up area by applying pre- and post-processing models, and LandScan 2015 was used for population statistics. Service area analysis was performed using the Network Analyst extension in ArcGIS 10.3, and the results were evaluated for served and underserved areas and populations. An accessibility tool was used to evaluate a set of potential destinations to determine which is most accessible given the population distribution. The findings of the study may help town planners and education authorities understand the existing patterns of school facilities. It is concluded that GIS and remote sensing can be effectively used in urban transport and facility planning.
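As a simplified stand-in for the network-based service-area analysis, the sketch below computes each settlement's straight-line distance to its nearest school; coordinates are assumed to be in a projected system (metres), and the settlement and school locations are invented for illustration (a real analysis would measure travel distance along the road network, as done with Network Analyst).

```python
import math

def nearest_facility(settlements, schools):
    """For each settlement, the straight-line distance to its closest
    school, in the same projected units as the input coordinates."""
    out = {}
    for name, (sx, sy) in settlements.items():
        out[name] = min(math.hypot(sx - fx, sy - fy) for fx, fy in schools)
    return out

# Hypothetical projected coordinates (metres).
settlements = {"A": (0, 0), "B": (6, 8)}
schools = [(3, 4), (10, 10)]
distances = nearest_facility(settlements, schools)
```

Thresholding these distances against a service radius is one simple way to split settlements into served and unserved, mirroring the served/underserved evaluation described above.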

Keywords: accessibility, geographic information system, landscan, worldview

Procedia PDF Downloads 325
1049 Marketing Parameters on Consumer's Perceptions of Farmed Sea Bass in Greece

Authors: Sophia Anastasiou, Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou

Abstract:

Wild fish are considered tastier and are offered in fish restaurants at twice the price of farmed fish. Several chemical and structural differences can affect consumers' attitudes toward farmed fish. The structure and chemical composition of fish muscle are also important for the performance of farmed fish during handling, storage, and processing. In the present work, we present the chemical and sensory parameters used as indicators of fish flesh quality, and we investigated consumers' perceptions of farmed sea bass and the organoleptic differences between samples of wild and farmed sea bass. A questionnaire was distributed to a group of regular sea bass consumers of various ages. The questionnaire included a survey of perceived differences in taste and appearance between wild and farmed sea bass. A significant percentage (>40%) of the participants stated a perception that wild sea bass tastes superior to farmed fish. The participants then took part in an organoleptic assessment of wild and farmed sea bass prepared and cooked by a local fish restaurant. Portions were evaluated for the intensity of sensory attributes from 1 (low intensity) to 5 (high intensity). The results indicate that, contrary to the assessors' stated perception, farmed sea bass scored better in all organoleptic parameters assessed, with marked superiority in texture and taste over the wild sea bass. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF), Research Funding Program ARCHIMEDES III: Investing in knowledge society through the European Social Fund.

Keywords: fish marketing, farmed fish, seafood quality, wild fish

Procedia PDF Downloads 403
1048 A Two Server Poisson Queue Operating under FCFS Discipline with an ‘m’ Policy

Authors: R. Sivasamy, G. Paulraj, S. Kalaimani, N. Thillaigovindan

Abstract:

For profitable businesses, queues are double-edged swords, and the pain of long wait times often frustrates customers. This paper suggests a technical way of reducing that pain through a Poisson M/M1,M2/2 queueing system operated by two heterogeneous servers, with the objective of minimising the mean sojourn time of customers served under the queue discipline 'First Come First Served with an m policy' (FCFS-m policy). Arrivals to the system form a Poisson process of rate λ and are served by two exponential servers. The service times of successive customers at server j are independent and identically distributed (i.i.d.) random variables, each exponentially distributed with rate parameter μj (j=1, 2). The primary condition for implementing the FCFS-m policy on these service rates μj (j=1, 2) is that either (m+1)µ2 > µ1 > mµ2 or (m+1)µ1 > µ2 > mµ1 must be satisfied. Further, waiting customers prefer server-1 whenever it becomes available for service, and server-2 should be installed if and only if the queue length exceeds the threshold value m. Steady-state results on queue length and waiting time distributions have been obtained. A simple way of tracing the optimal service rate μ*2 of server-2 is illustrated in a specific numerical exercise that equalizes the average queue-length cost with the service cost. Assuming that server-1 dynamically adjusts its service rate to μ1 while the system size is strictly less than T=(m+2) with μ2=0, and to μ1+μ2 with μ2>0 when the system size is greater than or equal to T, the corresponding steady-state results of M/M1+M2/1 queues have been deduced from those of M/M1,M2/2 queues. To show that this investigation has a viable application, the results for M/M1+M2/1 queues have been used to model the processing of waiting messages at a single computer node and to measure the power consumption of the node.
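A minimal numerical sketch of the deduced M/M1+M2/1 model, assuming illustrative parameter values: the system behaves as a birth-death chain whose service rate is μ1 below the threshold T and μ1+μ2 at or above it, so the steady-state probabilities follow from the detailed balance equations (the chain is truncated at N states for computation).

```python
def steady_state(lam, mu1, mu2, T, N=500):
    """Steady-state probabilities p[0..N] of the birth-death chain with
    arrival rate lam, service rate mu1 for system size n < T, and
    mu1 + mu2 for n >= T (detailed balance, truncated at N states)."""
    p = [1.0]
    for n in range(1, N + 1):
        service = mu1 if n < T else mu1 + mu2
        p.append(p[-1] * lam / service)   # p[n] = p[n-1] * lam / service(n)
    total = sum(p)
    return [q / total for q in p]

def mean_system_size(lam, mu1, mu2, T):
    return sum(n * pn for n, pn in enumerate(steady_state(lam, mu1, mu2, T)))

# Illustrative rates: activating server-2 above the threshold shortens
# the queue compared with running server-1 alone.
with_backup = mean_system_size(1.0, 1.2, 0.8, T=3)
single_only = mean_system_size(1.0, 1.2, 0.8, T=400)  # threshold never reached
```

With these rates the single-server case is the classic M/M/1 with ρ = λ/μ1 ≈ 0.833 and mean system size ρ/(1-ρ) = 5, while the threshold policy cuts the mean to well under half of that, which is the kind of trade-off the queue-length-cost versus service-cost exercise quantifies.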

Keywords: two heterogeneous servers, M/M1, M2/2 queue, service cost and queue length cost, M/M1+M2/1 queue

Procedia PDF Downloads 362
1047 Innovative Waste Management Practices in Remote Areas

Authors: Dolores Hidalgo, Jesús M. Martín-Marroquín, Francisco Corona

Abstract:

Municipal waste consists of a variety of items discarded daily by the population. It is usually collected by municipalities and includes waste generated by households, commercial activities (local shops), and public buildings. The composition of municipal waste varies greatly from place to place, being mostly related to levels and patterns of consumption, rates of urbanization, lifestyles, and local or national waste management practices. Each year, a huge amount of resources is consumed in the EU and, correspondingly, a huge amount of waste is produced. The environmental problems arising from the management and processing of these waste streams are well known and include impacts on land, water, and air. The situation in remote areas is even worse: difficult access when climatic conditions are adverse, remoteness from centralized municipal treatment systems, and dispersion of the population are all factors that make remote areas a real challenge for municipal waste treatment. Furthermore, the scope of the problem increases significantly because of a general lack of awareness of the risks involved, together with a weak culture of responsible waste minimization and recycling. The aim of this work is to analyze the existing situation in remote areas with respect to the production of municipal waste and to evaluate the efficiency of different management alternatives. Ideas for improving waste management in remote areas include, for example: implementing self-management systems for the organic fraction; establishing door-to-door collection models; promoting small-scale treatment facilities; or adjusting local rates of waste generation.

Keywords: door to door collection, islands, isolated areas, municipal waste, remote areas, rural communities

Procedia PDF Downloads 260
1046 Efficiency of PCR-RFLP for the Identification of Adulteries in Meat Formulation

Authors: Hela Gargouri, Nizar Moalla, Hassen Hadj Kacem

Abstract:

Meat adulteration, which affects the safety and quality of food, is becoming one of the main public concerns across the world. Its drastic consequences for the meat industry have highlighted the urgent necessity of controlling product quality and have pointed out the complexity of both supply and processing circuits. Given the expansion of this problem, authenticity testing of foods, particularly meat and its products, is deemed crucial to avoid unfair market competition and to protect consumers from fraudulent practices of meat adulteration. The adoption of authentication methods by food quality-control laboratories is becoming a priority issue. However, in some developing countries, the number of food tests is still insignificant, although a variety of processed and traditional meat products are widely consumed. Little attention has been paid to providing an easy, fast, reproducible, and low-cost molecular test that could be conducted in a basic laboratory. In the current study, the 359 bp fragment of the cytochrome-b gene was mapped by PCR-RFLP, first using fresh biological supports (DNA and meat) and then turkey salami as an example of commercial processed meat. The technique was established through several optimizations, notably the selection of restriction enzymes. Digestion with BsmAI, SspI, and TaaI succeeded in identifying the seven included animal species, both when the meat consisted of a single species and when it was a mixture of different origins. In this study, the PCR-RFLP technique with universal primers met our needs by providing an indirect sequencing method in which restriction enzymes identify the species-specific sites on the same amplicon, reducing the number of tests required.
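The species-discrimination logic of PCR-RFLP, namely that different variants of the same amplicon yield different restriction fragment patterns, can be illustrated with a toy in-silico digest; the sequences below are invented, and for simplicity the cut is made at the end of the recognition site rather than at the enzyme's true cut offset.

```python
def rflp_fragments(sequence, site):
    """Fragment lengths produced by cutting after every occurrence of
    the recognition site (simplified in-silico digest)."""
    fragments, start = [], 0
    pos = sequence.find(site)
    while pos != -1:
        cut = pos + len(site)
        fragments.append(cut - start)
        start = cut
        pos = sequence.find(site, cut)
    fragments.append(len(sequence) - start)
    return fragments

# Two toy "species" amplicons: one carries the site twice, one not at
# all, so their fragment patterns differ and the species are told apart.
species_a = rflp_fragments("AAGTCTCCGTCTCAA", "GTCTC")   # -> [7, 6, 2]
species_b = rflp_fragments("AAGGCCTTAAGGCCA", "GTCTC")   # -> [15]
```

On a gel, these length patterns are what distinguish the species on a single universal-primer amplicon, which is why one amplification plus a few digests can replace many species-specific tests.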

Keywords: adulteration, animal species, authentication, meat, mtDNA, PCR-RFLP

Procedia PDF Downloads 112
1045 Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis

Authors: Yongqin Zhang, John Lett

Abstract:

Geospatial technologies have been increasingly used in agriculture for various applications and purposes in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to grow cycles and crop health. In this research, we conducted a practical project that used drone technology to design and map optimal locations and layouts of irrigation systems for agricultural farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agricultural fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via a 20-megapixel RGB camera mounted on the drone. The DroneDeploy web application was then utilized to develop the flight plans and for subsequent image processing and measurements. The images were orthorectified and processed to estimate the area of the fields and to measure the locations of the water lines and sprinkler heads. Field measurements were conducted to measure ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine the optimal layout and quantitative estimates for the irrigation systems. We created maps and tabular estimates showing the locations, spacing, number, and layout of sprinkler heads and water lines needed to cover the agricultural fields. This project provides scientific guidance to Mississippi farmers for precision agricultural irrigation practice.
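The field-area step reduces to the shoelace formula once the orthorectified corner coordinates are in a projected system; the rectangular field below is an invented example, not one of the Mississippi study fields.

```python
def polygon_area(coords):
    """Planar polygon area (shoelace formula). Coordinates are assumed
    to be projected, e.g. UTM metres, so the result is in square metres."""
    s = 0.0
    n = len(coords)
    for i in range(n):
        x1, y1 = coords[i]
        x2, y2 = coords[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# A hypothetical 100 m x 250 m field: 25,000 m^2, i.e. 2.5 ha.
field = [(0, 0), (100, 0), (100, 250), (0, 250)]
area_m2 = polygon_area(field)
```

Dividing such an area by the coverage circle of a chosen sprinkler head gives a first-cut estimate of the head count, which the photogrammetric layout work then refines.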

Keywords: drone images, agriculture, irrigation, geospatial analysis, photogrammetric measurements

Procedia PDF Downloads 76