Search results for: melt processing
1056 The Weavability of Waste Plants and Their Application in Fashion and Textile Design
Authors: Jichi Wu
Abstract:
The dwindling of natural resources calls for more sustainable design. New technologies can bring new materials and processing techniques to the fashion industry and push it toward a more sustainable future. This paper therefore reviews cutting-edge research on the life cycle of closed-loop products and seeks innovative ways to recycle and upcycle. To that end, the author investigated how low-utilization plants and leftover fibre can be turned into ecological textiles for fashion. By examining the physical and chemical properties of these textiles (cellulose content and fibre form) to assess their wearability, the paper analyses the prospects of bio-fabrics (weavable plants) in body-oriented fashion design and their potential in sustainable fashion and textile design. By extracting cellulose from nine different types or sections of plants, the author aims to identify a method (such as ionic-solution extraction) that maximises the weavability of plants, so that raw materials can be converted into fabrics more effectively. All first-hand experimental data were carefully collected and analysed under the guidance of related theories, and the results of the analysis were recorded in detail and presented in an accessible way. Several research methods were adopted in the project, including field trips and experiments for comparison and material recycling, and cross-disciplinary cooperation was sought for related knowledge and theories. On this basis, experimental data were collected, analysed, and interpreted into descriptive and visual results. The conclusions suggest that weavable plant fibres can be applied to develop new textiles and fashion.
Keywords: wearable bio-textile, sustainability, economy, ecology, technology, weavability, fashion design
Procedia PDF Downloads 146
1055 The Effect of Main Factors on Forces during FSJ Processing of AA2024 Aluminum
Authors: Dunwen Zuo, Yongfang Deng, Bo Song
Abstract:
An attempt is made here to measure the forces in three directions, under conditions of different feed speeds, different tool tilt angles, and with or without a pin on the tool, by using an octagonal ring dynamometer in the FSJ (Friction Stir Joining) process of AA2024 aluminum, and to investigate how four main factors influence the forces in the FSJ process. It is found that a high feed speed leads to a small feed force and a small lateral force, but to a large feed force in the stable joining stage of the process. As the rotational speed increases, the time required for the axial force to drop from its maximum to its minimum in the push-up process increases. In the stable joining stage, the rotational speed has little effect on the feed force, while a large rotational speed leads to a small lateral force and a small axial force. The maximum axial force increases with the tool tilt angle during the downward movement stage. At the moment feeding starts, the amplitude of the axial force increase becomes larger as the tool tilt angle increases. In the stable joining stage, as the tool tilt angle increases, the axial force increases, the lateral force decreases, and the feed force remains almost unchanged. A tool with a pin decreases the axial force in the downward movement stage; in the stable joining stage, the tool with a pin produces a larger feed force and lateral force but a smaller axial force than the tool without a pin.
Keywords: FSJ, force factor, AA2024 aluminum, friction stir joining
Procedia PDF Downloads 487
1054 Memory Based Reinforcement Learning with Transformers for Long Horizon Timescales and Continuous Action Spaces
Authors: Shweta Singh, Sudaman Katti
Abstract:
The most well-known sequence models make use of complex recurrent neural networks in an encoder-decoder configuration. The model used in this research makes use of a transformer, which is based purely on a self-attention mechanism, without relying on recurrence at all. More specifically, encoders and decoders which make use of self-attention and operate based on a memory, are used. In this research work, results for various 3D visual and non-visual reinforcement learning tasks designed in Unity software were obtained. Convolutional neural networks, more specifically, nature CNN architecture, are used for input processing in visual tasks, and comparison with standard long short-term memory (LSTM) architecture is performed for both visual tasks based on CNNs and non-visual tasks based on coordinate inputs. This research work combines the transformer architecture with the proximal policy optimization technique used popularly in reinforcement learning for stability and better policy updates while training, especially for continuous action spaces, which are used in this research work. Certain tasks in this paper are long horizon tasks that carry on for a longer duration and require extensive use of memory-based functionalities like storage of experiences and choosing appropriate actions based on recall. The transformer, which makes use of memory and self-attention mechanism in an encoder-decoder configuration proved to have better performance when compared to LSTM in terms of exploration and rewards achieved. Such memory based architectures can be used extensively in the field of cognitive robotics and reinforcement learning.Keywords: convolutional neural networks, reinforcement learning, self-attention, transformers, unity
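A minimal sketch of the memory-plus-self-attention idea described above, written in PyTorch (an assumed implementation choice; the Unity task setup, layer sizes, and memory length in the paper are not specified here). A rolling buffer of past observation embeddings is attended over together with the current embedding, and the output feeds a Gaussian policy head of the kind used with proximal policy optimization for continuous actions.

```python
import torch
import torch.nn as nn

class MemoryAttentionPolicy(nn.Module):
    """Hypothetical sketch: attend over a rolling memory of past observation
    embeddings and emit mean/std for a continuous (PPO-style) policy."""
    def __init__(self, obs_dim, act_dim, d_model=64, n_heads=4, mem_len=32):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mu = nn.Linear(d_model, act_dim)
        self.log_std = nn.Parameter(torch.zeros(act_dim))
        self.mem_len = mem_len
        self.memory = []                              # past embeddings (detached)

    def forward(self, obs):                           # obs: (1, obs_dim)
        x = self.embed(obs)                           # (1, d_model)
        self.memory.append(x.detach())
        self.memory = self.memory[-self.mem_len:]     # bounded memory window
        seq = torch.stack(self.memory, dim=1)         # (1, mem, d_model)
        out, _ = self.attn(x.unsqueeze(1), seq, seq)  # query = current step
        h = out.squeeze(1)
        return self.mu(h), self.log_std.exp()

policy = MemoryAttentionPolicy(obs_dim=8, act_dim=2)
mu, std = policy(torch.randn(1, 8))
action = torch.normal(mu, std)                        # sampled continuous action
```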
Procedia PDF Downloads 135
1053 Research on Level Adjusting Mechanism System of Large Space Environment Simulator
Authors: Han Xiao, Zhang Lei, Huang Hai, Lv Shizeng
Abstract:
A space environment simulator is a device for spacecraft testing. The KM8 large space environment simulator built in Tianjin Space City is the largest and most advanced space environment simulator in China. A large deviation in spacecraft level causes the thermal control devices in the spacecraft to work abnormally during the thermal vacuum test. To avoid thermal vacuum test failures, a level adjusting mechanism system was developed for the KM8 large space environment simulator as one of its most important subsystems. According to the level adjusting requirements of spacecraft thermal vacuum tests, a four-fulcrum adjusting model is established. Using data collected from level instruments and displacement sensors, stepping motors controlled by a PLC drive the four supporting legs simultaneously. In addition, a PID algorithm is used to control the temperature of the supporting legs and level instruments, which operate for long periods in the cold, black vacuum environment of the KM8 simulator during thermal vacuum tests. Based on these methods, data acquisition and processing, analysis and calculation, real-time adjustment, and fault alarming are implemented for the level adjusting mechanism system. The level adjusting accuracy reaches 1 mm/m, and the carrying capacity is 20 tons. Debugging showed that the level adjusting mechanism system of the KM8 large space environment simulator can meet the thermal vacuum test requirements of the new generation of spacecraft. The performance and technical indicators of the system, which provides important support for the development of spacecraft in China, are ahead of those of similar equipment worldwide.
Keywords: space environment simulator, thermal vacuum test, level adjusting, spacecraft, parallel mechanism
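The abstract mentions a PLC-driven PID loop for the supporting-leg and level-instrument heaters; the discrete PID form below is a generic Python sketch of that idea (the gains, sample time, and output limits are illustrative assumptions, not values from the KM8 system).

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
    def __init__(self, kp, ki, kd, out_min=0.0, out_max=100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.out_min, min(self.out_max, u))   # clamp heater power (%)

# Example: hold a supporting-leg temperature at 20 degrees C with a 1 s control period.
pid = PID(kp=8.0, ki=0.5, kd=1.0)
power = pid.update(setpoint=20.0, measurement=17.3, dt=1.0)
```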
Procedia PDF Downloads 245
1052 Potential Use of Leaching Gravel as a Raw Material in the Preparation of Geo Polymeric Material as an Alternative to Conventional Cement Materials
Authors: Arturo Reyes Roman, Daniza Castillo Godoy, Francisca Balarezo Olivares, Francisco Arriagada Castro, Miguel Maulen Tapia
Abstract:
Mining waste-based geopolymers are a sustainable alternative to conventional cement materials because they contribute both to the valorization of mining wastes and to new construction materials with a reduced environmental footprint. The objective of this study was to determine the potential of leaching gravel (LG) from hydrometallurgical copper processing to be used as a raw material in the manufacture of geopolymer. NaOH, Na2SiO3 (modulus 1.5), and LG were mixed, wetted with an appropriate amount of tap water, and stirred until a homogeneous paste was obtained. A liquid/solid ratio of 0.3 was used for preparing the mixtures. The paste was then cast in 50 mm cubic moulds for the determination of compressive strength. The samples were left to dry for 24 h at room temperature and unmoulded before analysis after 28 days of curing. The compressive test was conducted in a compression machine (15/300 kN). According to the laser diffraction spectroscopy (LDS) analysis, 90% of the LG particles were below 500 μm. X-ray diffraction (XRD) analysis identified crystalline phases of albite (30%), quartz (16%), anorthite (16%), and phillipsite (14%). X-ray fluorescence (XRF) determinations showed mainly 55% SiO2, 13% Al2O3, and 9% CaO. ICP-OES concentrations of Fe, Ca, Cu, Al, As, V, Zn, Mo, and Ni were 49.545, 24.735, 6.172, 14.152, 239.5, 129.6, 41.1, 15.1, and 13.1 mg kg-1, respectively. The geopolymer samples showed compressive strengths ranging between 2 and 10 MPa. In comparison with the raw-material composition, the amorphous fraction of the geopolymer was 35%, whereas the crystalline percentage of the main mineral phases decreased. Further studies are needed to find the optimal combination of materials to produce a more resistant and environmentally safe geopolymer; in particular, compressive strengths higher than 15 MPa are necessary for use as construction units such as bricks.
Keywords: mining waste, geopolymer, construction material, alkaline activation
Procedia PDF Downloads 93
1051 Experimental Monitoring of the Parameters of the Ionosphere in the Local Area Using the Results of Multifrequency GNSS-Measurements
Authors: Andrey Kupriyanov
Abstract:
In recent years, much attention has been paid worldwide to the problems of ionospheric disturbances and their influence on the signals of global navigation satellite systems (GNSS). This is due to the increase in solar activity, the expansion of the scope of GNSS, the emergence of new satellite systems, the introduction of new frequencies, and many other factors. The influence of the Earth's ionosphere on the propagation of radio signals is an important factor in many applied fields of science and technology. The paper considers the application of transionospheric sounding, using measurements of Global Navigation Satellite System signals, to determine the TEC distribution and the scintillations of the ionospheric layers. To calculate these parameters, the International Reference Ionosphere (IRI) model, refined for the local area, is used. The organization of operational monitoring of ionospheric parameters is analyzed using several NovAtel GPStation6 base stations, which allow primary processing of GNSS measurement data, calculation of TEC and detection of scintillation events, modeling of the ionosphere using the obtained data, data storage, and ionospheric correction of measurements. The study shows that the transionospheric sounding method can reconstruct the altitude distribution of electron concentration over different altitude ranges and provide operational information about the ionosphere, which is necessary for solving a number of practical problems in many application fields. Furthermore, the use of multi-frequency, multi-system GNSS equipment and dedicated software makes it possible to achieve the required accuracy and volume of measurements.
Keywords: global navigation satellite systems (GNSS), GPstation6, international reference ionosphere (IRI), ionosphere, scintillations, total electron content (TEC)
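As an illustration of the dual-frequency principle behind TEC estimation from GNSS code observations, the sketch below uses the standard geometry-free combination; this is textbook GNSS processing rather than the paper's specific GPStation6/IRI workflow, and satellite/receiver biases and smoothing are ignored.

```python
# Slant TEC from dual-frequency pseudoranges (geometry-free combination):
# STEC [el/m^2] = f1^2 * f2^2 / (40.3 * (f1^2 - f2^2)) * (P2 - P1)
F1, F2 = 1575.42e6, 1227.60e6     # GPS L1/L2 carrier frequencies, Hz

def slant_tec_tecu(p1_m, p2_m):
    """Return slant TEC in TECU (1 TECU = 1e16 el/m^2) from code pseudoranges in metres."""
    stec = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2)) * (p2_m - p1_m)
    return stec / 1e16

# Example: a 3.5 m ionospheric divergence between the L2 and L1 code ranges.
print(round(slant_tec_tecu(p1_m=22_000_000.0, p2_m=22_000_003.5), 1), "TECU")
```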
Procedia PDF Downloads 180
1050 Towards Binder-Free and Self Supporting Flexible Supercapacitor from Carbon Nano-Onions and Their Composite with CuO Nanoparticles
Authors: Debananda Mohapatra, Subramanya Badrayyana, Smrutiranjan Parida
Abstract:
Recognizing the upcoming era of carbon nanostructures and their revolutionary applications, we investigated the formation and supercapacitor application of highly pure and hydrophilic carbon nano-onions (CNOs) prepared by an economical one-step flame-synthesis procedure. This facile and scalable method uses an easily available organic carbon source, such as clarified butter, and requires no catalyst, sophisticated instrumentation, high vacuum, or post-processing purification. The active material was conformally coated onto a locally available cotton wipe by a 'sonicating and drying' process to obtain novel, lightweight, inexpensive, flexible, binder-free electrodes with strong adhesion between the nanoparticles and the porous wipe. This electrode, with CNO as the active material, delivers a specific capacitance of 102.16 F/g, an energy density of 14.18 Wh/kg, and a power density of 2448 W/kg, which are the highest values reported so far for a symmetrical two-electrode cell configuration with 1 M Na2SO4 as the electrolyte. Incorporating CuO nanoparticles into these functionalized CNOs by a one-step hydrothermal method raises the specific capacitance to 420 F/g, with deliverable energy and power densities of 58.33 Wh/kg and 4228 W/kg, respectively. The free-standing CNO and CNO-CuO composite electrodes showed excellent cycling performance and stability, retaining 95% and 90% of the initial capacitance, respectively, even after 5000 charge-discharge cycles at a current density of 5 A/g. This work presents a new platform for high-performance supercapacitors for next-generation wearable electronic devices.
Keywords: binder-free, flame synthesis, flexible, carbon nano-onion
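For reference, the specific capacitance, energy density, and power density quoted above are conventionally derived from galvanostatic charge-discharge data; the sketch below shows those standard two-electrode relations (the current, mass, voltage window, and discharge time are placeholder values, not the authors' raw data).

```python
def supercap_metrics(current_a, discharge_time_s, mass_g, delta_v):
    """Standard relations from a galvanostatic discharge curve (two-electrode cell)."""
    c_sp = current_a * discharge_time_s / (mass_g * delta_v)   # specific capacitance, F/g
    e_wh_kg = 0.5 * c_sp * delta_v**2 / 3.6                    # energy density, Wh/kg
    p_w_kg = e_wh_kg * 3600.0 / discharge_time_s               # power density, W/kg
    return c_sp, e_wh_kg, p_w_kg

# Illustrative numbers only: 1 mA on a 1 mg electrode over a 1 V window.
print(supercap_metrics(current_a=1e-3, discharge_time_s=100.0, mass_g=1e-3, delta_v=1.0))
```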
Procedia PDF Downloads 196
1049 To Design an Architectural Model for On-Shore Oil Monitoring Using Wireless Sensor Network System
Authors: Saurabh Shukla, G. N. Pandey
Abstract:
In recent times, oil exploration and monitoring in on-shore areas have gained much importance, considering that oil imports account for 62 percent of India's total imports. An architectural model based on a wireless sensor network is therefore being developed to monitor deep on-shore oil wells and obtain better estimates of oil prospects. The problem faced today is that very few oil-bearing areas are left. Countries like India do not have large areas and resources for oil, and the same is true of most countries, which makes on-shore oil exploration a major challenge; the rise in oil prices has further aggravated the problem. The relative simplicity, small size, and affordable cost of wireless sensor nodes permit dense deployment in on-shore locations for monitoring oil wells, and deploying a wireless sensor network over large areas is very cost-effective. The objective of this system is to send real-time oil monitoring information to the regulatory and welfare authorities so that suitable action can be taken. The system architecture is composed of a sensor network, a processing/transmission unit, and a server, and it can remotely monitor real-time data on oil exploration and monitoring conditions in the identified areas. Wireless sensor network systems are wireless, have scarce power, operate in real time, utilize sensors and actuators as interfaces, and have dynamically changing sets of resources; their aggregate behaviour is important, and location is critical. In this system, communication takes place between the server and the remotely placed sensors, and the server provides the real-time oil exploration and monitoring conditions to the welfare authorities.
Keywords: sensor, wireless sensor network, oil, on-shore level
Procedia PDF Downloads 444
1048 A Review on Valorisation of Chicken Feathers: Current Status and Future Prospects
Authors: Tamrat Tesfaye, Bruce Sithole, Deresh Ramjugernath
Abstract:
Worldwide, the poultry–processing industry generates large quantities of feather by-products that amount to 40 billion kilograms annually. The feathers are considered wastes although small amounts are often processed into valuable products such as feather meal and fertilizers. The remaining waste is disposed of by incineration or by burial in controlled landfills. Improper disposal of these biological wastes contributes to environmental damage and transmission of diseases. Economic pressures, environmental pressures, increasing interest in using renewable and sustainable raw materials, and the need to decrease reliance on non-renewable petroleum resources behove the industry to find better ways of dealing with waste feathers. A closer look at the structure and composition of feathers shows that the whole part of a chicken feather (rachis and barb) can be used as a source of a pure structural protein called keratin which can be exploited for conversion into a number of high-value bio products. Additionally, a number of technologies can be used to convert other biological components of feathers into high value added products. Thus, conversion of the waste into valuable products can make feathers an attractive raw material for the production of bio products. In this review, possible applications of chicken feathers in a variety of technologies and products are discussed. Thus, using waste feathers as a valuable resource can help the poultry industry to dispose of the waste feathers in an environmentally sustainable manner that also generates extra income for the industry. Their valorisation can result in their sustainable conversion into high-value materials and products on the proviso of existence or development of cost-effective technologies for converting this waste into the useful products.Keywords: biodegradable product, keratin, poultry waste, feathers, valorisation
Procedia PDF Downloads 294
1047 Some Analytical Characteristics of Red Raspberry Jams
Authors: Cristina Damian, Eduard Malcek, Ana Leahu, Sorina Ropciuc, Andrei Lobiuc
Abstract:
Given today's strong competition, the food sector must offer the market attractive products that are of good quality and safe for consumers. Known for their high content of antioxidant compounds, especially anthocyanins, which have proven human health benefits, berries from plants of the Rosaceae family contain significantly high levels of phytochemicals: phenolic flavonoids such as anthocyanins, ellagic acid (a tannin), quercetin, gallic acid, cyanidin, pelargonidin, catechins, kaempferol, and salicylic acid. Colour and bioactive compounds, such as vitamin C and anthocyanins, are important for the attractiveness of berries and their preserved products. The levels of bioactive compounds and the sensory properties of the product as it reaches the consumer depend on the raw material, i.e., the berries used, processing, and storage conditions. In this study, four varieties of raspberry jam were analyzed, three of them purchased commercially at reasonable prices, precisely so as to represent as large a share of the consumer population as possible. The fourth assortment was made at home according to a traditional recipe without added sweeteners or preservatives. The homemade red raspberry jam had a sugar concentration of 64.9% and was the most appreciated of all the assortments, owing to its taste and aroma. The SCHWARTAU assortment was ranked second by the participants in the sensory analysis. The quality/price relationship also held, a higher-quality product commanding a higher purchase price. The study thus presents the preferences of the participating sample by age category.
Keywords: red raspberry, jam, antioxidant, colour, sensory analysis
Procedia PDF Downloads 8
1046 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess, and mitigate the risks associated with money laundering and to report any suspicious transactions to governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise in financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into the various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, and neural-network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching, and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
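A minimal sketch of the OCR-plus-screening step discussed above, assuming Tesseract (via pytesseract) as the OCR engine and simple fuzzy string matching against an OFAC-style name list; production AML systems would layer the ML/NLP techniques surveyed in the paper on top of this, and the file name and watchlist entries are placeholders.

```python
from difflib import SequenceMatcher
from PIL import Image
import pytesseract  # assumes the Tesseract binary is installed on the system

def screen_document(image_path, watchlist, threshold=0.85):
    """Extract text from a scanned document and flag near-matches to watchlist names."""
    text = pytesseract.image_to_string(Image.open(image_path)).upper()
    hits = []
    for name in watchlist:
        for line in text.splitlines():
            score = SequenceMatcher(None, name.upper(), line.strip()).ratio()
            if score >= threshold:
                hits.append((name, line.strip(), round(score, 2)))
    return hits

# Hypothetical usage with a placeholder scan and a two-name list.
flags = screen_document("kyc_passport_scan.png", ["JOHN DOE", "ACME TRADING LLC"])
```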
Procedia PDF Downloads 144
1045 Status of Participative Governance Practices in Higher Education: Implications for Stakeholders' Transformative Role-Assumption
Authors: Endalew Fufa Kufi
Abstract:
The research investigated the role of stakeholders such as students, teachers, and administrators in the practices of good governance in higher education by looking into the particular contributions of top officials, teachers, and students in ensuring workable ties and productive interchanges at Adama Science and Technology University. Attention was given to participation, fairness, and exemplariness as key indicators of good governance. The target university was chosen because its familiarity to the researcher ensured dependable data, access to respondents, and manageable data processing. A descriptive survey design was used to describe the relevant roles of the stakeholders in university governance and to reflect on the nature of participation in the practices. The research centred on the administration, where support groups such as central administrators and service providers played a part, and on academia, where teachers and students were the targets. In all, 60 teachers, 40 students, and 15 administrative officers served as respondents. Data were collected as self-reports through open-ended questionnaires. The findings indicated that, while vertical interchanges over academic and administrative routines flowed normally on a top-down basis, planned participation of stakeholders in decision-making and reasonable communication of roles and changes in decisions with top officials were not efficiently practised. Moreover, good modelling was not practised to the fullest extent; rather, a very wide gap was observed between the academic and administrative staff, as was also the case between teachers and students. The implication is that, owing to the shortage of a participative atmosphere and the waning of fairness in governance, routine practices have persisted as vicious circles of governance.
Keywords: governance, participative, stakeholders, transformative, role-assumption
Procedia PDF Downloads 395
1044 Fast Switching Mechanism for Multicasting Failure in OpenFlow Networks
Authors: Alaa Allakany, Koji Okamura
Abstract:
Multicast is an efficient and scalable technology for data distribution that optimizes network resources. In IP networks, however, responsibility for the management of multicast groups is distributed among network routers, which causes limitations such as delays in processing group events, high bandwidth consumption, and redundant tree calculation. Software Defined Networking (SDN), represented by OpenFlow, has been presented as a solution to many of these problems: in SDN, the control plane and data plane are separated by shifting control and management to a remote centralized controller, and the routers are used as forwarders only. In this paper, we propose a fast switching mechanism for handling link failure in the multicast tree, based on the Tabu Search heuristic algorithm and on modifying the OpenFlow switch functions so that the switch moves quickly to the backup sub-tree rather than reporting to the controller. We implement a multicasting OpenFlow controller; this centralized controller is a core part of our multicasting approach and is responsible for (1) constructing the multicast tree and (2) handling multicast group events and maintaining multicast state. Finally, the OpenFlow switch functions are modified for fast switching to backup paths. Forwarders forward the multicast packets based on multicast routing entries generated by the centralized controller. Tabu search is used as the heuristic algorithm for constructing a near-optimum multicast tree and for keeping the tree near optimum when members join or leave the multicast group (group events).
Keywords: multicast tree, software defined networks, tabu search, OpenFlow
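The abstract relies on Tabu Search both for building the multicast tree and for keeping it near optimum; the skeleton below is a generic, problem-agnostic sketch of that heuristic (the cost and neighbourhood functions are placeholders, not the paper's tree-construction model).

```python
def tabu_search(initial, neighbours, cost, iterations=200, tenure=10):
    """Generic Tabu Search: keep the best solution seen while forbidding
    recently visited solutions (the tabu list) so the search can escape local optima."""
    current = best = initial
    tabu = [initial]
    for _ in range(iterations):
        candidates = [s for s in neighbours(current) if s not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)       # best admissible move
        if cost(current) < cost(best):
            best = current
        tabu.append(current)
        tabu = tabu[-tenure:]                     # bounded tabu tenure
    return best

# Toy usage: minimise a quadratic over the integers via +/-1 moves.
best = tabu_search(initial=17,
                   neighbours=lambda x: [x - 1, x + 1],
                   cost=lambda x: (x - 3) ** 2)
```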
Procedia PDF Downloads 263
1043 Beneficiation of Pulp and Paper Mill Sludge for the Generation of Single Cell Protein for Fish Farming
Authors: Lucretia Ramnath
Abstract:
Fishmeal is extensively used for fish farming but is an expensive fish feed ingredient. A cheaper alternative to fishmeal is single cell protein (SCP), which can be cultivated on fermentable sugars recovered from organic waste streams such as pulp and paper mill sludge (PPMS). PPMS has a high cellulose content and is thus suitable for glucose recovery through enzymatic hydrolysis, but hydrolysis is hampered by lignin and ash. To render PPMS amenable to enzymatic hydrolysis, the PPMS was pre-treated to produce a glucose-rich hydrolysate, which served as a feedstock for the production of fungal SCP. The PPMS used in this study had the following composition: 72.77% carbohydrates, 8.6% lignin, and 18.63% ash. The pre-treatments had no significant effect on the lignin composition but had a substantial effect on the carbohydrate and ash content. Enzymatic hydrolysis of screened PPMS had previously been optimized through response surface methodology (RSM) and a 2-factorial design. The optimized protocol resulted in a hydrolysate containing 46.1 g/L of glucose, of which 86% was recovered after downstream processing by passing through a 100-mesh sieve (38 µm pore size). Vogel's medium supplemented with 10 g/L hydrolysate successfully supported the growth of Fusarium venenatum under standard growth conditions: pH 6, 200 rpm, 2.88 g/L ammonium phosphate, 25°C. A maximum F. venenatum biomass of 45 g/L was produced, with a yield coefficient of 4.67. The pulp and paper mill sludge hydrolysate contained approximately five times more glucose than was needed for SCP production and served as a suitable carbon source. We have shown that PPMS can be successfully beneficiated for SCP production.
Keywords: pulp and paper waste, fungi, single cell protein, hydrolysate
Procedia PDF Downloads 205
1042 Automated Fact-Checking by Incorporating Contextual Knowledge and Multi-Faceted Search
Authors: Wenbo Wang, Yi-Fang Brook Wu
Abstract:
The spread of misinformation and disinformation has become a major concern, particularly with the rise of social media as a primary source of information for many people. As a means to address this phenomenon, automated fact-checking has emerged as a safeguard against the spread of misinformation and disinformation. Existing fact-checking approaches aim to determine whether a news claim is true or false, and they have achieved decent veracity prediction accuracy. However, the state-of-the-art methods rely on manually verified external information to assist the checking model in making judgments, which requires significant human resources. This study introduces a framework, SAC, which focuses on 1) augmenting the representation of a claim by incorporating additional context using general-purpose, comprehensive, and authoritative data; 2) developing a search function to automatically select relevant, new, and credible references; 3) focusing on the important parts of the representations of a claim and its reference that are most relevant to the fact-checking task. The experimental results demonstrate that 1) Augmenting the representations of claims and references through the use of a knowledge base, combined with the multi-head attention technique, contributes to improved performance of fact-checking. 2) SAC with auto-selected references outperforms existing fact-checking approaches with manual selected references. Future directions of this study include I) exploring knowledge graphs in Wikidata to dynamically augment the representations of claims and references without introducing too much noise, II) exploring semantic relations in claims and references to further enhance fact-checking.Keywords: fact checking, claim verification, deep learning, natural language processing
Procedia PDF Downloads 60
1041 Experimental Investigation of the Effect of Glass Granulated Blast Furnace Slag on Pavement Quality Concrete Pavement Made of Recycled Asphalt Pavement Material
Authors: Imran Altaf Wasil, Dinesh Ganvir
Abstract:
Due to a scarcity of virgin aggregates, the use of reclaimed asphalt pavement (RAP) as a substitute for natural aggregates has gained popularity. Although RAP is recycled into asphalt pavement, surplus RAP remains, and its use in concrete pavements has expanded in recent years. According to a survey, 98 percent of India's pavements are flexible. The maintenance and reconstruction of such pavements therefore generate RAP, which can be reused in concrete pavements as well as in the surface course, base course, and sub-base of flexible pavements. Various studies on the properties of reclaimed asphalt pavement and its optimal requirements for use in concrete have been conducted over the years. In this study, a total of four different mixes were prepared by partially replacing natural aggregates with RAP in different proportions. It was found that as the replacement level of natural aggregates by RAP increased, the mechanical and durability properties decreased. To increase the mechanical strength of the mixes, 40% glass granulated blast furnace slag (GGBS) was used, and it was found that replacing 40% of the cement with GGBS enhanced the mechanical and durability properties of the RAP-containing PQC mixes. The improvement is attributed to the processing technique used to remove the contaminant layers present on the coarse RAP aggregates. Natural aggregate was replaced with RAP in proportions of 20%, 40%, and 60%, along with the partial replacement of cement by 40% GGBS. All the mixes surpassed the design target values of 40 MPa in compression and 4.5 MPa in flexure, making the approach considerably more economical and feasible.
Keywords: reclaimed asphalt pavement, pavement quality concrete, glass granulated blast furnace slag, mechanical and durability properties
Procedia PDF Downloads 112
1040 Effect of Marketing Strategy on the Performance of Small and Medium Enterprises in Nigeria
Authors: Kadiri Kayode Ibrahim, Kadiri Omowunmi
Abstract:
The research study evaluated the effect of marketing strategy on the performance of SMEs in Abuja. Specifically, this was achieved by examining the effect of the disaggregated components of marketing strategy (product, price, promotion, placement, and process) on sales volume (as a proxy for performance). The study design was causal in nature, using quantitative methods involving a cross-sectional survey administered through a structured questionnaire. A multistage sample of 398 respondents provided the primary data used in the study. Path analysis was subsequently employed to process the obtained data and test the formulated hypotheses. The findings indicated that all modelled components of marketing strategy were positive and statistically significant determinants of performance among businesses in the zone. It was therefore recommended that SMEs invest in continuous product innovation and development in line with the needs and preferences of the target market, and adopt a dynamic pricing strategy that considers both cost factors and market conditions. It is also crucial that businesses in the zone adopt marketing communication measures that stimulate brand awareness and increase engagement, including the use of social media platforms and content marketing. Additionally, owner-managers should ensure that their products are readily available to their target customers through an emphasis on availability and accessibility measures. Furthermore, a commitment to consistent optimization of internal operations is crucial for improved productivity, reduced costs, and enhanced customer satisfaction, which in turn positively impacts overall performance.
Keywords: product, price, promotion, placement
Procedia PDF Downloads 39
1039 StockTwits Sentiment Analysis on Stock Price Prediction
Authors: Min Chen, Rubi Gupta
Abstract:
Understanding and predicting stock market movements is a challenging problem. It is believed stock markets are partially driven by public sentiments, which leads to numerous research efforts to predict stock market trend using public sentiments expressed on social media such as Twitter but with limited success. Recently a microblogging website StockTwits is becoming increasingly popular for users to share their discussions and sentiments about stocks and financial market. In this project, we analyze the text content of StockTwits tweets and extract financial sentiment using text featurization and machine learning algorithms. StockTwits tweets are first pre-processed using techniques including stopword removal, special character removal, and case normalization to remove noise. Features are extracted from these preprocessed tweets through text featurization process using bags of words, N-gram models, TF-IDF (term frequency-inverse document frequency), and latent semantic analysis. Machine learning models are then trained to classify the tweets' sentiment as positive (bullish) or negative (bearish). The correlation between the aggregated daily sentiment and daily stock price movement is then investigated using Pearson’s correlation coefficient. Finally, the sentiment information is applied together with time series stock data to predict stock price movement. The experiments on five companies (Apple, Amazon, General Electric, Microsoft, and Target) in a duration of nine months demonstrate the effectiveness of our study in improving the prediction accuracy.Keywords: machine learning, sentiment analysis, stock price prediction, tweet processing
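A condensed sketch of the pipeline described above, using TF-IDF features, a bullish/bearish classifier, and a Pearson correlation between aggregated daily sentiment and price moves, with scikit-learn and SciPy; the tiny in-line dataset and the daily series are illustrative only, not the study's StockTwits data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from scipy.stats import pearsonr

tweets = ["$AAPL breaking out, loading more", "selling everything, this looks weak",
          "strong earnings, holding long", "bearish divergence, expecting a drop"]
labels = [1, 0, 1, 0]                               # 1 = bullish, 0 = bearish (toy labels)

vec = TfidfVectorizer(ngram_range=(1, 2), stop_words="english")
X = vec.fit_transform(tweets)                       # preprocessed, featurized tweets
clf = LogisticRegression().fit(X, labels)           # sentiment classifier

# Aggregate predicted sentiment per day and correlate with daily price change.
daily_sentiment = [0.8, 0.2, 0.6, 0.4, 0.7]         # placeholder daily bullish ratios
daily_returns   = [0.01, -0.02, 0.005, -0.01, 0.015]
r, p_value = pearsonr(daily_sentiment, daily_returns)
print(f"Pearson r = {r:.2f} (p = {p_value:.2f})")
```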
Procedia PDF Downloads 156
1038 Accessibility Assessment of School Facilities Using Geospatial Technologies: A Case Study of District Sheikhupura
Authors: Hira Jabbar
Abstract:
Education is vital for the inclusive growth of an economy and a critical contributor to investment in human capital. Like other developing countries, Pakistan faces enormous challenges regarding the provision of public facilities, improper infrastructure planning, an accelerating population growth rate, and poor accessibility. The rapid advancement of and innovations in GIS and RS techniques have proved to be useful tools for better planning and decision making to counter these challenges. The present study therefore uses GIS and RS techniques to investigate the spatial distribution of school facilities, identify settlements with served and unserved populations, find potential areas for new schools based on population, and develop an accessibility index to evaluate access to schools. For this purpose, high-resolution WorldView imagery was used to map the road network, settlements, and school facilities and to generate school accessibility for each level. Landsat 8 imagery was utilized to extract the built-up area by applying pre- and post-processing models, and Landscan 2015 was used to analyze population statistics. Service area analysis was performed using the Network Analyst extension in ArcGIS 10.3, and the results were evaluated for served and underserved areas and populations. An accessibility tool was used to evaluate a set of potential destinations and determine which is the most accessible given the population distribution. The findings of the study may help town planners and education authorities understand the existing patterns of school facilities. It is concluded that GIS and remote sensing can be used effectively in urban transport and facility planning.
Keywords: accessibility, geographic information system, landscan, worldview
Procedia PDF Downloads 324
1037 Marketing Parameters on Consumer's Perceptions of Farmed Sea Bass in Greece
Authors: Sophia Anastasiou, Cosmas Nathanailides, Fotini Kakali, Kostas Karipoglou
Abstract:
Wild fish are considered tastier and are offered in fish restaurants at twice the price of farmed fish. Several chemical and structural differences can affect consumers' attitudes towards farmed fish. The structure and chemical composition of fish muscle are also important for the performance of farmed fish during handling, storage, and processing. In the present work, we present the chemical and sensory parameters used as indicators of fish flesh quality, and we investigated consumers' perceptions of farmed sea bass and the organoleptic differences between samples of wild and farmed sea bass. A questionnaire was distributed to a group of regular sea bass consumers of various ages. The questionnaire included a survey of perceived taste and appearance differences between wild and farmed sea bass. A significant percentage (>40%) of the participants stated that they perceive wild sea bass as superior in taste to farmed fish. The participants then took part in an organoleptic assessment of wild and farmed sea bass prepared and cooked by a local fish restaurant. Portions were evaluated for the intensity of sensory attributes from 1 (low intensity) to 5 (high intensity). The results indicate that, contrary to the assessors' perceptions, farmed sea bass scored better in all organoleptic parameters assessed, with marked superiority in texture and taste over the wild sea bass. This research has been co-financed by the European Union (European Social Fund, ESF) and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF), Research Funding Program: ARCHIMEDES III, Investing in knowledge society through the European Social Fund.
Keywords: fish marketing, farmed fish, seafood quality, wild fish
Procedia PDF Downloads 400
1036 A Two Server Poisson Queue Operating under FCFS Discipline with an 'm' Policy
Authors: R. Sivasamy, G. Paulraj, S. Kalaimani, N.Thillaigovindan
Abstract:
For profitable businesses, queues are double-edged swords, and the pain of long wait times often frustrates customers. This paper suggests a technical way of reducing that pain through a Poisson M/M1,M2/2 queueing system operated by two heterogeneous servers, with the objective of minimising the mean sojourn time of customers served under the queue discipline 'First Come First Served with an m policy' (FCFS-m policy). Arrivals to the system form a Poisson process of rate λ and are served by two exponential servers. The service times of successive customers at server j are independent and identically distributed (i.i.d.) random variables, each exponentially distributed with rate parameter μj (j = 1, 2). The primary condition for implementing the FCFS-m policy on these service rates μj (j = 1, 2) is that either (m+1)μ2 > μ1 > mμ2 or (m+1)μ1 > μ2 > mμ1 must be satisfied. Further, waiting customers prefer server 1 whenever it becomes available for service, and server 2 should be installed if and only if the queue length exceeds the threshold value m. Steady-state results on queue length and waiting time distributions have been obtained. A simple way of tracing the optimal service rate μ2* of server 2 is illustrated in a numerical exercise that equalizes the average queue-length cost with the service cost. Assuming that server 1 dynamically adjusts the service rate to μ1 (with μ2 = 0) while the system size is strictly less than T = (m+2), and to μ1 + μ2 (with μ2 > 0) when the system size is greater than or equal to T, the corresponding steady-state results for M/M1+M2/1 queues are deduced from those of M/M1,M2/2 queues. As a viable application of this investigation, the results for M/M1+M2/1 queues are used to model the processing of waiting messages at a single computer node and to measure the power consumption of the node.
Keywords: two heterogeneous servers, M/M1, M2/2 queue, service cost and queue length cost, M/M1+M2/1 queue
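Stated in display form, the admissibility condition quoted above and the standard steady-state relations it feeds into can be written as follows (a sketch of the general framework only; the paper's own closed-form queue-length and sojourn-time expressions are not reproduced here):

```latex
\text{FCFS-}m \text{ admissible: } (m+1)\mu_2 > \mu_1 > m\mu_2
\;\text{ or }\; (m+1)\mu_1 > \mu_2 > m\mu_1, \qquad
\text{stability: } \rho = \frac{\lambda}{\mu_1+\mu_2} < 1, \qquad
W = \frac{L}{\lambda} \;\;\text{(Little's law: mean sojourn time from mean system size).}
```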
Procedia PDF Downloads 361
1035 Innovative Waste Management Practices in Remote Areas
Authors: Dolores Hidalgo, Jesús M. Martín-Marroquín, Francisco Corona
Abstract:
Municipal waste consists of a variety of items discarded every day by the population. It is usually collected by municipalities and includes waste generated by households, commercial activities (local shops), and public buildings. The composition of municipal waste varies greatly from place to place, being related mostly to levels and patterns of consumption, rates of urbanization, lifestyles, and local or national waste management practices. Each year, a huge amount of resources is consumed in the EU and, accordingly, a huge amount of waste is produced. The environmental problems arising from the management and processing of these waste streams are well known and include impacts on land, water, and air. The situation in remote areas is even worse: difficult access when climatic conditions are adverse, the remoteness of centralized municipal treatment systems, and the dispersion of the population are all factors that make municipal waste treatment in remote areas a real challenge. Furthermore, the scope of the problem increases significantly because of a general lack of awareness of the existing risks, together with poor implementation of a culture of waste minimization and responsible recycling. The aim of this work is to analyse the existing situation in remote areas with respect to municipal waste production and to evaluate the efficiency of different management alternatives. Ideas for improving waste management in remote areas include, for example, implementing self-management systems for the organic fraction, establishing door-to-door collection models, promoting small-scale treatment facilities, and adjusting the corresponding waste generation rates.
Keywords: door to door collection, islands, isolated areas, municipal waste, remote areas, rural communities
Procedia PDF Downloads 259
1034 Efficiency of PCR-RFLP for the Identification of Adulteries in Meat Formulation
Authors: Hela Gargouri, Nizar Moalla, Hassen Hadj Kacem
Abstract:
Meat adulteration, which affects the safety and quality of food, is becoming one of the main public concerns across the world. Its drastic consequences for the meat industry highlight the urgent need to control product quality and point to the complexity of both supply and processing circuits. Owing to the expansion of this problem, authenticity testing of foods, particularly meat and its products, is deemed crucial to avoid unfair market competition and to protect consumers from fraudulent practices of meat adulteration. The adoption of authentication methods by food quality-control laboratories is becoming a priority issue. However, in some developing countries, the number of food tests is still insignificant, although a variety of processed and traditional meat products are widely consumed. Little attention has been paid to providing an easy, fast, reproducible, and low-cost molecular test that can be conducted in a basic laboratory. In the current study, the 359 bp fragment of the cytochrome-b gene was mapped by PCR-RFLP, first using fresh biological supports (DNA and meat) and then turkey salami as an example of commercial processed meat. The technique was established through several optimizations, notably the selection of restriction enzymes. Digestion with BsmAI, SspI, and TaaI succeeded in identifying the seven included animal species, both when the meat consisted of an individual species and when it was a mixture of different origins. In this study, the PCR-RFLP technique using universal primers met our needs by providing an indirect sequencing method that identifies, through restriction enzymes, the specificities characterizing the different species on the same amplicon, thereby reducing the number of potential tests.
Keywords: adulteration, animal species, authentication, meat, mtDNA, PCR-RFLP
Procedia PDF Downloads 111
1033 Designing Agricultural Irrigation Systems Using Drone Technology and Geospatial Analysis
Authors: Yongqin Zhang, John Lett
Abstract:
Geospatial technologies have been increasingly used in agriculture for various applications and purposes in recent years. Unmanned aerial vehicles (drones) fit the needs of farmers in farming operations, from field spraying to growing cycles and crop health. In this research, we conducted a practical project that used drone technology to design and map optimal locations and layouts of irrigation systems for agricultural farms. We flew a DJI Mavic 2 Pro drone to acquire aerial remote sensing images over two agricultural fields in Forest, Mississippi, in 2022. Flight plans were first designed to capture multiple high-resolution images via the 20-megapixel RGB camera mounted on the drone. The Drone Deploy web application was then utilized to develop the flight plans and carry out the subsequent image processing and measurements. The images were orthorectified and processed to estimate the field areas and to measure the locations of the water lines and sprinkler heads. Field measurements were conducted to measure the ground targets and validate the aerial measurements. Geospatial analysis and photogrammetric measurements were performed for the study area to determine the optimal layout and quantitative estimates for the irrigation systems. We created maps and tabular estimates to show the locations, spacing, number, and layout of sprinkler heads and water lines needed to cover the agricultural fields. This research project provides scientific guidance to Mississippi farmers for precision agricultural irrigation practice.
Keywords: drone images, agriculture, irrigation, geospatial analysis, photogrammetric measurements
Procedia PDF Downloads 73
1032 Application of Remote Sensing for Monitoring the Impact of Lapindo Mud Sedimentation for Mangrove Ecosystem, Case Study in Sidoarjo, East Java
Authors: Akbar Cahyadhi Pratama Putra, Tantri Utami Widhaningtyas, M. Randy Aswin
Abstract:
Indonesia, as an archipelagic nation, has a very long coastline with large potential marine resources, one of which is the mangrove ecosystem. The Lapindo mudflow disaster in Sidoarjo, East Java, required the mudflow to be channelled into the sea through the Brantas and Porong rivers. The mud material transported by the river flow is feared to be dangerous because it contains harmful substances such as heavy metals. This study aims to map the mangrove ecosystem in terms of its density and to determine how large an impact the Lapindo mud disaster has had on the mangrove ecosystem, accompanied by efforts to manage the ecosystem so that its continuity is maintained. The mangrove condition along the Sidoarjo coast was mapped using remote sensing products, namely Landsat 7 ETM+ images acquired during dry months in 2002, 2006, 2009, and 2014. Mangrove density was detected using NDVI, which uses band 3 (the red channel) and band 4 (the near-infrared channel). Image processing to produce NDVI was carried out with ENVI 5.1 software. The NDVI values used for detecting mangrove density range from 0 to 1. The mangrove ecosystem has shown a significant year-on-year increase in both area and density. Its growth is influenced by the deposition of Lapindo mud material at the Porong and Brantas river estuaries, where the silt provides a suitable growing medium for the expanding mangrove ecosystem. The increase in density is also supported by public awareness: mangrove planting is carried out around the ponds to help contain the heavy metals in the Lapindo mud material.
Keywords: archipelagic nation, mangrove, Lapindo mudflow disaster, NDVI
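The NDVI computation itself reduces to a band ratio of the Landsat 7 ETM+ red (band 3) and near-infrared (band 4) channels; the NumPy sketch below mirrors what the ENVI 5.1 workflow produces (the toy arrays and the density class breaks are assumptions for illustration, not the study's thresholds).

```python
import numpy as np

def ndvi(red, nir):
    """NDVI = (NIR - Red) / (NIR + Red), computed on valid pixels only."""
    red = red.astype("float32")
    nir = nir.astype("float32")
    denom = nir + red
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Toy 2x2 reflectance arrays standing in for ETM+ band 3 (red) and band 4 (NIR).
red = np.array([[0.08, 0.10], [0.05, 0.30]])
nir = np.array([[0.40, 0.35], [0.45, 0.32]])
v = ndvi(red, nir)

# Illustrative density classes for mangrove mapping (break values are assumed).
density_class = np.digitize(v, bins=[0.2, 0.4, 0.6])   # 0 = sparse ... 3 = dense
```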
Procedia PDF Downloads 437
1031 Lamb Waves Wireless Communication in Healthy Plates Using Coherent Demodulation
Authors: Rudy Bahouth, Farouk Benmeddour, Emmanuel Moulin, Jamal Assaad
Abstract:
Guided ultrasonic waves are used in Non-Destructive Testing (NDT) and Structural Health Monitoring (SHM) for inspection and damage detection. Recently, wireless data transmission using ultrasonic waves in solid metallic channels has gained popularity in industrial applications such as nuclear, aerospace, and smart vehicles. The idea is to find a good substitute for electromagnetic waves, since they are highly attenuated near metallic components due to Faraday shielding. The proposed solution is to use guided ultrasonic waves such as Lamb waves as the information carrier, owing to their capability to propagate over long distances. In addition, valuable information about the health of the structure can be extracted simultaneously. In this work, the frequency bandwidth reliable for communication is first extracted experimentally from dispersion curves. An experimental platform for wireless communication using Lamb waves is then described and built. After this, the coherent demodulation algorithm used in telecommunications is tested for the Amplitude Shift Keying, On-Off Keying, and Binary Phase Shift Keying modulation techniques. Signal processing parameters such as threshold choice, number of cycles per bit, and bit rate are optimized. Experimental results are compared based on the average Bit Error Rate. The results show high sensitivity to threshold selection for the Amplitude Shift Keying and On-Off Keying techniques, resulting in a decrease in bit rate. The Binary Phase Shift Keying technique shows the highest stability and data rate among all tested modulation techniques.
Keywords: lamb waves communication, wireless communication, coherent demodulation, bit error rate
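To make the coherent-demodulation step concrete, the NumPy sketch below modulates a random bit stream with BPSK on a sinusoidal carrier, correlates each bit interval with a local reference, and counts the bit error rate; the carrier frequency, cycles per bit, and noise level are illustrative, not the Lamb-wave experiment's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
fc, fs, cycles_per_bit, n_bits = 200e3, 4e6, 20, 500   # illustrative values
spb = int(fs / fc * cycles_per_bit)                    # samples per bit
t = np.arange(spb) / fs
carrier = np.sin(2 * np.pi * fc * t)

bits = rng.integers(0, 2, n_bits)
symbols = 2 * bits - 1                                 # BPSK: 0 -> -1, 1 -> +1
tx = np.concatenate([s * carrier for s in symbols])
rx = tx + 0.8 * rng.standard_normal(tx.size)           # additive-noise channel

# Coherent demodulation: correlate each bit interval with the reference carrier.
rx_bits = np.array([
    1 if np.dot(rx[i * spb:(i + 1) * spb], carrier) > 0 else 0
    for i in range(n_bits)
])
ber = np.mean(rx_bits != bits)
print(f"BER = {ber:.3f}")
```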
Procedia PDF Downloads 259
1030 Corrosion Analysis and Interfacial Characterization of Al – Steel Metal Inert Gas Weld - Braze Dissimilar Joints by Micro Area X-Ray Diffraction Technique
Authors: S. S. Sravanthi, Swati Ghosh Acharyya
Abstract:
Automotive lightweighting is of major importance at present owing to its contribution to improved fuel economy and reduced environmental pollution. Various arc welding technologies are being employed in the production of automobile components with reduced weight. The present study is of practical importance since it involves the preferential substitution of zinc-coated mild steel with a lightweight alloy such as 6061 aluminium by means of the Gas Metal Arc Welding (GMAW) brazing technique at different processing parameters. The fabricated joints, however, showed the formation of an Al-Fe layer at the interfacial regions, which was confirmed by scanning electron microscopy and energy dispersive spectroscopy. These Al-Fe compounds not only affect the mechanical strength but also markedly deteriorate the corrosion resistance of the joints. Hence, it is essential to understand the phases formed in this layer and their crystal structure; the micro-area X-ray diffraction technique has been used exclusively for this purpose. Moreover, crevice corrosion at the joint interfaces was analysed by exposing the joints to 5 wt.% FeCl3 solution at regular time intervals as per ASTM G 48-03. The joints showed decreasing crevice corrosion resistance with increasing heat intensity. The inner surfaces of the welds showed severe oxide cracking and a remarkable weight loss when exposed to concentrated FeCl3, and the weight loss increased with decreasing filler wire feed rate and increasing heat intensity.
Keywords: automobiles, welding, corrosion, lap joints, Micro XRD
Procedia PDF Downloads 122
1029 A Phenomenological Approach to Computational Modeling of Analogy
Authors: José Eduardo García-Mendiola
Abstract:
In this work, a phenomenological approach to the computational modeling of analogy processing is carried out. The paper considers the structure of analogy, based on the possibility of grounding the genesis of its elements in Husserl's genetic theory of association. Among the particular processes that take place in order to arrive at analogical inferences, one proves crucial for enabling efficient retrieval of base cases from long-term memory, namely analogical transference grounded in familiarity. In general, it has been argued that analogical reasoning is a way by which a conscious agent tries to determine or define a certain scope of objects and the relationships between them, using previous knowledge of another, familiar domain of objects and relations. However, if a complete description of the analogy process is sought, a deeper consideration of a phenomenological nature is required insofar as its simulation by computational programs is the aim. One would also gain an idea of how complex it would be to give a fully computational account of the elements of analogy. In fact, familiarity is not the result of a mere chain of repetitions of objects or events; it is generated insofar as the object, attribute, or event in question can be integrated into a certain context that takes shape as functionalities and functional approaches or perspectives on the object are being defined. Its familiarity is not generated by the identification of its parts or objective determinations as if these were isolated from those functionalities and approaches. Rather, at the core of such familiarity between entities of different kinds lies the way they are functionally encoded. Hoping to make deeper inroads into these topics, this essay considers how cognitive-computational perspectives can envisage, from the phenomenological projection of the analogy process, a review of achievements already obtained as well as the exploration of new theoretical-experimental configurations for implementing analogy models in special-purpose as well as general-purpose machines.
Keywords: analogy, association, encoding, retrieval
Procedia PDF Downloads 121
1028 A Real-Time Moving Object Detection and Tracking Scheme and Its Implementation for Video Surveillance System
Authors: Mulugeta K. Tefera, Xiaolong Yang, Jian Liu
Abstract:
Detection and tracking of moving objects are very important in many application contexts, such as the detection and recognition of people, visual surveillance, and the automatic generation of video effects. However, detecting the true shape of an object in motion is tricky due to challenges such as dynamic scene changes, the presence of shadows, and illumination variations caused by light switching. Once the moving object is detected, tracking is also a crucial step for applications in military defense, video surveillance, human-computer interaction, and medical diagnostics, as well as in commercial fields such as video games. In this paper, an object present in a dynamic background is detected using adaptive mixture-of-Gaussians analysis of the video sequences. The detected moving object is then tracked using region-based moving object tracking and inter-frame differencing mechanisms to address partial overlapping and occlusion problems. First, the detection algorithm effectively detects and extracts the moving object target through enhancement and post-processing morphological operations. Second, the extracted object is tracked using region-based tracking and inter-frame differences to improve the tracking speed of real-time moving objects across video frames. Finally, a plotting method is applied to detect the moving objects effectively and describe the motion of the object being tracked. The experiments were performed on image sequences acquired in both indoor and outdoor environments, using one stationary camera and one web camera.
Keywords: background modeling, Gaussian mixture model, inter-frame difference, object detection and tracking, video surveillance
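A compact OpenCV sketch of the detection stage described above, adaptive Gaussian-mixture background subtraction followed by morphological clean-up and bounding boxes, is given below; the video path, kernel size, and area threshold are assumptions, and the paper's region-based and inter-frame tracking logic is not reproduced.

```python
import cv2

cap = cv2.VideoCapture("surveillance.avi")          # placeholder video file
mog = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = mog.apply(frame)                                       # adaptive GMM foreground mask
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]    # drop shadow pixels
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)         # morphological post-processing
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 400:                              # ignore small blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(30) & 0xFF == 27:                              # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```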
Procedia PDF Downloads 475
1027 Physicochemical Properties and Thermal Inactivation of Polyphenol Oxidase of African Bush Mango (Irvingia Gabonensis) Fruit
Authors: Catherine Joke Adeseko
Abstract:
Enzymatic browning is an economically important disorder that degrades organoleptic properties and deters consumers from purchasing fresh fruit and vegetables. Preventing and controlling enzymatic browning in fruit and its products is therefore imperative. This study investigated the catalytic effect of polyphenol oxidase (PPO) in the adverse browning of African bush mango (Irvingia gabonensis) fruit peel and pulp. PPO was isolated and purified, and its physicochemical properties, such as the effect of pH with SDS, the effect of temperature, and its thermodynamic behaviour, were evaluated, culminating in thermal inactivation of the purified PPO at 80 °C. The pH and temperature optima of PPO were found to be 7.0 and 50 °C, respectively. PPO activity increased gradually with increasing pH; the enzyme exhibited its highest activity at neutral pH 7.0, while inhibition was observed in the acidic region at pH 2.0. The presence of SDS at pH 5.0 and below was found to inhibit the activity of PPO from the peel and pulp of I. gabonensis. The average values of enthalpy (ΔH), entropy (ΔS), and Gibbs free energy (ΔG) obtained at 20 min of incubation over 30-80 °C were, respectively, 39.93 kJ.mol-1, 431.57 J.mol-1.K-1, and -107.99 kJ.mol-1 for peel PPO, and 37.92 kJ.mol-1, -442.51 J.mol-1.K-1, and -107.22 kJ.mol-1 for pulp PPO. Thermal inactivation of I. gabonensis PPO, assayed with catechol, showed a reduction in catalytic activity as the temperature and duration of heat treatment increased, reflected by an increase in the k value. The half-life (t1/2) of PPO decreased as the incubation temperature increased, owing to the instability of the enzyme at high temperatures, and was higher in pulp than in peel. Both the D and z values decreased with increasing temperature. The information from this study suggests processing parameters for controlling PPO in potential industrial applications of I. gabonensis fruit, in order to prolong the shelf-life of this fruit for maximum utilization.
Keywords: enzymatic, browning, characterization, activity
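The kinetic and thermodynamic quantities reported above (k, t1/2, D, z, ΔH, ΔS, ΔG) follow from standard first-order inactivation and Eyring relations; the sketch below shows those textbook formulas with placeholder rate constants and D values, not the paper's measured data.

```python
import math
import numpy as np

R  = 8.314            # gas constant, J mol^-1 K^-1
KB = 1.380649e-23     # Boltzmann constant, J K^-1
H  = 6.62607015e-34   # Planck constant, J s

def inactivation_params(k_per_min, temp_c, ea_j_mol):
    """Half-life, D-value and Eyring thermodynamics for first-order thermal inactivation."""
    T = temp_c + 273.15
    k_s = k_per_min / 60.0                         # rate constant in s^-1
    t_half = math.log(2) / k_per_min               # min
    d_value = math.log(10) / k_per_min             # min for 90 % activity loss
    dH = ea_j_mol - R * T                          # activation enthalpy, J/mol
    dG = -R * T * math.log(k_s * H / (KB * T))     # Gibbs free energy of activation, J/mol
    dS = (dH - dG) / T                             # activation entropy, J/mol/K
    return t_half, d_value, dH, dG, dS

def z_value(temps_c, d_values_min):
    """z-value from the slope of log10(D) versus temperature."""
    slope = np.polyfit(temps_c, np.log10(d_values_min), 1)[0]
    return -1.0 / slope

print(inactivation_params(k_per_min=0.05, temp_c=70, ea_j_mol=45e3))
print(z_value([60, 70, 80], [80.0, 32.0, 12.0]), "degrees C")
```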
Procedia PDF Downloads 88