Search results for: features comparison
3269 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems and issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
3268 The Two Layers of Food Safety and GMOs in the Hungarian Agricultural Law
Authors: Gergely Horváth
Abstract:
The study presents the complexity of food safety by dividing it into two layers. Beyond the basic layer of requirements, there is a more demanding higher level linked with quality and purity aspects. It would be important to give special prominence to both layers, given that massive illnesses are caused by foods even though they are officially licensed. The study then discusses an exciting safety challenge stemming from the risks of genetically modified organisms (GMOs). Furthermore, it features legal case examples that illustrate how certain liability questions are solved or not yet decided in connection with the production of genetically modified crops. In addition, a special kind of land grabbing, more precisely land grabbing from non-GMO farming systems, can also be noticed as a new phenomenon eroding food sovereignty. Coexistence, the state where organic, conventional, and GM farming systems stand alongside each other, is an unsuitable experiment that cannot be successful for biophysical reasons (such as cross-pollination). Agricultural and environmental lawyers both try to find the optimal solution. Agri-environmental measures are introduced as a special subfield of law that also maintains food safety. The important steps of agri-environmental legislation aim at the protection of natural values and the environmental media and at strengthening food safety as well, in practice the quality of agricultural products intended for human consumption. The major findings of the study focus on searching for the appropriate approach capable of solving the security and safety problems of food production. The most interesting concepts of the Hungarian national and EU food law legislation are analyzed in more detail with descriptive, analytic and comparative methods.
Keywords: food law, food safety, food security, GMO, genetically modified organisms, agri-environmental measures
3267 Remote Sensing through Deep Neural Networks for Satellite Image Classification
Authors: Teja Sai Puligadda
Abstract:
Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity of the work and the time required. Data/images are captured at regular intervals by satellite remote sensing systems, the amount of data collected is often enormous, and it expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image categorization. One of the biggest challenges data scientists face while classifying satellite images is finding the most suitable classification algorithms, among those available, that are able to classify images with the utmost accuracy. In order to categorize satellite images, which is difficult due to the sheer volume of data, many academics are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects the important features without any human supervision, and the ANN algorithm stores information on the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) Airborne dataset for classifying images. Thus, in this project of classifying satellite images, the algorithms ANN and CNN are implemented, evaluated, and compared, and the performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm which gives the lowest bias and lowest variance in solving multi-class satellite image classification is analyzed.
Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss
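A minimal sketch of the kind of CNN classifier described above is given for illustration. It assumes the DeepSat SAT-4 patch format of 28x28 pixels with four bands (R, G, B, NIR) and four land-cover classes; the layer sizes, training settings, and dummy data are assumptions, not the project's actual configuration.

```python
# Minimal sketch of a CNN for SAT-4-style patches (28x28x4, 4 classes).
# Layer sizes and hyperparameters are illustrative assumptions.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn(input_shape=(28, 28, 4), num_classes=4):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Dummy data standing in for SAT-4 patches and one-hot labels.
x_train = np.random.rand(128, 28, 28, 4).astype("float32")
y_train = tf.keras.utils.to_categorical(np.random.randint(0, 4, 128), 4)

model = build_cnn()
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
print(model.evaluate(x_train, y_train, verbose=0))  # [loss, accuracy]
```

The ANN counterpart of the comparison would replace the convolutional layers with fully connected layers acting on flattened patches, which is the contrast the project evaluates through accuracy and loss.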
3266 Comparative Evaluation of Pharmacologically Guided Approaches (PGA) to Determine Maximum Recommended Starting Dose (MRSD) of Monoclonal Antibodies for First Clinical Trial
Authors: Ibraheem Husain, Abul Kalam Najmi, Karishma Chester
Abstract:
First-in-human (FIH) studies are a critical step in the clinical development of any molecule that has shown therapeutic promise in preclinical evaluations, since the translation of preclinical research and safety studies into clinical development is a crucial step for the successful development of monoclonal antibodies for the treatment of human diseases and for guidance in the pharmaceutical industry. Therefore, a comparison between the USFDA approach and nine pharmacologically guided approaches (PGA) (simple allometry, maximum life span potential, brain weight, rule of exponent (ROE), two-species methods and one-species methods) was made to determine the maximum recommended starting dose (MRSD) for first-in-human clinical trials using four drugs, namely Denosumab, Bevacizumab, Anakinra and Omalizumab. In our study, the predicted pharmacokinetic (PK) parameters and the estimated first-in-human doses of the antibodies were compared with the observed human values. The study indicated that the clearance and volume of distribution of antibodies can be predicted with reasonable accuracy in humans, and a good estimate of the first human dose can be obtained from the predicted human clearance and volume of distribution. A pictorial method evaluation chart was also developed based on fold errors for the simultaneous evaluation of the various methods.
Keywords: clinical pharmacology (CPH), clinical research (CRE), clinical trials (CTR), maximum recommended starting dose (MRSD), clearance and volume of distribution
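As a complement, the sketch below illustrates one commonly used pharmacologically guided calculation, a body-weight-based allometric conversion of an animal dose to a human equivalent dose followed by a safety factor, together with the fold-error metric used to compare predicted and observed values. The exponent, safety factor, and example numbers are generic illustrations and are not taken from the paper.

```python
# Illustrative sketch: simple allometric dose conversion and fold error.
# The exponent (0.33), safety factor (10), and example values are assumptions,
# not the parameters or data used in the study.
import math

def human_equivalent_dose(animal_dose_mg_kg, animal_weight_kg,
                          human_weight_kg=60.0, exponent=0.33):
    """Convert an animal dose (mg/kg) to a human equivalent dose (mg/kg)."""
    return animal_dose_mg_kg * (animal_weight_kg / human_weight_kg) ** exponent

def mrsd(hed_mg_kg, safety_factor=10.0):
    """Maximum recommended starting dose after applying a safety factor."""
    return hed_mg_kg / safety_factor

def fold_error(predicted, observed):
    """Fold error used to judge prediction accuracy (1.0 = perfect)."""
    return 10 ** abs(math.log10(predicted / observed))

hed = human_equivalent_dose(animal_dose_mg_kg=5.0, animal_weight_kg=3.0)
print(f"HED ~ {hed:.2f} mg/kg, MRSD ~ {mrsd(hed):.2f} mg/kg")
print(f"Fold error for CL prediction 0.25 vs 0.20 L/h: {fold_error(0.25, 0.20):.2f}")
```

Whichever PGA is chosen, the same fold-error metric lets the predicted clearance, volume of distribution, or starting dose be scored against the observed human value, which is the basis of the pictorial evaluation chart mentioned above.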
3265 Effect of Post and Pre Induced Treatment with Hesperidin in N-Methyl N-Nitrosourea Induced Mammary Gland Cancer in Female Sprague-Dawley Rats
Authors: Vinay Kumar Theendra
Abstract:
The main objective of the study is to evaluate the effectiveness of hesperidin in the treatment of breast cancer while causing less (or no) bone marrow depression, which is the major side effect of the present anticancer drugs used against breast cancer, and also to evaluate the mechanisms through which this compound exerts its effect. Breast cancer was induced by administering N-methyl N-nitrosourea (MNU) at a dose of 50 mg/kg body weight. Upon termination of the experiment, the animals were sacrificed by cervical dislocation. The animals were dissected along the ventral midline and grossly examined for the presence of tumours. The tumours were then removed along with the stroma. Vascular endothelial growth factor (VEGF) levels were estimated using the ELISA method. The first occurrence of palpable tumours was eight weeks after carcinogen treatment, and the final tumour incidence was 100% in the MNU-alone and topically treated rats, whereas the rats of the other treatment groups showed decreased tumour incidence, which might be due to the antitumour activity of the treatments. Hesperidin therapy inhibited angiogenesis, which is evident from the significant reduction in serum as well as tumour VEGF concentrations in comparison to the untreated mammary carcinoma bearing rats. Hesperidin is a promising agent that exerts direct antitumour as well as antiangiogenic, antiproliferative and anti-inflammatory activities. Even though its potency is slightly lower than that of the standard drug vincristine, it has been proved to be safe without affecting the haematological count.
Keywords: hesperidin, VEGF, COX 2, N-methyl N-nitrosourea
3264 Modeling of Combustion Process in the Piston Aircraft Engine Using an ECFM-3Z Model
Authors: Marcin Szlachetka, Konrad Pietrykowski
Abstract:
Modeling of the combustion process in a 9-cylinder aircraft engine is presented. The simulations of the combustion process in the IC engine provided information on the spatial and time distributions of selected quantities within the combustion chamber of the engine. The numerical analysis results were compared with the results of the indication process of the engine on a test stand. Modeling of the combustion process of an auto-ignited IC engine in AVL Fire was carried out within the study. For the calculations, an ECFM-3Z model was used. Verification of the simulation results was carried out by comparison of the pressure in the cylinder. The courses of indicated pressure obtained from the simulations and during tests of the engine mounted on a test stand were compared. The engine was braked by the propeller, which results in adequate external power characteristics. The test object is a modified ASz-62IR engine with an injection system. The engine was running at take-off power. To check the optimum ignition timing with regard to power, calculations and tests were performed for 7 different moments of ignition. Analyses of the temperature distribution in the cylinder depending on the moment of ignition were carried out. Additionally, the course of pressure in the cylinder at different angles of ignition delay of the second spark plug was examined. The swirling of the mixture in the combustion chamber was also analysed. It has been shown that the largest vortexes occur in the middle of the chamber and get smaller closer to the combustion chamber walls. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under Grant Agreement No. INNOLOT/I/1/NCBR/2013.
Keywords: CFD, combustion, internal combustion engine, aircraft engine
3263 Modeling Local Warming Trend: An Application of Remote Sensing Technique
Authors: Khan R. Rahaman, Quazi K. Hassan
Abstract:
Global changes in climate, environment, economies, populations, governments, institutions, and cultures converge in localities. Changes at a local scale, in turn, contribute to global changes as well as being affected by them. Our hypothesis is built on the consideration that temperature does vary at the local level (termed local warming) in comparison to the models predicted at the regional and/or global scale. To date, the bulk of the research relating local places to global climate change has been top-down, from the global toward the local, concentrating on methods of impact analysis that use as a starting point climate change scenarios derived from global models, even though these have little regional or local specificity. Thus, our focus is to understand such trends over southern Alberta, which will enable decision makers, scientists, the research community, and local people to adapt their policies based on local-level temperature variations and to act accordingly. The specific objectives of this study are: (i) to understand the local warming (temperature in particular) trend in the context of temperature normals during the period 1961-2010 at point locations using meteorological data; (ii) to validate the data by using specific yearly data; and (iii) to delineate the spatial extent of the local warming trends and to understand the influential factors so that local governments can adapt to the situation. Existing data have brought evidence of such changes, and future research emphasis will be given to validating this hypothesis based on remotely sensed data (i.e., the MODIS product by NASA).
Keywords: local warming, climate change, urban area, Alberta, Canada
3262 Comparative Study of the Effect of Three Fungicides (Tilt, Artea and Amistarxtra) on the Growth of Hard and Soft Wheat and Their Impact on Grain Yield and Its Components in the Semi-Arid Zone of Setif
Authors: Cheniti Khalissa, Dekhili Mohamed
Abstract:
Several fungal diseases may infect hard and soft wheat, which directly affects the yield and thus the economy of the country. A fungicide treatment is therefore one of the means of disease control. In this context, we studied two varieties of wheat, Waha for soft wheat and Hidhab for hard wheat, at the Technical Institute of Field Crops (ITGC) in the wilaya of Setif under semi-arid conditions. The study consists of the successive application of three fungicides (Tilt, Artea and Amistarxtra) according to three treatments (T1, T2 and T3), in addition to the control (T0), at different stages of plant development (respectively stem elongation, earing and after flowering), with the purpose of testing and determining the effectiveness of these products used sequentially. The study showed good efficacy when the sum of these fungicides was used. The comparison between the different treatments indicates that the T3 treatment reduced yield losses significantly, which is evident in the main yield components such as fertility, grain yield and the weight of 1000 grains. The various components of yield and the final yield are all parameters to be taken into account in such a study. In general, the fungicide treatment is an effective way of improving profitability, and positioning the interventions in time is one of the requirements for appreciable efficiency.
Keywords: hard wheat, soft wheat, diseases, fungicide treatment, fertility, 1000-grain weight, semi-arid zone
3261 Aerodynamic Investigation of Baseline-IV Bird-Inspired BWB Aircraft Design: Improvements over Baseline-III BWB
Authors: C. M. Nur Syazwani, M. K. Ahmad Imran, Rizal E. M. Nasir
Abstract:
The study of the BWB UV at UiTM began in 2005, and three designs have been studied and published. The latest design, Baseline-III, is inspired by birds and has the features and aerodynamic behaviour of cruising birds without flapping capability. The aircraft's planform and configuration are similar to those of the bird. Baseline-III has major flaws, particularly its low lift-to-drag ratio, its stability and issues regarding limited controllability. A new design, known as Baseline-IV, replaces the straight, swept wing with a delta wing and has a broader tail compared to Baseline-III's. The objective of the study is to investigate the aerodynamics of the Baseline-IV bird-inspired BWB aircraft. This is achieved by theoretical calculation and wind tunnel experiments. The results show that the theoretical and wind tunnel curves of CL and CD versus alpha for Baseline-IV are quite similar to each other in terms of the pattern of the slopes and the values. Baseline-IV has higher lift coefficient values over a wide range of angles of attack compared to Baseline-III. Baseline-IV also has a higher maximum lift coefficient, a higher maximum lift-to-drag ratio and lower parasite drag. It has a stable pitch moment versus lift slope but a negative moment at zero lift for a zero angle-of-attack tail setting. At high angles of attack, Baseline-IV does not show the stability reversal seen in Baseline-III. Baseline-IV is proven to have improvements over Baseline-III in terms of lift, lift-to-drag ratio and pitch moment stability at high angles of attack.
Keywords: blended wing-body, bird-inspired blended wing-body, aerodynamic, stability
3260 Comparison of Corneal Curvature Measurements Conducted with Tomey AO-2000® and the Current Standard Biometer IOL Master®
Authors: Mohd Radzi Hilmi, Khairidzan Mohd Kamal, Che Azemin Mohd Zulfaezal, Ariffin Azrin Esmady
Abstract:
Purpose: Corneal curvature (CC) is an important anterior segment parameter. This study compared CC measurements conducted with two optical devices in phakic eyes. Methods: Sixty phakic eyes of 30 patients were enrolled in this study. CC was measured three times with the optical biometer and topography-keratometer Tomey AO-2000 (Tomey Corporation, Nagoya, Japan) and then with the standard partial coherence interferometry (PCI) biometer IOL Master (Carl Zeiss Meditec, Dublin, CA), and the data were statistically analysed. Results: The measurements resulted in a mean CC of 43.86 ± 1.57 D with the Tomey AO-2000 and 43.84 ± 1.55 D with the IOL Master. The distribution of the data was normal, and no significant difference in CC values was detected (P = 0.952) between the two devices. The correlation between CC measurements was highly significant (r = 0.99; P < 0.0001). The mean difference in CC values between the devices was 0.017 D, and the 95% limits of agreement were -0.088 to 0.12 D. The duration of measurement with the standard biometer IOL Master was longer (55.17 ± 2.24 seconds) than with the Tomey AO-2000 (39.88 ± 2.38 seconds) in automatic mode. The duration of measurement with the Tomey AO-2000 in manual mode was the shortest (28.57 ± 2.71 seconds). Conclusion: In phakic eyes, CC measured with the Tomey AO-2000 and the IOL Master showed similar values, and a high correlation was observed between the two devices. This shows that both devices can be used interchangeably. The Tomey AO-2000 is faster to operate and has its own topography system.
Keywords: corneal topography, corneal curvature, IOL Master, Tomey AO2000
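The agreement statistics reported above (mean difference, 95% limits of agreement, correlation, paired comparison) can be reproduced from paired keratometry readings with a short script. The sketch below uses made-up corneal curvature values purely to show the calculation; it is not the study data.

```python
# Sketch of the agreement analysis between two keratometers.
# The CC values below are made up for illustration only.
import numpy as np
from scipy import stats

tomey = np.array([43.1, 44.2, 42.8, 45.0, 43.6, 44.9])   # D, device A
iolm  = np.array([43.0, 44.3, 42.9, 44.9, 43.7, 44.8])   # D, device B

diff = tomey - iolm
mean_diff = diff.mean()
loa = (mean_diff - 1.96 * diff.std(ddof=1),
       mean_diff + 1.96 * diff.std(ddof=1))               # 95% limits of agreement

r, p_corr = stats.pearsonr(tomey, iolm)                    # correlation between devices
t, p_paired = stats.ttest_rel(tomey, iolm)                 # paired difference test

print(f"mean difference = {mean_diff:.3f} D, LoA = {loa[0]:.3f} to {loa[1]:.3f} D")
print(f"Pearson r = {r:.3f} (p = {p_corr:.4f}); paired t-test p = {p_paired:.3f}")
```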
3259 Agile Supply Chains and Their Dependency on Air Transport Mode: A Case Study in the Amazon
Authors: Fabiana Lucena Oliveira, Aristides da Rocha Oliveira Junior
Abstract:
This article discusses the dependence of agile supply chains on the air transport mode. Agile supply chains are the result of the analysis of the supply chain uncertainty model, which ranks supply chains according to the respective product. Thus, understanding the uncertainty model and the life cycle of products considered standard or innovative is critical to understanding these chains, as is the intersection of the supply chains arising from the uncertainty model with their most appropriate transport mode. The variables availability, security and freight are considered here as determinants for choosing these modes. Therefore, the research problem is: how do agile supply chains maintain logistics competitiveness, given that they are dependent on the air transport mode? A case study was carried out in the Manaus Industrial Pole (MIP), an agglomeration model located in the Brazilian Amazon that includes six hundred industries from different backgrounds and billings. The sample of companies surveyed includes those companies whose products are classified in agile supply chains as innovative and that therefore live with the variable of uncertainty in the demand for inputs or the supply of finished products. The results confirm the hypothesis that the level of dependency on the air transport mode is greater than fifty percent. It follows that maintaining an agile supply chain far away from the supplier base is expensive (1), that the continuity analysis needs to be redone every twenty-four months (2), that additional freight, handling and storage must be considered as part of the logistics costs (3), and that comparison with upcoming agile supply chains around the world needs to consider the location effect (4).
Keywords: uncertainty model, air transport mode, competitiveness, logistics
3258 Sphingosomes: Potential Anti-Cancer Vectors for the Delivery of Doxorubicin
Authors: Brajesh Tiwari, Yuvraj Dangi, Abhishek Jain, Ashok Jain
Abstract:
The purpose of the investigation was to evaluate the potential of sphingosomes as nanoscale drug delivery units for the site-specific delivery of anti-cancer agents. Doxorubicin hydrochloride (DOX) was selected as a model anti-cancer agent. Sphingosomes were prepared, loaded with DOX, and optimized for size and drug loading. The formulations were characterized by Malvern zetasizer and transmission electron microscopy (TEM) studies. The sphingosomal formulations were further evaluated in an in-vitro drug release study under various pH profiles. The in-vitro drug release study showed an initial rapid release of the drug followed by a slow, controlled release. In-vivo studies of the optimized formulations and the free drug were performed on albino rats for comparison of drug plasma concentrations. The in-vivo study revealed that the prepared system gave DOX an enhanced circulation time, a longer half-life and slower elimination kinetics compared to the free drug. Further, it can be interpreted that the formulation would selectively enter the highly porous mass of tumor cells and at the same time spare normal tissues. To summarize, the use of sphingosomes as carriers of anti-cancer drugs may prove to be a fascinating approach that selectively localizes in the tumor mass, increasing the therapeutic margin of safety while reducing the side effects associated with anti-cancer agents.
Keywords: sphingosomes, anti-cancer, doxorubicin, formulation
3257 Preserving Urban Cultural Heritage with Deep Learning: Color Planning for Japanese Merchant Towns
Authors: Dongqi Li, Yunjia Huang, Tomo Inoue, Kohei Inoue
Abstract:
With urbanization, urban cultural heritage is facing the impact and destruction of modernization. Many historical areas are losing their historical information and regional cultural characteristics, so it is necessary to carry out systematic color planning for historical areas in conservation. As an early adopter of urban color planning, Japan has a systematic approach to it. Hence, this paper selects five merchant towns from the category of important traditional building preservation areas in Japan as the subject of this study to explore the color structure and emotion of this type of historic area. First, an image semantic segmentation method identifies the buildings, roads, and landscape environments. Their color data were extracted for color composition and emotion analysis to summarize their common features. Second, keywords were extracted from the collected Internet evaluations by natural language processing. The correlation analysis of the color structure and keywords provides a valuable reference for conservation decisions for this type of historic town area. This paper also combines the color structure and Internet evaluation results with generative adversarial networks to generate predicted images of color structure improvements and color improvement schemes. The methods and conclusions of this paper can provide new ideas for the digital management of environmental colors in historic districts and a valuable reference for the inheritance of local traditional culture.
Keywords: historic districts, color planning, semantic segmentation, natural language processing
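One step of the pipeline, deriving a color structure from the pixels that the semantic segmentation assigns to buildings, can be sketched as a simple k-means clustering of masked pixels. The image, mask, and number of clusters below are hypothetical placeholders; the paper's actual segmentation model and color-emotion analysis are not specified here.

```python
# Sketch: dominant-color extraction from pixels labelled "building" by a
# segmentation mask. Image, mask, and k are hypothetical placeholders.
import numpy as np
from sklearn.cluster import KMeans

rgb_image = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
building_mask = np.zeros((256, 256), dtype=bool)
building_mask[64:192, 64:192] = True          # stand-in for a segmentation result

pixels = rgb_image[building_mask].reshape(-1, 3).astype(float)

kmeans = KMeans(n_clusters=5, n_init=10, random_state=0).fit(pixels)
counts = np.bincount(kmeans.labels_, minlength=5)

for center, count in sorted(zip(kmeans.cluster_centers_, counts),
                            key=lambda pair: -pair[1]):
    share = count / counts.sum()
    print(f"RGB {center.round().astype(int)}: {share:.1%} of building pixels")
```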
3256 The Influence of Migration on Migrants' Culture: A Study on Egyptian Nubians' Migration to Investigate Culture Changes
Authors: Tarek Hassan, Sanaa Abouras
Abstract:
Factors such as the interaction of the migration process and cultural identity have an impact in a way that may lead to cultural disinheritance. Even if migrants' culture is not lost, it may be affected by the culture of the new society. Therefore, it is anticipated that the migration of an ethnic group would impact the culture of that group. The Nubians, an ethnic group originating in southern Egypt, experienced a migration that took place in the sixties of the past century. The Nubians were forced to leave their land of origin and relocate to Kom Ombo, an Egyptian town to the north of Aswan. The effect of migration on national culture, social homogeneity or the interest in social contact influences the attitudes of natives towards migration. Hence, it is very important for societies to help migrants adapt to the new culture and at the same time not to impede migrants' efforts to maintain their own culture. This study aims to investigate the effect of internal migration on the culture of Egyptian Nubians in order to predict whether the Nubians can maintain their own culture after the migration. Research question: what is the cultural influence of the Nubians' migration from Egyptian Nubia to their new destinations? The researchers' hypothesis is that there is a mutual influence between the two cultures of Nubians and non-Nubians in Egypt. The results supported the researchers' hypothesis, as they observed that the Nubians managed to preserve a balance between the maintenance of their own culture and the adoption of some cultural features of the community of their new destination(s). The study also examined why the Nubians adhere to their culture although they left their land forever. A questionnaire and interviews were used to collect data from 80 informants: 40 Nubians and 40 non-Nubians in Kom Ombo and the two cities of Cairo and Alexandria. The results suggest that there is an obvious mutual cultural impact between Nubians and non-Nubians. The findings of this study should encourage the researchers to conduct further research on minorities for a deeper understanding of the impact of and on the culture of minorities.
Keywords: culture change, culture influence, culture maintenance, minority migration
3255 Adsorption of Dyes and Iodine: Reaching Outstanding Kinetics with CuII-Based Metal–Organic Nanoballs
Authors: Eder Amayuelas, Begoña Bazán, M. Karmele Urtiaga, Gotzone Barandika, María I. Arriortua
Abstract:
Metal-organic frameworks (MOFs) have attracted great interest in recent years, taking a lead role in the fields of catalysis, drug delivery, sensors and adsorption. In the past decade, promising results have been reported specifically in the field of adsorption, based on the topology and chemical features of this type of porous material. Thus, their application in industry and the environment for the adsorption of pollutants is presented as a response to an increasingly important need. In this area, organic dyes are nowadays widely used in many industries, including medicine, textiles, leather, printing and plastics. The consequence of this fact is that dyes are present as emerging pollutants in soils and water, where they remain for long periods of time due to their high stability, with a potential risk of toxicity for wildlife and humans. On the other hand, the presence of iodine in soils, water and gas as a pollutant product of nuclear activity, or its extended use as a germicide, is still a problem in many countries, which indicates the imperative need for its removal. In this context, this work presents the characterization as an adsorbent of the activated compound αMOP@Ei2-1 obtained from the already reported [Cu₂₄(m-BDC)₂₄(DMF)₂₀(H₂O)₄]•24DMF•40H₂O (MOP@Ei2-1), where m-BDC is the 1,3-benzenedicarboxylate ligand and DMF is N,N′-dimethylformamide. The structure of MOP@Ei2-1 consists of Cu24 clusters arranged in such a way that 12 paddle-wheels are connected through m-BDC ligands. The clusters exhibit an internal cavity where crystallization molecules of DMF and water are located. Adsorption of dyes and iodine as examples of pollutants has been carried out, focusing attention on the kinetics of the rapid process.
Keywords: adsorption, organic dyes, iodine, metal organic frameworks
3254 Determination of the Quantity of Water Absorbed by the Plant When Irrigating by Infiltration in Arid Regions (Case of Ouargla in Algeria)
Authors: Mehdi Benlarbi, Dalila Oulhaci
Abstract:
Several physical, human and economic factors come into play in the choice of an irrigation system for developing arid and semi-arid regions. Since it is impossible to define or quantitatively weight all the relevant factors in each case, the choice of the system is often based on subjective preferences rather than explicit analysis. Over the past decade, irrational irrigation in the Ouargla region has developed to a certain extent, based largely on water wastage, and it may pose risks to the environment both off-site and on-site. In the whole region, the environment is damaged by excess water, because the water tables, which tend to be high, form swamps that pollute nature at the surface. The purpose of our work is a comparison between sprinkler irrigation and drip irrigation using bottles. By irrigating with the aid of the bottle and giving a volume of 4 liters at a flow rate of one (1) liter per hour, the watering dose received varies between 6 and 7 mm without infiltration losses. In the case of sprinkler irrigation, the dose received may not exceed 2.5 mm, and in some cases a quantity of water is lost by infiltration. This shows that irrigation using the bottle is much more efficient than sprinkling because, on the one hand, a large amount of water is absorbed by the plant and, on the other hand, there is no loss by infiltration. The results obtained are very significant because, on the one hand, we reuse local products and, on the other hand, as the bottles are buried, we avoid water losses by evaporation, especially in dry periods, as well as salinization.
Keywords: resources, water, arid, evaporation, infiltration
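The relation between the applied volume and the watering dose in millimetres follows from the fact that 1 litre spread over 1 m² equals a depth of 1 mm. The short sketch below works this backwards from the figures given in the abstract; the wetted area is therefore an inferred quantity, not one reported by the authors.

```python
# Dose (mm) = volume (L) / wetted area (m^2), since 1 L/m^2 = 1 mm of water.
# The wetted area below is inferred from the reported numbers, not measured.
volume_l = 4.0                          # volume delivered per bottle (from the abstract)
dose_mm_low, dose_mm_high = 6.0, 7.0    # dose range reported in the abstract

area_high = volume_l / dose_mm_low      # implied wetted area for a 6 mm dose
area_low = volume_l / dose_mm_high      # implied wetted area for a 7 mm dose
print(f"Implied wetted area: {area_low:.2f}-{area_high:.2f} m^2 per plant")

# Forward check: 4 L over roughly 0.6 m^2 gives the reported dose range.
print(f"4 L over 0.60 m^2 -> {volume_l / 0.60:.1f} mm")
```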
3253 Approach for Updating a Digital Factory Model by Photogrammetry
Authors: R. Hellmuth, F. Wehner
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and the building of a factory have changed in recent years. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of product and production technology as well as a VUCA world (Volatility, Uncertainty, Complexity & Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Short-term rescheduling can no longer be handled by on-site inspections and manual measurements. The tight time schedules require up-to-date planning models. Due to the high adaptation rate of factories described above, a methodology for rescheduling factories on the basis of a modern digital factory twin is conceived and designed for practical application in factory restructuring projects. The focus is on rebuild processes. The aim is to keep the planning basis (digital factory model) for conversions within a factory up to date. This requires the application of a methodology that reduces the deficits of existing approaches. The aim is to show how a digital factory model can be kept up to date during ongoing factory operation. A method based on photogrammetry technology is presented. The focus is on developing a simple and cost-effective solution to track the many changes that occur in a factory building during operation. The method is preceded by a hardware and software comparison to identify the most economical and fastest variant.
Keywords: digital factory model, photogrammetry, factory planning, restructuring
3252 An Overview of Informal Settlement Upgrading Strategies in Kabul City and the Need for an Integrated Multi-Sector Upgrading Model
Authors: Bashir Ahmad Amiri, Nsenda Lukumwena
Abstract:
Developing economies are experiencing an unprecedented rate of urbanization, mainly the urbanization of poverty, which is leading to the sprawl of slums and informal settlements. Kabul, the capital and primate city of Afghanistan, is heavily affected by informal settlement, with the majority of its people considered to live informally. Despite all efforts to upgrade these settlements and minimize their growth, they are growing rapidly. Various interventions have been undertaken by the government and some international organizations, from physical upgrading to urban renewal, but none of them has succeeded in solving the issue of informal settlement. The magnitude of the urbanization, the complexity of informal settlement in Kabul city, and the institutional and capital constraints of the government call for the integration and optimization of currently practiced strategies. This paper provides an overview of informal settlement formation and the conventional upgrading strategies in Kabul city in order to identify the dominant/successful practices and rationalize the conventional upgrading modes. For this purpose, Hothkhel has been selected as the case study, since it represents the situation of the major informal settlements of the city. Considering the existing potential and features of Hothkhel and the land use proposed by the master plan, this paper intends to find a suitable upgrading mode for the study area and, finally, to scale up the model for city-level upgrading. The results highlight that the informal settlements of Kabul city have a high (re)development capacity for accepting additional rooms without converting the available agricultural area to built-up land. The results also indicate that integrated multi-sector upgrading has the scale-up potential to increase the reach of beneficiaries and to ensure inclusive and efficient urbanization.
Keywords: informal settlement, upgrading strategies, Kabul city, urban expansion, integrated multi-sector, scale-up
3251 The Study of Rapid Entire Body Assessment and Quick Exposure Check Correlation in an Engine Oil Company
Authors: Mohammadreza Ashouria, Majid Motamedzadeb
Abstract:
Rapid Entire Body Assessment (REBA) and Quick Exposure Check (QEC) are two general methods for assessing the risk factors of work-related musculoskeletal disorders (WMSDs). This study aimed to compare the ergonomic risk assessment outputs from QEC and REBA in terms of agreement in the distribution of postural loading scores based on the analysis of working postures. This cross-sectional study was conducted in an engine oil company in which 40 jobs were studied. A trained occupational health practitioner observed all jobs. Job information was collected to ensure the completion of the ergonomic risk assessment tools, including QEC and REBA. The results revealed that there was a significant correlation between the final scores (r = 0.731) and the action levels (r = 0.893) of the two applied methods. Comparison between the action levels and final scores of the two methods showed that there was no significant difference among working departments. Most of the studied postures acquired low and moderate risk levels in the QEC assessment (low risk = 20%, moderate risk = 50% and high risk = 30%) and in the REBA assessment (low risk = 15%, moderate risk = 60% and high risk = 25%). There is a significant correlation between the two methods. They have a strong correlation in identifying risky jobs and determining the potential risk for the incidence of WMSDs. Therefore, it is possible for researchers to apply both methods interchangeably for postural risk assessment in appropriate working environments.
Keywords: observational method, QEC, REBA, musculoskeletal disorders
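The correlation figures quoted above can be computed from the paired REBA and QEC outputs of the 40 jobs with a few lines of code. The score arrays and action-level cut-offs below are placeholders; Spearman's rank correlation is shown alongside Pearson's because the action levels are ordinal, and the abstract does not state which coefficient was used, so treat the choice as an illustrative assumption.

```python
# Sketch of the REBA vs. QEC correlation analysis. Scores and cut-offs are placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
reba_final = rng.integers(2, 13, size=40)                 # stand-in REBA final scores
qec_final = reba_final * 4 + rng.integers(-6, 7, 40)      # stand-in QEC exposure scores

reba_action = np.digitize(reba_final, bins=[4, 8, 11])    # stand-in action levels
qec_action = np.digitize(qec_final, bins=[20, 40, 50])

r_final, p_final = stats.pearsonr(reba_final, qec_final)
rho_action, p_action = stats.spearmanr(reba_action, qec_action)

print(f"final scores:  Pearson r = {r_final:.3f} (p = {p_final:.4f})")
print(f"action levels: Spearman rho = {rho_action:.3f} (p = {p_action:.4f})")
```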
3250 3D Numerical Study of Tsunami Loading and Inundation in a Model Urban Area
Authors: A. Bahmanpour, I. Eames, C. Klettner, A. Dimakopoulos
Abstract:
We develop a new set of diagnostic tools to analyze inundation into a model district using three-dimensional CFD simulations, with a view to generating a database against which to test simpler models. A three-dimensional model of Oregon city with different-sized groups of buildings next to the coastline is used to run calculations of the movement of a long-period wave on the shore. The initial and boundary conditions of the off-shore water are set using a nonlinear inverse method based on Eulerian spatial information matching experimental Eulerian time series measurements of water height. The water movement is followed in time, and this enables the pressure distribution on every surface of each building to be followed in a temporal manner. The three-dimensional numerical data set is validated against published experimental work. In the first instance, we use the dataset as a basis to understand the success of reduced models - including the 2D shallow water model and reduced 1D models - in predicting water heights, flow velocity and forces. This is because models based on the shallow water equations are known to underestimate drag forces after the initial surge of water. The second component is to identify critical flow features, such as hydraulic jumps and choked states, which are flow regions where dissipation occurs and drag forces are large. Finally, we describe how future tsunami inundation models should be modified to account for the complex effects of buildings through drag and blocking. Financial support from UCL and HR Wallingford is greatly appreciated. The authors would like to thank Professor Daniel Cox and Dr. Hyoungsu Park for providing the data on the Seaside, Oregon experiment.
Keywords: computational fluid dynamics, extreme events, loading, tsunami
3249 Aquatic Environmental Effects of Black Shale in Eastern Kentucky through the Measurement of Chemical and Physical Properties
Authors: Mitchell T. Grothaus, Cory Grigsby, Timothy S. Hare
Abstract:
This study aims to determine whether there is a relationship between the elevated cancer risks in eastern Kentucky and the environmental effects of black shale. Previous research shows that black shale formations, such as those in eastern Kentucky, contain high levels of toxic elements, including arsenic and radon, compared to average rocks and sediment. Similarly, the population of eastern Kentucky has higher rates of many health conditions, including lung cancer and cardiovascular disease, than surrounding regions. These poor health outcomes are typically explained in relation to social, economic, behavioral, and healthcare factors. The rates of many conditions, however, have not decreased as these factors improve with regional development. Black shale is known to affect environmental conditions, for example by increasing radiation levels and heavy metal toxicity. We are mapping the effects of black shale by monitoring radiation, microbes, and chemical standards of water sources. In this presentation, we report on our measurements of pH, dissolved oxygen, total dissolved solids, conductivity, temperature, and discharge, and on their comparison with water quality standards from the Kentucky Department for Environmental Protection. The conditions of the water sources, combined with an environmental survey of the surrounding areas, provide a greater understanding of why the people in eastern Kentucky face the current health issues.
Keywords: black shale, eastern Kentucky, environmental impact, water quality
3248 Foodborne Pathogens in Different Types of Milk: From the Microbiome to Risk Assessment
Authors: Pasquali Frederique, Manfreda Chiara, Crippa Cecilia, Indio Valentina, Ianieri Adriana, De Cesare Alessandra
Abstract:
Microbiological hazards can be transmitted to humans through milk. In this study, we compared the microbiome composition and the presence of foodborne pathogens in organic milk (n=6), organic hay milk (n=6), standard milk (n=6) and high-quality milk (n=6). The milk samples were collected during six samplings, between December 2022 and January 2023 and between April and May 2024, to take into account seasonal variations. The 24 milk samples were submitted to DNA extraction and library preparation before shotgun sequencing on the Illumina HiScan™ SQ System platform. The total sequencing output was 600 GB. In all the milk samples, the phyla with the highest relative abundances were Pseudomonadota, Bacillota, Ascomycota, Actinomycetota and Apicomplexa, while the most represented genera were Pseudomonas, Streptococcus, Geotrichum, Acinetobacter and Babesia. The alpha and beta diversity indexes showed a clear separation between the microbiome of high-quality milk and those of the other milk types. Moreover, in the high-quality milk, the relative abundance of Staphylococcus (4.4%), Campylobacter (4.5%), Bacillus (2.5%), Enterococcus (2.4%), Klebsiella (1.3%) and Escherichia (0.7%) was significantly higher in comparison to the other types of milk. On the contrary, the relative abundance of Geotrichum (0.5%) was significantly lower. The microbiome results collected in this study showed significant differences in terms of the relative abundance of bacterial genera, including foodborne pathogen species. These results should be incorporated into risk assessment models tailored to different types of milk.
Keywords: raw milk, foodborne pathogens, microbiome, risk assessment
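The alpha diversity comparison described above can be illustrated with a short calculation of the Shannon index from genus-level relative abundances. The abundance table below is a toy example, not the study data; the study itself worked from full shotgun metagenomic profiles rather than a handful of genera.

```python
# Sketch: Shannon alpha diversity from genus-level relative abundances.
# The table below is a toy example, not the milk dataset.
import numpy as np

samples = {
    "high_quality_milk": {"Pseudomonas": 0.30, "Streptococcus": 0.20,
                          "Staphylococcus": 0.044, "Campylobacter": 0.045,
                          "Other": 0.411},
    "organic_milk":      {"Pseudomonas": 0.45, "Streptococcus": 0.25,
                          "Geotrichum": 0.15, "Other": 0.15},
}

def shannon(abundances):
    p = np.array([a for a in abundances if a > 0], dtype=float)
    p = p / p.sum()                      # renormalise in case of rounding
    return float(-(p * np.log(p)).sum())

for name, genera in samples.items():
    print(f"{name}: H' = {shannon(genera.values()):.3f}")
```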
3247 Climate Trends, Variability, and Impacts of El Niño-Southern Oscillation on Rainfall Amount in Ethiopia
Authors: Zerihun Yohannes Amare, Belayneh Birku Geremew, Nigatu Melise Kebede, Sisaynew Getahun Amera
Abstract:
In Ethiopia, agricultural production is predominantly rainfed. The El Niño Southern Oscillation (ENSO) is the driver of climate variability that affects the agricultural production system in the country. This paper aims to study the trends and variability of rainfall and the impacts of the El Niño Southern Oscillation (ENSO) on rainfall amount. The study was carried out in Ethiopia's Western Amhara National Regional State, which features the variety of seasons that characterize the nation. Monthly rainfall data were collected from fifteen meteorological stations of Western Amhara. Selected El Niño and La Niña years from 1986 to 2015 were also extracted from the National Oceanic and Atmospheric Administration (NOAA). Once the data quality was checked and inspected, the monthly rainfall data of the selected stations were arranged in a Microsoft Excel spreadsheet and analyzed using XLSTAT software. The coefficient of variation and the Mann-Kendall non-parametric statistical test were employed to analyze the trends and variability of rainfall and temperature. The long-term recorded annual rainfall data indicated an increasing, though insignificant, trend from 1986 to 2015. The rainfall variability was low (coefficient of variation, CV = 8.6%); also, the mean monthly rainfall of Western Amhara decreased during El Niño years and increased during La Niña years, especially in the rainy season (JJAS), over the 30 years. This finding will be useful for suggesting possible adaptation strategies and the efficient use of resources during planning and implementation.
Keywords: rainfall, Mann-Kendall test, El Niño, La Niña, Western Amhara, Ethiopia
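For readers unfamiliar with the two statistics used, the sketch below implements the coefficient of variation and the Mann-Kendall trend test (normal approximation, without a tie correction) on an annual rainfall series. The series is randomly generated for illustration and is not the Western Amhara data.

```python
# Sketch of the Mann-Kendall trend test and coefficient of variation.
# The rainfall series is synthetic; no tie correction is applied.
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))       # two-sided p-value
    return s, z, p

rng = np.random.default_rng(1)
annual_rainfall = 1200 + 1.5 * np.arange(30) + rng.normal(0, 100, 30)  # mm, 1986-2015

cv = annual_rainfall.std(ddof=1) / annual_rainfall.mean() * 100
s, z, p = mann_kendall(annual_rainfall)
print(f"CV = {cv:.1f}%  |  Mann-Kendall S = {s:.0f}, Z = {z:.2f}, p = {p:.3f}")
```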
3246 Influence of Layer-by-Layer Coating Parameters on the Properties of Hybrid Membrane for Water Treatment
Authors: Jenny Radeva, Anke-Gundula Roth, Christian Goebbert, Robert Niestroj-Pahl, Lars Daehne, Axel Wolfram, Juergen WIese
Abstract:
The presented investigation studies the correlation between the process parameters of Layer-by-Layer (LbL) coatings and the properties of the produced hybrid membranes for water treatment. Coating an alumina ceramic support membrane with polyelectrolyte multilayers results in hybrid membranes with increased fouling-resistant behavior, high retention (up to 90%) of salt ions and various pharmaceuticals, selectivity towards various organic molecules (as known from LbL-coated polyether sulfone membranes), and the possibility of pH response control. The chosen polyelectrolytes were added to the support using the LbL coating process. Parameters such as the type of polyelectrolyte, the ionic strength, and the pH were varied in order to find the most suitable process conditions and to study how they influence the properties of the final product. The applied LbL films were investigated with respect to their homogeneity and penetration depth. The analysis of the layer buildup was performed using fluorescence-labeled polyelectrolyte molecules and confocal laser scanning microscopy as well as scanning and transmission electron microscopy. Furthermore, the influence of the coating parameters on the porosity, surface potential, retention, and permeability of the developed hybrid membranes was estimated. In conclusion, a comparison was drawn between the filtration performance of the uncoated alumina ceramic membrane and the modified hybrid membranes.
Keywords: water treatment, membranes, ceramic membranes, hybrid membranes, layer-by-layer modification
3245 Thermal Insulating Silicate Materials Suitable for Thermal Insulation and Rehabilitation Structures
Authors: Jitka Hroudová, Martin Sedlmajer, Jiří Zach
Abstract:
The problem of insulating building structures is often closely connected with the problem of moisture remediation. In the case of historic buildings, or if only part of the building envelope is redeveloped, it is not possible to apply classical external thermal insulation composite systems. The most effective application in such cases is thermal insulation plasters with high porosity and controlled capillary properties, which assure an improvement of the thermal properties of the structure and its diffusion openness towards the external environment, and whose suitably treated capillary properties prevent the penetration of liquid moisture and its salts towards the outer surface of the structure. With respect to the current trend of reducing the energy consumption of building structures and reducing the production of CO2, it is necessary to develop capillary-active materials characterized by low density and low thermal conductivity while maintaining good mechanical properties. The aim of researchers at the Faculty of Civil Engineering, Brno University of Technology, is the development and study of the hygrothermal behaviour of optimal materials for the thermal insulation and rehabilitation of building structures, with the possible use of alternative, less energy-demanding binders in comparison with the conventional, frequently used binder, cement. The paper describes the evaluation of research activities aimed at the development of thermal insulation and repair materials using lightweight aggregate and alternative binders such as metakaolin and finely ground fly ash.
Keywords: thermal insulating plasters, rehabilitation materials, thermal conductivity, lightweight aggregate, alternative binders
3244 Reducing the Imbalance Penalty through Artificial Intelligence Methods in Geothermal Production Forecasting: A Case Study for Turkey
Authors: Hayriye Anıl, Görkem Kar
Abstract:
In addition to being rich in renewable energy resources, Turkey is one of the countries with promising potential in geothermal energy production thanks to its high installed capacity, low cost, and sustainability. Increasing imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand due to the inadequacy of the production forecasts given in the day-ahead market. A better production forecast reduces the imbalance penalties of market participants and provides a better balance in the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Natural Electricity Generation, which has a high installed capacity in terms of geothermal, was estimated for the first one and two weeks of March; the imbalance penalties were then calculated with these estimates and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and a dataset created by extracting new features from it with the feature engineering method. According to the results, Support Vector Regression, among the traditional machine learning models, outperformed the other models and exhibited the best performance. In addition, the estimation results on the feature engineering dataset showed lower error rates than on the basic dataset. It was concluded that the imbalance penalty calculated from the estimates for the selected organization is lower than the actual imbalance penalty, which indicates optimal and profitable accounts.
Keywords: machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting
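A minimal sketch of the winning approach, Support Vector Regression trained on lag-based engineered features, is given below. The lag choices, kernel settings, and synthetic generation series are illustrative assumptions; they are not the Zorlu dataset or the study's tuned model.

```python
# Sketch: SVR forecast of hourly geothermal generation from lag features.
# Lags, kernel settings, and the synthetic series are illustrative only.
import numpy as np
import pandas as pd
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(42)
hours = pd.date_range("2022-01-01", periods=24 * 60, freq="h")
gen = 150 + 10 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 3, len(hours))
df = pd.DataFrame({"mwh": gen}, index=hours)

# Feature engineering: lagged generation and hour of day.
for lag in (1, 2, 24):
    df[f"lag_{lag}"] = df["mwh"].shift(lag)
df["hour"] = df.index.hour
df = df.dropna()

split = -24 * 7                                   # last week held out for evaluation
X, y = df.drop(columns="mwh"), df["mwh"]
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
print(f"MAE on hold-out week: {mean_absolute_error(y[split:], pred):.2f} MWh")
```

The forecast errors on the hold-out period would then feed into the imbalance-penalty calculation against the day-ahead commitments, which is the comparison the study reports.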
3243 Layer-Level Feature Aggregation Network for Effective Semantic Segmentation of Fine-Resolution Remote Sensing Images
Authors: Wambugu Naftaly, Ruisheng Wang, Zhijun Wang
Abstract:
Models based on convolutional neural networks (CNNs), in conjunction with Transformer, have excelled in semantic segmentation, a fundamental task for intelligent Earth observation using remote sensing (RS) imagery. Nonetheless, tokenization in the Transformer model undermines object structures and neglects inner-patch local information, whereas CNNs are unable to simulate global semantics due to limitations inherent in their convolutional local properties. The integration of the two methodologies facilitates effective global-local feature aggregation and interactions, potentially enhancing segmentation results. Inspired by the merits of CNNs and Transformers, we introduce a layer-level feature aggregation network (LLFA-Net) to address semantic segmentation of fine-resolution remote sensing (FRRS) images for land cover classification. The simple yet efficient system employs a transposed unit that hierarchically utilizes dense high-level semantics and sufficient spatial information from various encoder layers through a layer-level feature aggregation module (LLFAM) and models global contexts using structured Transformer blocks. Furthermore, the decoder aggregates resultant features to generate rich semantic representation. Extensive experiments on two public land cover datasets demonstrate that our proposed framework exhibits competitive performance relative to the most recent frameworks in semantic segmentation.
Keywords: land cover mapping, semantic segmentation, remote sensing, vision transformer networks, deep learning
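The abstract does not give implementation details, but the general idea of aggregating features from several encoder layers can be sketched as follows: project each layer's feature map to a common channel width, upsample to a common resolution, concatenate, and fuse. This PyTorch module is an illustrative interpretation of that idea and is not the authors' LLFAM.

```python
# Illustrative layer-level feature aggregation (not the authors' LLFAM):
# project each encoder feature map to a common width, upsample, concatenate, fuse.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerFeatureAggregation(nn.Module):
    def __init__(self, in_channels=(64, 128, 256, 512), out_channels=128):
        super().__init__()
        self.proj = nn.ModuleList(
            [nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels])
        self.fuse = nn.Sequential(
            nn.Conv2d(out_channels * len(in_channels), out_channels, 3, padding=1),
            nn.BatchNorm2d(out_channels),
            nn.ReLU(inplace=True))

    def forward(self, features):
        target = features[0].shape[-2:]            # resolution of the shallowest map
        upsampled = [F.interpolate(p(f), size=target, mode="bilinear",
                                   align_corners=False)
                     for p, f in zip(self.proj, features)]
        return self.fuse(torch.cat(upsampled, dim=1))

# Dummy multi-scale encoder outputs for a 256x256 input.
feats = [torch.randn(1, c, 256 // s, 256 // s)
         for c, s in zip((64, 128, 256, 512), (4, 8, 16, 32))]
print(LayerFeatureAggregation()(feats).shape)      # torch.Size([1, 128, 64, 64])
```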
3242 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example
Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang
Abstract:
Background: Recent advances in high-throughput research technologies, such as next-generation sequencing and multi-dimensional liquid chromatography, make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyze these "big" data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform for users with only limited knowledge of bio-computing to study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system: C-eXpress takes a simple text file that contains standard NCBI gene or protein IDs and expression levels (RPKM or fold) as the input file to generate a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows the users to filter the datasets based on various gene features. A dynamic summary chart is generated automatically after each filtering process. Results: We implemented an integrated database that contains pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
Keywords: cancer, visualization, database, functional annotation
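The input format described above, a plain text file of NCBI IDs with expression values, maps naturally onto a small script that builds the kind of color-gradient heatmap the platform generates. The column layout, gene/sample choices, and use of matplotlib below are assumptions made for illustration; they do not describe C-eXpress's actual implementation.

```python
# Sketch: heatmap of gene expression from a tab-separated text file whose first
# column is an NCBI gene ID and remaining columns are per-sample values.
# The column layout is an assumption, not the C-eXpress format specification.
import io
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

example = io.StringIO(
    "gene_id\tMCF7\tHCT116\tA549\n"
    "7157\t12.4\t3.1\t8.9\n"      # TP53
    "672\t1.2\t0.4\t5.6\n"        # BRCA1
    "4609\t20.3\t15.8\t2.2\n"     # MYC
)
expr = pd.read_csv(example, sep="\t", index_col="gene_id")

log_expr = np.log2(expr + 1)                         # compress the dynamic range
fig, ax = plt.subplots(figsize=(4, 3))
im = ax.imshow(log_expr.values, cmap="viridis", aspect="auto")
ax.set_xticks(range(log_expr.shape[1]), log_expr.columns)
ax.set_yticks(range(log_expr.shape[0]), log_expr.index)
fig.colorbar(im, label="log2(expression + 1)")
fig.tight_layout()
fig.savefig("expression_heatmap.png", dpi=150)
```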
3241 Comparison of Serological and Molecular Diagnosis of Cerebral Toxoplasmosis in Blood and Cerebrospinal Fluid in HIV Infected Patients
Authors: Berredjem Hajira, Benlaifa Meriem, Becheker Imene, Bardi Rafika, Djebar Med Reda
Abstract:
Recently acquired or reactivated T. gondii infection is a serious complication in HIV patients. Classical serological diagnosis relies on the detection of anti-Toxoplasma immunoglobulins; however, serology may be unreliable in immunodeficient HIV patients who fail to produce significant titers of specific antibodies. PCR assays allow a rapid diagnosis of Toxoplasma infection. In this study, we compared the value of PCR for diagnosing active toxoplasmosis in cerebrospinal fluid and blood samples from HIV patients. Anti-Toxoplasma IgG and IgM antibody titers were determined by ELISA. In parallel, a nested PCR targeting the B1 gene and a conventional PCR-ELISA targeting the P30 gene were used to detect T. gondii DNA in 25 blood samples and 12 cerebrospinal fluid samples from patients in whom toxoplasmic encephalitis was confirmed by clinical investigations. A total of 15 negative controls were used. Serology did not contribute to confirming the toxoplasmic infection, as the IgG and IgM titers decreased early. Only 8 out of 25 blood samples and 5 out of 12 cerebrospinal fluid samples yielded a positive PCR result. Five patients with confirmed toxoplasmosis had positive PCR results in either blood or cerebrospinal fluid samples. However, the nested B1 PCR gave better results than the conventional P30 PCR-ELISA for the detection of T. gondii DNA in both sample types. All samples from control patients were negative. This study demonstrates the limited usefulness of the serological tests and the high sensitivity and specificity of PCR in the diagnosis of toxoplasmic encephalitis in HIV patients.
Keywords: cerebrospinal fluid, HIV, Toxoplasmosis, PCR
3240 Impact of Regulation on Trading in Financial Derivatives in Europe
Authors: H. Florianová, J. Nešleha
Abstract:
Financial derivatives are considered to be risky investment instruments which could possibly bring about another financial crisis. As a preventive measure, the European Union and its member states have released new legal acts regulating this area of law in recent years. There have been several cases in the history of capital markets worldwide where it was shown that legislation may affect the behavior of subjects on capital markets. In our paper, we analyze the main events on selected European stock exchanges in order to apply them to three chosen markets: the Czech capital market represented by the Prague Stock Exchange, the German capital market represented by Deutsche Börse, and the Polish capital market represented by the Warsaw Stock Exchange. We follow the time series of the development of the number of listed derivatives on these three stock exchanges in order to evaluate the popularity of those exchanges. Afterwards, we compare the newly listed derivatives in relation to the speed of development of these exchanges. We also make a comparison between trends in the development of derivatives and of shares. We explain how legal regulation may affect the situation on capital markets. If the regulation is too strict, potential investors or traders are not willing to undertake it and move to other markets. On the other hand, if the regulation is too vague, trading scandals occur and the market is not reliable from the perspective of potential investors or issuers. We see that making the regulation stricter usually discourages subjects from staying on the market almost immediately, whereas making the regulation vaguer to attract more subjects is usually a much slower process.
Keywords: capital markets, financial derivatives, investors' behavior, regulation