Search results for: digital tools
955 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems. Difficulty in finding a robust approach for model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed in the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters, including initial loss, reduction factor, time of concentration, and time-lag, were considered as the primary set of parameters. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments in the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement of the MIKE URBAN results with the upper and lower 95% credible intervals of posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE), and maximum error (ME) was found reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for runoff flow prediction, and therefore the associated uncertainty in predictions can be obtained. In contrast, MIKE URBAN provides only a point estimate. Based on the results of the analysis, it appears that the developed ABC framework performs well for automatic calibration.
Keywords: automatic calibration framework, approximate Bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform
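For illustration, a minimal Python sketch of the rejection-sampling form of ABC described above might look as follows; the toy runoff model, uniform priors, and tolerance below are placeholders for this example, not the study's actual R implementation or calibrated parameter ranges.

```python
import numpy as np

def toy_runoff_model(params, rainfall):
    # Placeholder stand-in for a time-area runoff model; the real study calibrated
    # initial loss, reduction factor, time of concentration, and time-lag.
    initial_loss, reduction_factor = params
    return np.clip(rainfall - initial_loss, 0.0, None) * reduction_factor

def abc_rejection(observed, rainfall, n_draws=10000, tolerance=5.0):
    """Keep parameter draws whose simulated runoff is close to the observations."""
    accepted = []
    for _ in range(n_draws):
        # Illustrative uniform priors only.
        params = (np.random.uniform(0.0, 10.0),   # initial loss [mm]
                  np.random.uniform(0.1, 1.0))    # reduction factor [-]
        simulated = toy_runoff_model(params, rainfall)
        distance = np.sqrt(np.mean((simulated - observed) ** 2))  # RMSE as summary distance
        if distance < tolerance:
            accepted.append(params)
    return np.array(accepted)   # accepted draws approximate the posterior

rainfall = np.random.gamma(2.0, 3.0, size=48)                      # synthetic storm series
observed = toy_runoff_model((2.0, 0.6), rainfall) + np.random.normal(0, 0.5, 48)
posterior = abc_rejection(observed, rainfall)
print(posterior.shape)   # posterior samples give the credible intervals for predictions
```

In the same spirit as the framework, posterior predictive runoff series can then be simulated from these accepted parameter sets to obtain 95% credible intervals.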
Procedia PDF Downloads 312
954 Export and Import Indicators of Georgian Agri-food Products during the Pandemic: Challenges and Opportunities
Authors: Eteri Kharaishvili
Abstract:
Introduction. The paper analyzes the main indicators of export and import of Georgian agri-food products; identifies positive and negative trends under the pandemic; and, based on the revealed problems, substantiates the need for modernization of the agri-food sector. It is argued that low production and productivity rates of food products negatively impact achieving the optimal export-to-import ratio; therefore, this leads to increased dependence on other countries and reduces the level of food security. Research objectives. The objective of the research is to identify the key challenges based on the analysis of export-import indicators of Georgian food products during the pandemic period and to develop recommendations on the possibilities of post-pandemic perspectives. Research methods. Various theoretical and methodological research tools are used in the paper; in particular, desk research is carried out on the research topic; endogenous and exogenous variables affecting export and import are determined through factor analysis; SWOT and PESTEL analyses are used to identify development opportunities; selection and grouping of data and identification of similarities and differences are carried out using analysis, synthesis, sampling, induction, and other methods; a qualitative study is conducted based on a survey of agri-food experts and exporters to clarify the factors that impede export-import flows. Contributions. The factors that impede the export of Georgian agri-food products in the short run under the COVID-19 pandemic are identified. These are: reduced income of farmers; delays in the supply of raw materials and supplies to the agri-food sector from neighboring industries, as well as in harvesting, processing, marketing, transportation, and other sectors; increased indirect costs, etc. The factors that impede export in the long run are as follows: loss of public confidence in the industry, risk of losing positions in traditional markets, etc. Conclusions are made on the problems in the field of export and import of Georgian agri-food products under the pandemic; development opportunities are evaluated based on the analysis of the agri-food sector's potential. Recommendations on the development opportunities for export and import of Georgian agri-food products in the post-pandemic period are proposed.
Keywords: agri-food products, export and import, pandemic period, hindering factor, development potential
Procedia PDF Downloads 146
953 Victimization in Schizophrenia: A Cross-Sectional Prospective Study
Authors: Mehmet Budak, Mehmet Fatih Ustundag
Abstract:
Objectives: In this research, we studied the extent of exposure to physical violence and of committing violence in patients diagnosed with schizophrenia in comparison to a control group consisting of patients with psychiatric diseases other than psychotic and mood disorders. Method: Between August 2019 and October 2019, a total of 100 hospitalized patients diagnosed with schizophrenia (clinically in remission, Brief Psychiatric Rating Scale < 30) were sequentially studied while undergoing inpatient treatment at Erenkoy Mental Health Training and Research Hospital. From the outpatient clinic, 50 patients with psychiatric disorders other than psychotic disorders or mood disorders were consecutively included as a control group. All participants were evaluated using sociodemographic data that also covered the history of violence, physical examination, and bilateral comparative hand and forearm anterior-posterior and lateral radiography. Results: While 59% of patients with schizophrenia and 28% of the control group stated that they were exposed to physical violence at least once in a lifetime (p < 0.001), a defensive wound or fracture was detected in 29% of patients with schizophrenia and 2% of the control group (p < 0.001). On the other hand, 61% of patients diagnosed with schizophrenia and 32% of the control group stated that they had committed physical violence at least once in a lifetime (p = 0.001). A self-destructive wound or fracture was detected in 53% of the patients with schizophrenia and 24% of the control group (p = 0.001). In the schizophrenia group, the rate of committing physical violence was higher in those with substance use compared to those without substance use (p = 0.049). Also, wounds and bone fractures (boxer's fracture) resulting from self-injury were more common in schizophrenia patients with substance use (p = 0.002). In the schizophrenia group, defensive wounds and parry fractures (which are located in the hand, forearm, and arm, usually occur as a result of an attempt to shield the face against an aggressive attack, and are known to be indicators of interpersonal violence) were higher in those with substance use compared to those without (p = 0.007). Conclusion: This study shows that exposure to physical violence and the rate of violence are higher in patients with schizophrenia compared to the control group. It is observed that schizophrenia patients who are stigmatized as being aggressive are more exposed to violence. Substance use in schizophrenia patients increases both exposure to physical violence and the use of physical violence. Physical examination and anamnesis that question violence are important tools to reveal exposure to violence in patients. Furthermore, some specific bone fractures and wounds could be used to detect victimization even after plenty of time has passed.
Keywords: fracture, physical violence, schizophrenia, substance use
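As a hedged illustration of the kind of group comparison reported above, the following Python sketch runs a chi-square test on counts reconstructed from the stated percentages (59% of 100 patients, 28% of 50 controls); these are illustrative counts, not the authors' raw data.

```python
from scipy.stats import chi2_contingency

# Rows: exposed / not exposed to physical violence; columns: [schizophrenia, control].
table = [[59, 14],
         [41, 36]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")   # consistent with the reported p < 0.001
```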
Procedia PDF Downloads 172
952 The Quantum Theory of Music and Human Languages
Authors: Mballa Abanda Luc Aurelien Serge, Henda Gnakate Biba, Kuate Guemo Romaric, Akono Rufine Nicole, Zabotom Yaya Fadel Biba, Petfiang Sidonie, Bella Suzane Jenifer
Abstract:
The main hypotheses proposed around the definition of the syllable and of music, and of the common origin of music and language, should lead the reader to reflect on the cross-cutting questions raised by the debate on the notion of universals in linguistics and musicology. These are objects of controversy, and therein lies their interest: the debate raises questions that are at the heart of theories on language. It is an inventive, original, and innovative research thesis. It is a contribution to the theoretical, musicological, ethnomusicological, and linguistic conceptualization of languages, giving rise to the practice of interlocution between the social and cognitive sciences, the activities of artistic creation, and the question of modeling in the human sciences: mathematics, computer science, translation automation, and artificial intelligence. When you apply this theory to any text of a folk song in a world tone language, you not only piece together the exact melody, rhythm, and harmonies of that song as if you knew it in advance, but also the exact spoken form of the language. The author believes that the issue of the disappearance of tonal languages and their preservation has been structurally resolved, as well as one of the greatest cultural equations related to the composition and creation of tonal, polytonal, and random music. To confirm the theory experimentally, the author designed a semi-digital, semi-analog application that translates the tonal languages of Africa (about 2,100 languages) into blues, jazz, world music, polyphonic music, tonal and anatonal music, and deterministic and random music. To test this application, music reading and writing software is used to collect the data extracted from the author's mother tongue, which is already modeled in the musical staves saved in the ethnographic (semiotic) dictionary for automatic translation (volume 2 of the book). The translation is done from writing to writing, from writing to speech, and from writing to music. Mode of operation: you type a text on your computer, a structured song (chorus-verse), and you request from the machine a melody in blues, jazz, world music, variety, etc. The software runs, giving you the option to choose harmonies, and then you select your melody.
Keywords: language, music, sciences, quantum entanglement
Procedia PDF Downloads 82
951 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
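A common building block for the feature-extraction stage described above is fitting dominant planes (walls, floors) to the scan. The following self-contained Python sketch uses a plain RANSAC plane fit on synthetic data; it is an illustrative stand-in, not the authors' algorithm or parameter choices.

```python
import numpy as np

def ransac_plane(points, n_iter=500, threshold=0.02):
    """Fit a dominant plane (e.g. a wall or floor) to a 3D point cloud with RANSAC."""
    best_inliers, best_model = np.array([], dtype=int), None
    rng = np.random.default_rng(0)
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue                                   # degenerate sample, skip
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)             # point-to-plane distances
        inliers = np.where(dist < threshold)[0]
        if len(inliers) > len(best_inliers):
            best_inliers, best_model = inliers, (normal, d)
    return best_model, best_inliers

# Synthetic "wall": noisy points near the plane x = 2, plus random clutter.
wall = np.column_stack([np.full(500, 2.0) + np.random.normal(0, 0.005, 500),
                        np.random.uniform(0, 5, 500),
                        np.random.uniform(0, 3, 500)])
clutter = np.random.uniform(0, 5, (200, 3))
model, inliers = ransac_plane(np.vstack([wall, clutter]))
print(len(inliers))   # most of the 500 wall points should be recovered
```

Detected planes of this kind would then be classified (wall, slab, opening) and mapped to parametric BIM elements in the reconstruction pipeline.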
Procedia PDF Downloads 70
950 The Role of Online Platforms in Economic Growth and the Introduction of Local Culture in Tourist Areas
Authors: Maryam Nzari
Abstract:
Today, with the advancement of Internet technology, online platforms have become tools that allow people to accomplish what they need easily. Online platforms, in different forms and by providing different services, make it possible for users to communicate with each other and with the platforms themselves. Audience communication with mass media is not the same as in the past. Today the conditions are different: with online platforms that provide the latest news minute by minute, the audience has access to all the content and can choose more quickly and easily. According to Galloway, the companies Apple, Amazon, Facebook, and Google create a wide range of products and services that are connected with the daily lives of billions of people all over the planet. Over time, platforms gain high economic value and in this way gain power that influences the social, cultural, economic, and political aspects of people's lives. As a result of the effects of the process of platformization on all areas of individual and collective life, we now live in a platform society, one closely connected to "platform politics". Nowadays, with social media platforms, users can interact with many people, and people can share their data on various topics with others in this space. In this research, what will be investigated is the role of these online platforms in economic growth and the introduction of local culture in tourist areas. Tourism in a region is linked with various factors; one of the important factors that attract tourists to a region is its culture, and on the other hand, this culture can also affect economic growth. Without a proper understanding of the culture of these tourist areas, it is not possible to plan properly for the growth of the tourism industry and the subsequent increase in economic growth. The interaction of local people and tourists will have social and cultural effects on each other and will give them the opportunity to get to know each other. Therefore, the purpose of this research is to examine the role that online platforms play in cultural interaction in tourist areas and to understand whether online platforms only seek to show the good aspects of a region and generate extra income, or whether they can play a role beyond what we imagine and introduce the culture of a region in a proper way, so that disagreements do not arise in the tourism planning of that region. In this article, library and field methods are used to answer these questions.
Keywords: online platforms, economic growth, indigenous culture, tourism
Procedia PDF Downloads 62
949 Rethinking Robots through Living Skin
Authors: Hanna Gerda Brøndal
Abstract:
The development of “living” robotic skin interrupts binary categorizations between nature/artifice and introduces a transmaterial entity that embraces various forms of aesthetic phenomena regarding embodiment, boundaries, and nonbinary modes of existence. Examining these phenomena will lead to a nuanced definition of what makes a robot. In 2024, a Japanese research group from the University of Tokyo invented an organic kind of robotic skin, referred to as the ‘skin equivalent.’ This skin is composed of living human skin cells cultivated in a laboratory and attached to a robotic face. The research group anticipates that the skin equivalent may, in the future, possess self-healing properties and tactile sensations akin to human skin. This specific skin method extends beyond the field of engineering and into aesthetic considerations. Robots should not be regarded solely as utilitarian tools; for millennia, they have been cultural phenomena, often anthropomorphized and designed to mimic life. These themes echo artistic traditions throughout history, where images and artifacts have been imbued with a sense of vitality through their matter and animating capabilities. Consequently, this study applies an art historical methodology that integrates insights from skin studies, phenomenology, and new materialism to analyze the skin equivalent from the University of Tokyo. Together, these perspectives have informed the following results: 1) the replication of tactile sensations fosters a nuanced comprehension of consciousness that extends to nonhuman lifeforms; 2) the fusion of biological and man-made materials transgresses cognitive boundaries, which evokes eerie feelings in humans; and 3) the materials comprising the skin equivalent can be perceived as agential, living entities that form a transmateriality obliterating conventional binaries and deconstructing the very concept of nature. Categorizations such as nature/artifice and life/death begin to blur, which shifts the understanding of robots from what they are to what they do; they are defined by their performative enactments, actively shaping and being shaped by their interactions with humans, materials, and environments to co-constitute new meanings and realities; a reality where being either exclusively biological or technological ceases to hold relevance.
Keywords: embodiment, new materialism, robotic skin, transmateriality
Procedia PDF Downloads 4
948 Revolutionizing Healthcare Facility Maintenance: A Groundbreaking AI, BIM, and IoT Integration Framework
Authors: Mina Sadat Orooje, Mohammad Mehdi Latifi, Behnam Fereydooni Eftekhari
Abstract:
The integration of cutting-edge Internet of Things (IoT) technologies with advanced Artificial Intelligence (AI) systems is revolutionizing healthcare facility management. However, the current landscape of hospital building maintenance suffers from slow, repetitive, and disjointed processes, leading to significant financial, resource, and time losses. Additionally, the potential of Building Information Modeling (BIM) in facility maintenance is hindered by a lack of data within digital models of built environments, necessitating a more streamlined data collection process. This paper presents a robust framework that harmonizes AI with BIM-IoT technology to elevate healthcare Facility Maintenance Management (FMM) and address these pressing challenges. The methodology begins with a thorough literature review and requirements analysis, providing insights into existing technological landscapes and associated obstacles. Extensive data collection and analysis efforts follow to deepen understanding of hospital infrastructure and maintenance records. Critical AI algorithms are identified to address predictive maintenance, anomaly detection, and optimization needs, alongside integration strategies for BIM and IoT technologies, enabling real-time data collection and analysis. The framework outlines protocols for data processing, analysis, and decision-making. A prototype implementation is executed to showcase the framework's functionality, followed by a rigorous validation process to evaluate its efficacy and gather user feedback. Refinement and optimization steps are then undertaken based on evaluation outcomes. Emphasis is placed on the scalability of the framework in real-world scenarios and its potential applications across diverse healthcare facility contexts. Finally, the findings are meticulously documented and shared within the healthcare and facility management communities. This framework aims to significantly boost maintenance efficiency, cut costs, provide decision support, enable real-time monitoring, offer data-driven insights, and ultimately enhance patient safety and satisfaction. By tackling current challenges in healthcare facility maintenance management, it paves the way for the adoption of smarter and more efficient maintenance practices in healthcare facilities.
Keywords: artificial intelligence, building information modeling, healthcare facility maintenance, internet of things integration, maintenance efficiency
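To make the anomaly-detection component above concrete, a minimal Python sketch of flagging deviant IoT sensor readings against a rolling baseline is shown below; the window size, threshold, and air-handling-unit scenario are illustrative assumptions, not part of the framework's specification.

```python
import numpy as np

def detect_anomalies(readings, window=48, z_threshold=4.0):
    """Flag sensor readings that deviate strongly from a rolling baseline."""
    flags = np.zeros(len(readings), dtype=bool)
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = baseline.mean(), baseline.std() + 1e-9
        flags[i] = abs(readings[i] - mu) / sigma > z_threshold
    return flags

# Hourly temperature from a hypothetical air-handling unit, with a fault injected at hour 300.
temps = 21 + np.random.normal(0, 0.3, 500)
temps[300:310] += 6.0
alerts = detect_anomalies(temps)
print(np.where(alerts)[0])   # indices around the injected fault should be flagged
```

In a full implementation, such alerts would be linked back to the corresponding BIM asset so that a maintenance work order can be raised automatically.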
Procedia PDF Downloads 65
947 D-Lysine Assisted 1-Ethyl-3-(3-Dimethylaminopropyl)Carbodiimide / N-Hydroxy Succinimide Initiated Crosslinked Collagen Scaffold with Controlled Structural and Surface Properties
Authors: G. Krishnamoorthy, S. Anandhakumar
Abstract:
The effect of D-lysine (D-Lys) on collagen cross-linking initiated by 1-ethyl-3-(3-dimethylaminopropyl) carbodiimide (EDC)/N-hydroxysuccinimide (NHS) is evaluated using experimental and modelling tools. The results for the Coll-D-Lys-EDC/NHS scaffold indicate an increase in the tensile strength (TS), percentage of elongation (%E), and denaturation temperature (Td), and a decrease in the decomposition rate compared to L-Lys-EDC/NHS. Scanning electron microscopic (SEM) and atomic force microscopic (AFM) analyses revealed a well-ordered scaffold structure with properly oriented and well-aligned fibers. D-Lys stabilizes the scaffold against degradation by collagenase better than L-Lys. The cell assay showed more than 98% fibroblast viability (NIH3T3) and improved cell adhesion and protein adsorption after 72 h of culture when compared with the native scaffold. Cell attachment after 74 h was robust, with cytoskeletal analysis showing that the attached cells were aligned along the fibers, assuming a spindle-shaped appearance. Gene expression analyses revealed no apparent alterations in mRNA levels, and cell proliferation was not adversely affected. D-lysine (D-Lys) plays a pivotal role in the self-assembly and conformation of collagen fibrils. The D-Lys-assisted EDC/NHS-initiated cross-linking induces the formation of a carboxamide through activation of the side-chain -COOH group, followed by aminolysis of the O-isoacylurea intermediates by the -NH2 groups, which are directly joined via an isopeptide bond. This leads to the formation of intra- and inter-helical cross-links. Modeling studies indicated that D-Lys binds with a collagen-like peptide (CLP) through multiple H-bonding and hydrophobic interactions. Orientational changes in collagenase on CLP-D-Lys are observed, which may decrease its accessibility to degradation and stabilize CLP against the action of the former. D-Lys has the lowest binding energy and improved fibrillar assembly and staggered alignment without the undesired structural stiffness and aggregation. The proteolytic machinery is less well equipped to deal with the Coll-D-Lys scaffold than with the Coll-L-Lys scaffold. The information derived from the present study could help in designing collagenolytically stable heterochiral collagen-based scaffolds for biomedical applications.
Keywords: collagen, collagenase, collagen like peptide, D-lysine, heterochiral collagen scaffold
Procedia PDF Downloads 393
946 Geographic Information System-Based Map for Best Suitable Place for Cultivating Permanent Trees in South-Lebanon
Authors: Allaw Kamel, Al-Chami Leila
Abstract:
It is important to reduce the human influence on natural resources by identifying appropriate land use. Moreover, it is essential to carry out scientific land evaluation. Such analysis allows identifying the main factors of agricultural production and enables decision makers to develop crop management in order to increase land capability. The key is to match the type and intensity of land use with its natural capability. Therefore, in order to benefit from these areas and invest in them to obtain good agricultural production, they must be fully organized and managed. Lebanon suffers from unorganized agricultural land use. South Lebanon is taken as the study area, as it is the most fertile region and has a variety of crops. The study aims to identify and locate the most suitable areas to cultivate thirteen types of permanent trees, namely: apples, avocados, stone fruits in coastal regions and stone fruits in mountain regions, bananas, citrus, loquats, figs, pistachios, mangoes, olives, pomegranates, and grapes. Several geographical factors are taken as criteria for selecting the best location to cultivate. Soil, rainfall, pH, temperature, and elevation are the main inputs used to create the final map. Input data for each factor are managed, visualized, and analyzed using a Geographic Information System (GIS). GIS management tools are implemented to produce input maps capable of identifying suitable areas related to each index. The combination of the different index maps generates the final output map of suitable locations for the best permanent tree productivity. The output map is reclassified into three suitability classes: low, moderate, and high suitability. Results show different locations suitable for different kinds of trees. Results also reflect the importance of GIS in helping decision makers find the most suitable location for every tree to achieve higher productivity and a variety of crops.
Keywords: agricultural production, crop management, geographical factors, Geographic Information System, GIS, land capability, permanent trees, suitable location
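The overlay-and-reclassify workflow described above can be sketched in a few lines of Python; the synthetic criterion grids, equal-ish weights, and class breaks below are illustrative assumptions standing in for the actual soil, rainfall, pH, temperature, and elevation rasters.

```python
import numpy as np

def normalize(raster):
    """Rescale a criterion raster to 0-1 so that layers can be combined."""
    return (raster - raster.min()) / (raster.max() - raster.min() + 1e-9)

# Synthetic 100 x 100 criterion layers standing in for the GIS input grids.
rng = np.random.default_rng(1)
layers = [rng.random((100, 100)) for _ in range(4)]
weights = [0.3, 0.3, 0.2, 0.2]                            # illustrative weights only

suitability = sum(w * normalize(l) for w, l in zip(weights, layers))
classes = np.digitize(suitability, bins=[0.33, 0.66])     # 0 = low, 1 = moderate, 2 = high
print(np.bincount(classes.ravel()))                       # cell counts per suitability class
```

The same reclassification step would be repeated per tree species, since each crop has its own threshold values for the underlying factors.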
Procedia PDF Downloads 146
945 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function
Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros
Abstract:
The biodiversity strategy of the European Union for 2030 mentions climate change as one of the key factors for biodiversity loss and considers green infrastructure as one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at granting the provision of a wide number of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet, there are not many tools available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem services potential without considering possible trade-offs. This can lead to excluding areas with a high potential for providing ecosystem services which have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem services trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting green infrastructure multifunctional buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered in groups, so that ecosystem services that create trade-offs are kept apart within each group. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. Then the potential maps for each group are combined in a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm. The algorithm is run both using the proposed methodology and considering the ecosystem services individually. The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. It has been observed that the proposed methodology increases the number of ecosystem services produced by delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change
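The following Python sketch illustrates, under stated assumptions, how a simulated annealing loop can select buffer cells from a combined per-cell maximum of grouped potential maps; the synthetic maps, compactness weight, cooling schedule, and cell budget are placeholders, not the authors' calibrated settings.

```python
import numpy as np

rng = np.random.default_rng(2)

# Three groups of ecosystem-service potential maps; trade-off services are kept in separate groups.
group_maps = [rng.random((60, 60)) for _ in range(3)]
combined = np.maximum.reduce([(g - g.min()) / (g.max() - g.min()) for g in group_maps])

def objective(mask):
    """Reward total service potential of selected cells and penalise fragmentation."""
    potential = combined[mask].sum()
    # Crude compactness term: count selected cells whose right/lower neighbour is also selected.
    adjacency = (mask[:, :-1] & mask[:, 1:]).sum() + (mask[:-1, :] & mask[1:, :]).sum()
    return potential + 0.05 * adjacency

def simulated_annealing(n_cells=400, steps=20000, t0=1.0, cooling=0.9995):
    mask = np.zeros(combined.shape, dtype=bool)
    mask.ravel()[rng.choice(combined.size, n_cells, replace=False)] = True
    score, t = objective(mask), t0
    for _ in range(steps):
        out_cell = rng.choice(np.flatnonzero(mask.ravel()))      # a currently selected cell
        in_cell = rng.choice(np.flatnonzero(~mask.ravel()))      # a currently unselected cell
        mask.ravel()[[out_cell, in_cell]] = [False, True]        # propose a swap
        new_score = objective(mask)
        if new_score >= score or rng.random() < np.exp((new_score - score) / t):
            score = new_score                                    # accept the move
        else:
            mask.ravel()[[out_cell, in_cell]] = [True, False]    # revert the swap
        t *= cooling
    return mask, score

buffer_mask, best = simulated_annealing()
print(best, buffer_mask.sum())
```

Running the same loop on each service map individually, instead of on the grouped maximum, reproduces the comparison scenario mentioned in the abstract.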
Procedia PDF Downloads 182
944 Deep Learning for Renewable Power Forecasting: An Approach Using LSTM Neural Networks
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Load forecasting has become crucial in recent years and has become a popular topic in the forecasting area. Many different power forecasting models have been tried out for this purpose. Electricity load forecasting is necessary for energy policies and for healthy and reliable grid systems. Effective power forecasting of renewable energy load leads decision makers to minimize the costs of electric utilities and power plants. Forecasting tools are required that can be used to predict how much renewable energy can be utilized. The purpose of this study is to explore the effectiveness of LSTM-based neural networks for estimating renewable energy loads. In this study, we present models for predicting renewable energy loads based on deep neural networks, especially Long Short-Term Memory (LSTM) algorithms. Deep learning allows multiple layers of models to learn representations of data. LSTM algorithms are able to store information for long periods of time. Deep learning models have recently been used to forecast renewable energy sources, such as predicting wind and solar power. Historical load and weather information represent the most important variables for the inputs of power forecasting models. The dataset contains power consumption measurements gathered between January 2016 and December 2017 at one-hour resolution. The models use publicly available data from the Turkish Renewable Energy Resources Support Mechanism. Forecasting studies have been carried out with these data via a deep neural network approach, including the LSTM technique, for Turkish electricity markets. 432 different models were created by changing the layer cell counts and dropout. The adaptive moment estimation (ADAM) algorithm is used for training as a gradient-based optimizer instead of SGD (stochastic gradient descent). ADAM performed better than SGD in terms of faster convergence and lower error rates. Model performance is compared according to MAE (Mean Absolute Error) and MSE (Mean Squared Error). The best MAE results out of the 432 tested models are 0.66, 0.74, 0.85, and 1.09. The forecasting performance of the proposed LSTM models gives successful results compared to the literature.
Keywords: deep learning, long short-term memory, energy, renewable energy load forecasting
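For orientation, a minimal Keras sketch of an LSTM forecaster trained with ADAM and scored by MSE/MAE is given below; the synthetic hourly series, window length, layer size, dropout rate, and epoch count are illustrative assumptions and do not reproduce any of the 432 configurations tested in the study.

```python
import numpy as np
import tensorflow as tf

def make_windows(series, window=24):
    """Turn an hourly load series into (samples, window, 1) inputs and next-hour targets."""
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y

# Synthetic hourly renewable-load series standing in for the 2016-2017 data.
t = np.arange(24 * 365, dtype=float)
load = 50 + 10 * np.sin(2 * np.pi * t / 24) + np.random.normal(0, 1.0, t.size)
X, y = make_windows(load)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(64, input_shape=(X.shape[1], 1)),
    tf.keras.layers.Dropout(0.2),          # dropout was one of the hyperparameters varied
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse", metrics=["mae"])   # ADAM optimizer, MSE/MAE scoring
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2, verbose=0)
print(model.evaluate(X, y, verbose=0))     # [MSE, MAE]
```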
Procedia PDF Downloads 269
943 Application of DSSAT-CSM Model for Estimating Rain-Water Productivity of Maize (Zea Mays L.) Under Changing Climate of Central Rift Valley, Ethiopia
Authors: Fitih Ademe, Kibebew Kibret, Sheleme Beyene, Mezgebu Getnet, Gashaw Meteke
Abstract:
Pressing demand for agricultural products and the associated pressure on water availability in semi-arid areas demand information for strategic decision-making under the changing climate conditions of Ethiopia. Availing such information through traditional agronomic research methods is not sufficient unless supported through the application of decision-support tools. The CERES (Crop Environmental Resource Synthesis) model in DSSAT-CSM was evaluated for estimating yield and water productivity of maize under two soil types (Andosol and Luvisol) of the Central Rift Valley of Ethiopia. Six years of data (2010-2017) obtained from national fertilizer determination experiments were used for model evaluation. Pertinent statistical indices were employed to evaluate model performance. Following model evaluation, yield and rain-water productivity of maize were assessed for the baseline (1981-2010) and future climate (2050s and 2080s) scenarios. The model performed well in predicting phenology, growth, and yield of maize for the different seasons and phosphorus rates. A good agreement between simulated and observed grain yield was indicated by low values of the RMSE (0.15 - 0.37 Mg/ha) and other indices for the two soil types. The evaluated model predicted a decline in the potential yield (by 23.8 to 26.7% at Melkassa and by 21.7 to 26.1% at Ziway under the RCP4.5 and RCP8.5 climate change scenarios, respectively) and in the water-limited yield (by 15 to 18.3% at Melkassa and by 6.5 to 10.5% at Ziway) by the mid-century due to climate change. Consequently, a decline in water productivity was projected in the future periods, which necessitates availing options to improve water productivity in the region. In conclusion, the DSSAT-CERES-maize model can be used to simulate maize (Melkassa-2) phenology, growth, and grain yield, as well as water productivity, under different management scenarios, which can help to identify options to improve water productivity in the changing climate of the semi-arid Central Rift Valley of Ethiopia.
Keywords: andosol, CERES-maize, luvisol, model evaluation, water productivity
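As a small worked example of the statistical indices used in such model evaluations, the Python sketch below computes RMSE and Willmott's index of agreement for a hypothetical set of observed versus simulated grain yields; the yield values are invented for illustration and are not the study's data.

```python
import numpy as np

def rmse(obs, sim):
    """Root mean square error between observed and simulated values."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return np.sqrt(np.mean((sim - obs) ** 2))

def willmott_d(obs, sim):
    """Willmott's index of agreement, a common companion to RMSE in crop-model evaluation."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    num = np.sum((sim - obs) ** 2)
    den = np.sum((np.abs(sim - obs.mean()) + np.abs(obs - obs.mean())) ** 2)
    return 1.0 - num / den

# Hypothetical observed vs simulated maize grain yields (Mg/ha) over six seasons.
observed  = [4.1, 3.6, 5.0, 4.4, 3.9, 4.8]
simulated = [4.3, 3.4, 4.8, 4.6, 4.1, 4.5]
print(round(rmse(observed, simulated), 3), round(willmott_d(observed, simulated), 3))
```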
Procedia PDF Downloads 79
942 A Study of Smartphone Engagement Patterns of Millennial in India
Authors: Divyani Redhu, Manisha Rathaur
Abstract:
India has emerged as a very lucrative market for smartphones in a very short span of time. The number of smartphone users here is growing massively with each passing day. Also, the expansion of internet services to the far corners of the nation has given a push to the smartphone revolution in India. Millennials, also known as Generation Y or the Net Generation, are the generation born between the early 1980s and mid-1990s (some definitions extending further to the early 2000s). Spanning roughly over 15 years, different social classes, cultures, and continents, it is irrational to imagine that millennials have a unified identity. But still, it cannot be denied that the growing millennial population is not only young but highly tech-savvy too. It is not just the appearance of the device that leads us today to call it ‘smart’. Rather, it is the numerous tasks and functions that it can perform which have led its name to evolve into that of a ‘smartphone’. From the usual tasks that were earlier performed by a simple mobile phone, like making calls, sending messages, clicking photographs, recording videos, etc., the time has come when most of our day-to-day tasks are being taken care of by our all-time companion, i.e., the smartphone. From being our alarm clock to being our note-maker, from our watch to our radio, our book-reader to our reminder, smartphones are present everywhere. The smartphone has now become an essential device, particularly for millennials, to communicate not only with their friends but also with their family, colleagues, and teachers. The study by the researchers would be quantitative in nature. For this, a survey would be conducted in the capital of India, i.e., Delhi, and the National Capital Region (NCR), which is the metropolitan area covering the entire National Capital Territory of Delhi and urban areas covering the states of Haryana, Uttarakhand, Uttar Pradesh, and Rajasthan. The tool of the survey would be a questionnaire, and the number of respondents would be 200. The results derived from the study would primarily focus on the increasing reach of smartphones in India, smartphones as technological innovations and convergent tools, smartphone usage patterns of millennials in India, the applications most used by millennials, the average time spent by them, the impact of smartphones on the personal interactions of millennials, etc. Thus, talking about smartphone technology and millennials in India, it would not be wrong to say that the growth, as well as the potential, of smartphones in India is still immense. Also, very few technologies have made it possible to give global exposure to users, and the smartphone, if not the only one, is certainly an immensely effective one that comes to mind in this case.
Keywords: Delhi-NCR, India, millennials, smartphone
Procedia PDF Downloads 144
941 Smart Books as a Supporting Tool for Developing Skills of Designing and Employing Webquest 2.0
Authors: Huda Alyami
Abstract:
The present study aims to measure the effectiveness of an "Interactive eBook" in developing the skills of designing and employing webquests for female intern teachers. The study uses a descriptive analytical methodology as well as a quasi-experimental methodology. The sample of the study consists of (30) female intern teachers from the Department of Special Education (in the tracks of Gifted Education and Learning Difficulties), during the first semester of the academic year 2015, at King Abdul-Aziz University in Jeddah city. The sample is divided into (15) female intern teachers for the experimental group and (15) female intern teachers for the control group. A set of qualitative and quantitative tools has been prepared and verified for the study, comprising: a list of the skills of designing webquests, a list of the skills of employing webquests, a webquest knowledge achievement test, a product rating card, an observation card, and an interactive ebook. The study reaches the following results: 1. After pre-control, there are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and the control groups in the post measurement of the webquest knowledge achievement test, in favor of the experimental group. 2. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the product rating card, in favor of the experimental group. 3. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the observation card, in favor of the experimental group. In light of these findings, the study recommends the following: taking advantage of interactive ebooks when teaching all educational courses for various disciplines at the university level, and creating educational participative platforms to share educational interactive ebooks for various disciplines at the local and regional levels. The study suggests conducting further qualitative studies on the effectiveness of interactive ebooks, in addition to conducting studies on the use of Web 2.0 in webquests.
Keywords: interactive eBook, webquest, design, employing, develop skills
Procedia PDF Downloads 185
940 Process of the Emergence and Evolution of Socio-Cultural Ideas about the "Asian States" In the Context of the Development of US Cinema in 1941-1945
Authors: Selifontova Darya Yurievna
Abstract:
The study of the process of the emergence and evolution of socio-cultural ideas about the "Asian states" in the context of the development of US cinema in 1941-1945 will both contribute to the approbation of a new approach to a classical subject and allow the methodological tools of history, political science, philology, and sociology to be used for understanding modern military-political, historical, ideological, and socio-cultural processes through a concrete example. This is especially important for understanding the process of constructing the image of the Japanese Empire in the USA. Assessments and images of China and Japan in World War II, created in American cinema, had an immediate impact on the media, public sentiment, and opinions. During the war, US cinema created new myths and actively exploited old ones, combining them with traditional Hollywood cliches; all this served as a basis for creating the images of China and the Japanese Empire on the screen, which were necessary to solve many foreign policy and domestic political tasks related to the construction of two completely different, but at the same time similar, images of Asia (China and the Japanese Empire). In modern studies devoted to the history of wars, the study of the specifics of the information confrontation between the parties is in demand. A special role in this confrontation is played by propaganda through cinema, which uses images, historical symbols, and stable metaphors, the appeal to which can form a certain public reaction. Soviet documentaries of the war years are proof of this. The relevance of the topic is due to the fact that cinema as a means of propaganda was very popular and in demand during the Second World War. This period saw the creation of real masterpieces in the field of propaganda film; in the documentary space of the cinema of 1941-1945, the traditions of depicting the Second World War were laid down. The study of the peculiarities of the visualization and mythologization of the Second World War in Soviet cinema is the most important stage in studying the development of propaganda methods, since the methods and techniques of depicting the war formed in 1941-1945 are also significant at the present stage of the study of society.
Keywords: Asian countries, politics, sociology, domestic politics, USA, cinema
Procedia PDF Downloads 132
939 The Significance of Picture Mining in the Fashion and Design as a New Research Method
Authors: Katsue Edo, Yu Hiroi
Abstract:
Increasing attention has been paid to using pictures and photographs in social science research since the beginning of the 21st century. Meanwhile, we have been studying the usefulness of Picture Mining, which is one of the new approaches to such picture-based research. Picture Mining is an explorative research analysis method that takes useful information from pictures, photographs, and static or moving images. It is often compared with the methods of text mining. The Picture Mining concept includes observational research in the broad sense, because it also aims to analyze moving images (Ochihara and Edo 2013). In the recent literature, studies and reports using pictures are increasing due to environmental changes, identified as technological and social changes (Edo et al. 2013). Low-priced digital cameras and iPhones, high information transmission speeds, low costs for transferring information, and the high performance and resolution of mobile phone cameras have changed people's photographing behavior. Consequently, there is less resistance to taking and processing photographs for most people in the developing countries. In these studies, this method of collecting data from respondents is often called 'participant-generated photography' or 'respondent-generated visual imagery', which focuses on the collection of data and its analysis (Pauwels 2011, Snyder 2012). But there are few systematic and conceptual studies that support the significance of these methods. In recent years, we have worked to conceptualize these picture-based research methods and formalize theoretical findings (Edo et al. 2014). We have inductively identified, through case studies, the most efficient fields of Picture Mining in the following areas: 1) research in consumer and customer lifestyles, 2) new product development, and 3) research in fashion and design. Though we have found that it will be useful in these fields and areas, we must verify these assumptions. In this study, we focus on the field of fashion and design, to determine whether Picture Mining methods are really reliable in this area. In order to do so, we conducted empirical research on respondents' attitudes and behavior concerning pictures and photographs. We compared attitudes and behavior toward pictures of fashion with those toward pictures of meals and found that taking pictures of fashion is not as easy as taking pictures of meals and food. Respondents do not often take pictures of fashion and upload them online, for example to Facebook and Instagram, compared to meals and food, because of the difficulty of taking them. We concluded that we should be more careful in analyzing pictures in the fashion area, for there might still be some kind of bias, even though the environment for pictures has changed drastically in recent years.
Keywords: empirical research, fashion and design, Picture Mining, qualitative research
Procedia PDF Downloads 365
938 Macroscopic Support Structure Design for the Tool-Free Support Removal of Laser Powder Bed Fusion-Manufactured Parts Made of AlSi10Mg
Authors: Tobias Schmithuesen, Johannes Henrich Schleifenbaum
Abstract:
The additive manufacturing process laser powder bed fusion (LPBF) offers many advantages over conventional manufacturing processes. For example, almost any complex part can be produced, such as topologically optimized lightweight parts, which would be inconceivable with conventional manufacturing processes. A major challenge posed by the LPBF process, however, is, in most cases, the need to use and remove support structures on critically inclined part surfaces (α < 45° relative to the substrate plate). These are mainly used for dimensionally accurate mapping of part contours and to reduce distortion by absorbing process-related internal stresses. Furthermore, they serve to transfer the process heat to the substrate plate and are, therefore, indispensable for the LPBF process. A major challenge for the economical use of the LPBF process in industrial process chains is currently still the high manual effort involved in removing support structures. According to the state of the art (SoA), the parts are usually treated with simple hand tools (e.g., pliers, chisels) or by machining (e.g., milling, turning). New automatable approaches are the removal of support structures by means of wet chemical ablation and thermal deburring. According to the state of the art, the support structures are essentially adapted to the LPBF process and not to potential post-processing steps. The aim of this study is the determination of support structure designs that are adapted to the mentioned post-processing approaches. In the first step, the essential boundary conditions for complete removal by means of the respective approaches are identified. Afterward, a representative demonstrator part with various macroscopic support structure designs will be LPBF-manufactured and tested with regard to complete powder and support removability. Finally, based on the results, potentially suitable support structure designs for the respective approaches will be derived. The investigations are carried out on the example of the aluminum alloy AlSi10Mg.
Keywords: additive manufacturing, laser powder bed fusion, laser beam melting, selective laser melting, post processing, tool-free, wet chemical ablation, thermal deburring, aluminum alloy, AlSi10Mg
Procedia PDF Downloads 95
937 Emerging Technologies for Learning: In Need of a Pro-Active Educational Strategy
Authors: Pieter De Vries, Renate Klaassen, Maria Ioannides
Abstract:
This paper presents explorative research into the use of emerging technologies for teaching and learning in higher engineering education. The assumption is that these technologies and applications, which are not yet widely adopted, will help to improve education and, as such, actively address the skills mismatch affecting our industries. Technologies such as 3D printing, the Internet of Things, Virtual Reality, and others are in a dynamic state of development, which makes it difficult to grasp their value for education. Also, the instruments of current educational research seem inappropriate for assessing the value of such technologies. This explorative research aims to foster an approach to better deal with this new complexity. The need to find out is urgent, because these technologies will be dominantly present in the near future in all aspects of life, including education. The methodology used in this research comprised an inventory of emerging technologies and tools that potentially give way to innovation and are used, or are about to be used, in technical universities. The inventory was based on both a literature review and a review of reports and web resources such as blogs, and included a series of interviews with stakeholders in engineering education and at representative industries. In addition, a number of small experiments were executed with the aim of analyzing the requirements for the use of, in this case, Virtual Reality and the Internet of Things, to better understand the opportunities and limitations in the day-to-day learning environment. The major findings indicate that it is rather difficult to decide on the value of these technologies for education due to the dynamic state of change, and therefore unpredictability, and the lack of a coherent policy at the institutions. Most decisions are being made by teachers on an individual basis, who in their micro-environment are not equipped to select, test, and ultimately decide about the use of these technologies. Most experience is being gained in industry, where the skills to handle these technologies are in high demand. Industry, though, is worried about the inclination and the capability of education to help bridge the skills gap related to the emergence of new technologies. Due to the complexity, the diversity, the speed of development, and the decay, education is challenged to develop an approach that can make these technologies work in an integrated fashion. For education to fully profit from the opportunities these technologies offer, it is essential to develop a pro-active strategy and a sustainable approach to frame the development of emerging technologies.
Keywords: emerging technologies, internet of things, pro-active strategy, virtual reality
Procedia PDF Downloads 194
936 Possibilities to Evaluate the Climatic and Meteorological Potential for Viticulture in Poland: The Case Study of the Jagiellonian University Vineyard
Authors: Oskar Sekowski
Abstract:
Current global warming causes changes in the traditional zones of viticulture worldwide. During the 20th century, the average global air temperature increased by 0.89˚C. Models of climate change indicate that viticulture, currently concentrated in narrow geographic niches, may move towards the poles, to higher geographic latitudes. Global warming may cause changes in traditional viticulture regions. Therefore, there is a need to estimate the climatic conditions and climate change in areas that are not traditionally associated with viticulture, e.g., Poland. The primary objective of this paper is to prepare a methodology to evaluate the climatic and meteorological potential for viticulture in Poland based on a case study. Moreover, an additional aim is to evaluate the climatic potential of a mesoregion where a university vineyard is located. Daily data on temperature, precipitation, insolation, and wind speed (1988-2018) from the meteorological station located in Łazy, southern Poland, were used to evaluate 15 climatological parameters and indices connected with viticulture. The next steps of the methodology are based on Geographic Information System methods. Topographical factors such as slope gradient and slope exposure were derived using Digital Elevation Models. The spatial distribution of climatological elements was interpolated by ordinary kriging. The values of each factor and index were also ranked and classified. The viticultural potential was determined by integrating two suitability maps, i.e., the topographical and climatic ones, and by calculating the average for each pixel. Data analysis shows significant changes in heat accumulation indices that are driven by increases in maximum temperature, mostly an increasing number of days with Tmax > 30˚C. The climatic conditions of this mesoregion are sufficient for Vitis vinifera viticulture. The values of the indicators and insolation are similar to those in known wine regions located at similar geographical latitudes in Europe. The smallest threat to viticulture in the study area is the occurrence of hail, and the greatest is the occurrence of frost in winter. This research provides the basis for evaluating the general suitability and climatologic potential for viticulture in Poland. To characterize the climatic potential for viticulture, it is necessary to assess the suitability of all climatological and topographical factors that can influence viticulture. The methodology used in this case study shows places where vineyards could be established. It may also help winemakers select grape varieties.
Keywords: climatologic potential, climatic classification, Poland, viticulture
Procedia PDF Downloads 109
935 Application of GIS Techniques for Analysing Urban Built-Up Growth of Class-I Indian Cities: A Case Study of Surat
Authors: Purba Biswas, Priyanka Dey
Abstract:
Worldwide, rapid urbanisation has accelerated city expansion in both developed and developing nations. This unprecedented urbanisation trend, driven by increasing population and economic growth, has caused challenges for decision-makers in city planning and urban management. Metropolitan cities, class-I towns, and major urban centres undergo a continuous process of evolution due to the interaction between socio-cultural and economic attributes. This constant evolution leads to urban expansion in all directions. Understanding the patterns and dynamics of urban built-up growth is crucial for policymakers, urban planners, and researchers, as it aids in resource management, decision-making, and the development of sustainable strategies to address the complexities associated with rapid urbanisation. Identifying spatio-temporal patterns of urban growth has emerged as a crucial challenge in monitoring and assessing present and future trends in urban development. Analysing urban growth patterns and tracking changes in land use is an important aspect of urban studies. This study analyses spatio-temporal urban transformations and land-use and land-cover changes using remote sensing and GIS techniques. Built-up growth analysis has been done for the city of Surat as a case example, using the GIS tool of NDBI and the GIS models of the Built-up Urban Density Index and Shannon Entropy Index to identify trends and the geographical direction of transformation from 2005 to 2020. Surat is one of the fastest-growing urban centres in both the state and the nation, ranking as the 4th fastest-growing city globally. This study analyses the dynamics of urban built-up area transformations both zone-wise and geographical direction-wise, for which their trend, rate, and magnitude were calculated over a period of 15 years. This study also highlights the need for analysing and monitoring the urban growth patterns of class-I cities in India using spatio-temporal and quantitative techniques like GIS for improved urban management.
Keywords: urban expansion, built-up, geographic information system, remote sensing, Shannon’s entropy
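As a worked illustration of the Shannon entropy measure used above, the Python sketch below computes the relative entropy of built-up area across directional zones for two years; the zone areas are hypothetical values, not the Surat measurements.

```python
import numpy as np

def relative_shannon_entropy(built_up_area_by_zone):
    """Relative Shannon's entropy across n zones (0 = compact growth, 1 = dispersed sprawl)."""
    a = np.asarray(built_up_area_by_zone, dtype=float)
    p = a / a.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(a)))

# Hypothetical built-up area (km^2) in eight directional zones of a city for two years.
zones_2005 = [12.0, 8.5, 6.0, 4.0, 9.5, 7.0, 5.5, 3.5]
zones_2020 = [20.0, 17.5, 14.0, 12.0, 18.0, 15.0, 13.5, 11.0]
print(relative_shannon_entropy(zones_2005), relative_shannon_entropy(zones_2020))
# A value moving toward 1 over time indicates increasingly dispersed (sprawling) growth.
```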
Procedia PDF Downloads 78
934 A Simulated Evaluation of Model Predictive Control
Authors: Ahmed AlNouss, Salim Ahmed
Abstract:
Process control refers to the techniques used to control the variables in a process in order to maintain them at their desired values. Advanced process control (APC) is a broad term within the domain of control that refers to different kinds of process control and control-related tools, for example, model predictive control (MPC), statistical process control (SPC), fault detection and classification (FDC), and performance assessment. APC is often used for solving multivariable control problems, and model predictive control (MPC) is one of only a few advanced control methods used successfully in industrial control applications. Advanced control is expected to bring many benefits to plant operation; however, the extent of the benefits is plant specific, and the application needs a large investment. This requires an analysis of the expected benefits before the implementation of the control. In a real plant, simulation studies are carried out along with some experimentation to determine the improvement in the performance of the plant due to advanced control. In this research, such an exercise is undertaken to realize the needs of APC application. The main objectives of the paper are as follows: (1) To apply MPC to a number of simulations set up to realize the need for MPC by comparing its performance with that of proportional-integral-derivative (PID) controllers. (2) To study the effect of controller parameters on control performance. (3) To develop appropriate performance indices (PI) to compare the performance of different controllers and to develop a novel idea for presenting the tuning map of a controller. These objectives were achieved by applying a PID controller and a special type of MPC, namely dynamic matrix control (DMC), to the multi-tank process simulated in Loop-Pro. The controller performance was then evaluated by changing the controller parameters. This performance evaluation was based on special indices related to the difference between set point and process variable in order to compare both controllers. The same principle was applied to the continuous stirred tank heater (CSTH) and continuous stirred tank reactor (CSTR) processes simulated in Matlab. For these processes, dedicated programs were written to evaluate the performance of the PID and MPC controllers. Finally, these performance indices, along with their controller parameters, were plotted using a special program called SigmaPlot. As a result, the improvement in the performance of the control loops was quantified using relevant indices to justify the need for and importance of advanced process control. Also, it has been shown that, by using appropriate indices, a predictive controller can improve the performance of the control loop significantly.
Keywords: advanced process control (APC), control loop, model predictive control (MPC), proportional-integral-derivative (PID), performance indices (PI)
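To make the idea of set-point-error performance indices concrete, the Python sketch below computes the classic IAE, ISE, and ITAE indices for two hypothetical step responses; the first-order response curves are illustrative stand-ins, not outputs from the Loop-Pro or Matlab simulations.

```python
import numpy as np

def performance_indices(setpoint, pv, dt=1.0):
    """Classic loop-performance indices based on the setpoint/process-variable error."""
    e = np.asarray(setpoint) - np.asarray(pv)
    t = np.arange(len(e)) * dt
    return {
        "IAE":  np.trapz(np.abs(e), dx=dt),        # integral of absolute error
        "ISE":  np.trapz(e ** 2, dx=dt),           # integral of squared error
        "ITAE": np.trapz(t * np.abs(e), dx=dt),    # time-weighted absolute error
    }

# Hypothetical step responses: a sluggish PID loop vs a faster predictive loop.
time = np.arange(0, 50, 1.0)
sp = np.ones_like(time)
pid_pv = 1 - np.exp(-time / 12.0)
mpc_pv = 1 - np.exp(-time / 4.0)
print(performance_indices(sp, pid_pv))
print(performance_indices(sp, mpc_pv))   # lower indices indicate better tracking
```

Sweeping the controller tuning parameters and plotting the resulting indices is what produces the tuning-map style comparison described above.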
Procedia PDF Downloads 410933 Bioleaching of Metals Contained in Spent Catalysts by Acidithiobacillus thiooxidans DSM 26636
Authors: Andrea M. Rivas-Castillo, Marlenne Gómez-Ramirez, Isela Rodríguez-Pozos, Norma G. Rojas-Avelizapa
Abstract:
Spent catalysts are considered hazardous residues of major concern, mainly due to the simultaneous presence of several metals in elevated concentrations. Although hydrometallurgical, pyrometallurgical, and chelating-agent methods are available to remove and recover some of the metals contained in spent catalysts, these procedures generate potentially hazardous wastes and emit harmful gases. Thus, biotechnological treatments are currently gaining importance as a way to avoid the negative impacts of chemical technologies. To this end, diverse microorganisms have been used to assess the removal of metals from spent catalysts, comprising bacteria, archaea, and fungi, whose resistance and metal-uptake capabilities differ depending on the microorganism tested. Acidophilic sulfur-oxidizing bacteria, namely Acidithiobacillus thiooxidans and Acidithiobacillus ferrooxidans, have been used to investigate the biotreatment and extraction of valuable metals from spent catalysts, as they are able to produce leaching agents such as sulfuric acid and sulfur-oxidation intermediates. In the present work, the ability of A. thiooxidans DSM 26636 to bioleach the metals contained in five different spent catalysts was assessed by growing the culture in modified Starkey mineral medium (with elemental sulfur at 1%, w/v) and 1% (w/v) pulp density of each residue for up to 21 days at 30 °C and 150 rpm. Sulfur-oxidizing activity was evaluated periodically by determining the sulfate concentration in the supernatants according to the NMX-k-436-1977 method. The production of sulfuric acid in the supernatants was assessed as well, by a titration procedure using 0.5 M NaOH with bromothymol blue as the acid-base indicator, and by measuring pH with a digital potentiometer. Inductively coupled plasma optical emission spectrometry was used to analyze metal removal from the five spent catalysts by A. thiooxidans DSM 26636. The results show that, as could be expected, sulfuric acid production is directly related to the decrease in pH and to the highest metal-removal efficiencies. Al and Fe were consistently removed from refinery spent catalysts regardless of their origin and previous usage, although the removals varied from 9.5 ± 2.2 to 439 ± 3.9 mg/kg for Al and from 7.13 ± 0.31 to 368.4 ± 47.8 mg/kg for Fe, depending on the spent catalyst tested. In addition, bioleaching of metals such as Mg, Ni, and Si was obtained from automotive spent catalysts, with removals of up to 66 ± 2.2, 6.2 ± 0.07, and 100 ± 2.4, respectively. Hence, the data presented here show the potential of A. thiooxidans DSM 26636 for the simultaneous bioleaching of metals contained in spent catalysts of diverse provenance. Keywords: bioleaching, metal removal, spent catalysts, Acidithiobacillus thiooxidans
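As a back-of-the-envelope illustration of the acid titration mentioned above, the sketch below estimates the sulfuric acid concentration in a supernatant from the volume of 0.5 M NaOH consumed at the end point; since H2SO4 is diprotic, two moles of NaOH neutralise one mole of acid. The titration volumes in the example are assumed, not the study's data.

```python
# Illustrative titration calculation (assumed volumes, not the study's measurements).
def h2so4_concentration(v_naoh_ml, v_sample_ml, c_naoh=0.5):
    """Estimate H2SO4 molarity in a supernatant aliquot from an NaOH titration."""
    mol_naoh = c_naoh * v_naoh_ml / 1000.0      # mol of NaOH at the end point
    mol_h2so4 = mol_naoh / 2.0                  # diprotic acid: 2 NaOH per H2SO4
    return mol_h2so4 / (v_sample_ml / 1000.0)   # mol/L in the original sample

# e.g. 8.4 mL of 0.5 M NaOH to neutralise a 10 mL supernatant aliquot -> 0.21 M
print(round(h2so4_concentration(8.4, 10.0), 3), "M H2SO4")
```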
Procedia PDF Downloads 144932 Exploration of Building Information Modelling Software to Develop Modular Coordination Design Tool for Architects
Authors: Muhammad Khairi bin Sulaiman
Abstract:
The utilization of Building Information Modelling (BIM) in the construction industry has given designers in the Architecture, Engineering and Construction (AEC) industry an opportunity to move from the conventional method of manual drafting to an approach that creates alternative designs quickly and produces more accurate, reliable and consistent outputs. By using BIM software, designers can create digital content whose data are manipulated through BIM's parametric model. With BIM software, more design alternatives can be created quickly and design problems can be explored further to produce a better design faster than with conventional design methods. Generally, however, BIM is used as a documentation mechanism, and its capabilities as a design tool have not been fully explored and utilised. In this context, Modular Coordination (MC) design is encouraged as a sustainable design practice, since MC design reduces material wastage through standard dimensioning, pre-fabrication, and repetitive, modular construction and components. However, MC design involves a complex process of rules and dimensions; therefore, a tool is needed to make this process easier. Since the parameters in BIM can easily be manipulated to follow MC rules and dimensioning, the integration of BIM software with MC design is proposed for architects during the design stage. With this tool, the acceptance and effective application of MC design are expected to improve. Consequently, this study will analyse and explore the function and customization of BIM objects and the capability of BIM software to expedite the application of MC design during the design stage for architects. With this application, architects will be able to create building models and locate objects within reference modular grids that adhere to MC rules and dimensions. The parametric modelling capabilities of BIM will also act as a visual tool that further enhances the automation of the three-dimensional space-planning modelling process. (Method) The study will first analyse and explore the parametric modelling capabilities of rule-based BIM objects and eventually customize a reference grid that follows the rules and dimensioning of MC. Ultimately, the approach will further enhance the architect's overall design process and enable architects to automate complex modelling, which was nearly impossible before. A prototype using a residential quarter will be modelled, and a set of reference grids guided by specific MC rules and dimensions will be used to develop a variety of space-planning configurations. In use, the tool will expedite the design process and encourage the use of MC design in the construction industry. Keywords: building information modeling, modular coordination, space planning, customization, BIM application, MC space planning
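As a conceptual illustration of a modular reference grid, the sketch below generates grid-line positions and snaps a designed dimension to the nearest multimodule. It assumes the basic module M = 100 mm and a 3M (300 mm) multimodule, which are common modular coordination conventions; it is not tied to any particular BIM software API.

```python
# Conceptual sketch of a modular-coordination reference grid (assumed M = 100 mm, 3M multimodule).
M = 100            # basic module in millimetres
MULTI = 3 * M      # assumed horizontal multimodule (3M = 300 mm)

def reference_grid(length_mm, width_mm, step=MULTI):
    """Return x and y grid-line positions covering the given extents."""
    xs = list(range(0, length_mm + 1, step))
    ys = list(range(0, width_mm + 1, step))
    return xs, ys

def snap_to_module(dimension_mm, step=MULTI):
    """Round a designed dimension to the nearest multiple of the multimodule."""
    return round(dimension_mm / step) * step

xs, ys = reference_grid(7200, 4800)
print(len(xs), len(ys), snap_to_module(3470))   # 3470 mm snaps to 3600 mm
```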
Procedia PDF Downloads 87931 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 1: Overview and Activities in Chemical Processing Facility
Authors: Kazunori Nomura, Hiromichi Ogi, Masaumi Nakahara, Sou Watanabe, Atsuhiro Shibata
Abstract:
The Chemical Processing Facility of the Japan Atomic Energy Agency is a basic research facility for the development of advanced back-end technologies using actual high-level radioactive materials, such as irradiated fuel from the fast reactor and high-level liquid waste from the reprocessing plant. As is in the nature of a research facility, various kinds of chemical reagents have been used for fundamental tests. Most of them were treated properly and stored in the liquid waste vessel installed in the facility, but some were not treated and remained in the experimental space as a kind of legacy waste, which must be treated safely. Meanwhile, the Medium- and Long-Term Management Plan of Japan Atomic Energy Agency Facilities was formulated; this comprehensive plan designates the Chemical Processing Facility as one of the facilities to be decommissioned. Even when the plan is executed, prior treatment of the “legacy” waste is a necessary step for the decommissioning operation. Under these circumstances, we launched a collaborative research project called the STRAD project, which stands for Systematic Treatment of Radioactive liquid waste for Decommissioning, in order to develop treatment processes for the wastes of the nuclear research facility. In this project, decomposition methods have been developed for chemicals that cause troublesome phenomena such as corrosion and explosion, and there is a prospect that they can be decomposed in the facility by simple methods. Solidification of the aqueous and organic liquid wastes after decomposition has also been studied by adding cement or coagulants. Furthermore, experimental tools made of various materials were treated, with an effort to stabilize and compact them before packing them into waste containers; this is expected to reduce the number of solid-waste shipments and to widen the available operation space. Some achievements of these studies are presented in this paper. The project is expected to contribute beneficial waste management outcomes that can be shared worldwide. Keywords: chemical processing facility, medium- and long-term management plan of JAEA facilities, STRAD project, treatment of radioactive waste
Procedia PDF Downloads 149930 Critical Evaluation of the Transformative Potential of Artificial Intelligence in Law: A Focus on the Judicial System
Authors: Abisha Isaac Mohanlal
Abstract:
Amidst the suspicion and cynicism raised by the legal fraternity, Artificial Intelligence has found its way into the legal system and has revolutionized the conventional forms of legal services delivery. Be it legal argumentation and research or the resolution of complex legal disputes, artificial intelligence has crept into every branch of modern-day legal services. Its impact has been felt largely by way of big data, legal expert systems, prediction tools, e-lawyering, automated mediation, etc., and lawyers around the world are forced to upgrade themselves and their firms to stay in line with the growth of technology in law. Researchers predict that the future of legal services will belong to artificial intelligence and that the age of human lawyers will soon fade. But as far as the Judiciary is concerned, even in developed countries, the system has not fully drifted away from the orthodoxy of preferring Natural Intelligence over Artificial Intelligence. Since judicial decision-making involves many unstructured and rather unprecedented situations that have no single correct answer, and since looming questions of legal interpretation arise in most cases, discretion and Emotional Intelligence play an unavoidable role. Added to that, there are several ethical, moral, and policy issues to be confronted before permitting the intrusion of Artificial Intelligence into the judicial system. As of today, the human judge is the unrivalled master of most judicial systems around the globe. Yet, Artificial Intelligence scientists claim that robot judges can replace human judges irrespective of how daunting the complexity of the issues is and how sophisticated the required cognitive competence is. They go on to contend that even if the system is too rigid to allow robot judges to substitute for human judges in the near future, Artificial Intelligence may still aid in other judicial tasks such as drafting judicial documents, intelligent document assembly, and case retrieval, and may also promote overall flexibility, efficiency, and accuracy in the disposal of cases. By deconstructing the major challenges that Artificial Intelligence has to overcome in order to successfully invade the human-dominated judicial sphere, and by critically evaluating the potential differences it would make in the system of justice delivery, the author argues that the penetration of Artificial Intelligence into the Judiciary could well be enhancing and reparative, if not fully transformative. Keywords: artificial intelligence, judicial decision making, judicial systems, legal services delivery
Procedia PDF Downloads 228929 The Spatial Classification of China near Sea for Marine Biodiversity Conservation Based on Bio-Geographical Factors
Abstract:
Global biodiversity continues to decline as a result of global climate change and various human activities, such as habitat destruction, pollution, the introduction of alien species, and overfishing. Although marine organisms around the world are more or less interconnected, clear geographical boundaries make it easier to assess and manage different biogeographical zones. Area-based management tools (ABMTs) are therefore considered the most effective means for the conservation and sustainable use of marine biodiversity. On a large scale, geographical gaps (or barriers) are the main factors influencing the connectivity, dispersal, and ecological and evolutionary processes of marine organisms, which results in different distribution patterns. On a small scale, the relevant factors include geographical location, geology, geomorphology, water depth, currents, temperature, salinity, etc. Therefore, the analysis of geographic and environmental factors is of great significance in the study of biodiversity characteristics. This paper summarizes the marine spatial classifications and ABMTs used in coastal areas, the open ocean, and the deep sea. It then analyses the principles and methods of marine spatial classification based on biogeography-related factors, takes the China Near Sea (CNS) area as a case study, selects key biogeography-related factors, and carries out marine spatial classification at the biological region, ecological region, and biogeographical unit scales. The research shows that the CNS is divided into five biological regions according to climatic and geographical differences: the Yellow Sea, the Bohai Sea, the East China Sea, the Taiwan Straits, and the South China Sea. The bioregions are then divided into 12 ecological regions according to typical ecological and administrative factors, and finally the eco-regions are divided into 98 biogeographical units according to benthic substrate type, depth, coastal type, water temperature, and salinity; to preserve the integrity of biological and ecological processes, the area of each biogeographical unit is not less than 1,000 km². This research is of great use for coastal management and biodiversity conservation by local and central government, and it provides important scientific support for the future spatial planning and management of coastal waters and the sustainable use of marine biodiversity. Keywords: spatial classification, marine biodiversity, bio-geographical, conservation
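As a purely illustrative sketch of such a nested, rule-based classification, the code below assigns a hypothetical sampling site to a bioregion, eco-region, and biogeographical unit using a few of the factors named above (depth, substrate, salinity); the thresholds and labels are assumptions, not the study's classification criteria.

```python
# Illustrative nested classification (hypothetical thresholds, not the study's criteria).
from dataclasses import dataclass

@dataclass
class Site:
    sea: str            # e.g. "Yellow Sea", "South China Sea"
    province: str       # ecological / administrative province
    depth_m: float
    substrate: str      # e.g. "mud", "sand", "rock"
    salinity_psu: float

def classify(site: Site) -> dict:
    """Assign a site to bioregion -> eco-region -> biogeographical unit."""
    bioregion = site.sea
    ecoregion = f"{site.sea} / {site.province}"
    if site.depth_m < 50:
        depth_band = "shallow (<50 m)"
    elif site.depth_m < 200:
        depth_band = "shelf (50-200 m)"
    else:
        depth_band = "deep (>200 m)"
    salinity_band = "estuarine-influenced" if site.salinity_psu < 30 else "fully marine"
    unit = f"{depth_band}, {site.substrate} substrate, {salinity_band}"
    return {"bioregion": bioregion, "ecoregion": ecoregion, "unit": unit}

print(classify(Site("East China Sea", "Zhejiang coast", 35, "mud", 28)))
```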
Procedia PDF Downloads 156928 Implications of Circular Economy on Users Data Privacy: A Case Study on Android Smartphones Second-Hand Market
Authors: Mariia Khramova, Sergio Martinez, Duc Nguyen
Abstract:
Modern electronic devices, particularly smartphones, are characterised by an extremely high environmental footprint and a short product lifecycle. Every year manufacturers release new models with ever better performance, which pushes customers towards new purchases. As a result, millions of devices accumulate in the urban mine. To tackle these challenges, the concept of the circular economy has been introduced to promote the repair, reuse, and recycling of electronics. In this case, electronic devices that previously ended up in landfills or households get a second life, thereby reducing the demand for new raw materials. Smartphone reuse is gradually gaining wider adoption, partly due to the price increase of flagship models, consequently boosting circular economy implementation. However, along with the reuse of a communication device, the circular economy approach needs to ensure that the data of the previous user have not been 'reused' together with the device. This is especially important since modern smartphones are comparable to computers in terms of performance and the amount of data stored. These data range from pictures, videos, and call logs to social security numbers and passport and credit card details, from personal information to confidential corporate data. To assess how well data privacy requirements are followed on the second-hand smartphone market, a sample of 100 Android smartphones was purchased from IT Asset Disposition (ITAD) facilities responsible for data erasure and resale. Although the devices should not have stored any user data by the time they left the ITAD facilities, it was possible to retrieve data from 19% of the sample. The techniques applied varied from manual device inspection to sophisticated equipment and tools. These findings indicate a significant barrier to the implementation of the circular economy and a limitation of smartphone reuse. Therefore, in order to motivate users to donate or sell their old devices and to make the use of electronics more sustainable, data privacy on the second-hand smartphone market should be significantly improved. The presented research has been carried out in the framework of the sustainablySMART project, which is part of the Horizon 2020 EU Framework Programme for Research and Innovation. Keywords: android, circular economy, data privacy, second-hand phones
Procedia PDF Downloads 132927 A Method against Obsolescence of Three-Dimensional Archaeological Collection. Two Cases of Study from Qubbet El-Hawa Necropolis, Aswan, Egypt
Authors: L. Serrano-Lara, J.M Alba-Gómez
Abstract:
The Qubbet el-Hawa Project has documented archaeological artifacts as 3D models using the laser scanning technique since 2015. The research has now established the right methodology to develop a high-accuracy photographic texture for each geometric 3D model, as well as the right methodology to attach the complete digital surrogate to a 3D PDF document; the latter is used as a catalogue worksheet that carries the archaeological data and, at the same time, allows us to obtain precise measurements, volume calculations, and cross-section mapping of each scanned artifact. This validated archaeological documentation is the first step for dissemination, for application in the Qubbet el-Hawa Virtual Museum and, moreover, for a multi-sensory experience through 3D-printed archaeological artifacts. Material culture from four funerary complexes constructed in West Aswan has been turned into physical replicas, opening up the archaeological research process itself and offering creative possibilities for museology and educational projects. This paper shares a method of acquiring texture for the scanner's output product in order to achieve 3D PDF archaeological cataloguing and, on the other hand, to allow the full-colour 3D printing of singular archaeological artifacts. The proposed method has been tested on two concrete cases: a polychrome wooden ushabti and a cartonnage mask belonging to a lady, recovered from the intact tomb QH34aa. Both resulting 3D models have been implemented in three main applications: the archaeological 3D catalogue, public dissemination activities, and the 3D artifact model in a bachelor education programme. Through these three applications, productive interaction between the spectator and the three-dimensional artifact has increased; moreover, its functionality as archaeological documentation has been consolidated. By finding the right methodology to assign a specific colour to each vertex of the geometric 3D model, two essential archaeological applications have been achieved: first, the 3D PDF as a display document for an archaeological catalogue; second, the possibility of obtaining a coloured 3D-printed object to be displayed in public exhibitions. Formerly obsolescent 3D models have thus become updated archaeological documentation of the cultural material of tomb QH34aa. The Qubbet el-Hawa Project has therefore realised the educational potential of its results thanks to a multi-sensory experience arising from 3D-scanned archaeological artifacts. Keywords: 3D printed, 3D scanner, Middle Kingdom, Qubbet el-Hawa necropolis, virtual archaeology
Procedia PDF Downloads 146926 Update on Epithelial Ovarian Cancer (EOC), Types, Origin, Molecular Pathogenesis, and Biomarkers
Authors: Salina Yahya Saddick
Abstract:
Ovarian cancer remains the most lethal gynecological malignancy due to the lack of highly sensitive and specific screening tools for the detection of early-stage disease. The ovarian surface epithelium (OSE) provides the progenitor cells for 90% of human ovarian cancers. Recent morphologic, immunohistochemical, and molecular genetic studies have led to the development of a new paradigm for the pathogenesis and origin of epithelial ovarian cancer (EOC), based on a dualistic model of carcinogenesis that divides EOC into two broad categories, designated Types I and II, which are characterized by specific mutations, including KRAS, BRAF, ERBB2, CTNNB1, PTEN, PIK3CA, ARID1A, and PPP2R1A, that target specific cell signaling pathways. Type I tumors rarely harbor TP53 mutations; they are relatively genetically stable and typically display a variety of somatic sequence mutations that include KRAS, BRAF, PTEN, PIK3CA, CTNNB1 (the gene encoding beta-catenin), ARID1A, and PPP2R1A, but very rarely TP53. The cancer stem cell (CSC) hypothesis postulates that the tumorigenic potential of CSCs is confined to a very small subset of tumor cells and is defined by their ability to self-renew and differentiate, leading to the formation of a tumor mass. Potential biomarkers include miRNAs, which are promising because they are remarkably stable, allowing isolation and analysis from tissues and from blood, in which they can be found as free circulating nucleic acids and in mononuclear cells. Recently, genomic analyses have identified biomarkers and potential therapeutic targets for ovarian cancer, namely FGF18, which plays an active role in controlling the migration, invasion, and tumorigenicity of ovarian cancer cells through NF-κB activation, increasing the production of oncogenic cytokines and chemokines. This review summarizes updated information on epithelial ovarian cancers and points to the most recent ongoing research. Keywords: epithelial ovarian cancers, somatic sequence mutations, cancer stem cell (CSC), potential protein biomarkers, genomic analysis, FGF18 biomarker
Procedia PDF Downloads 382