Search results for: well data integration
23956 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia
Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
The Human Development Index (HDI) is a standard measure of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and these factors do not follow a specific pattern or form, so the HDI data for Indonesia can be modeled with nonparametric regression. The estimate of the regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline, a modification of segmented polynomial functions. The estimator of a truncated spline regression model is affected by the selection of the optimal knot points, the focal points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, the Human Development Index data were fitted with a truncated spline nonparametric regression model. The best truncated spline regression model for the HDI data in Indonesia was obtained with the optimal knot combination 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to the HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data in Indonesia well.
Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline
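As a hedged illustration of the method this abstract describes, the sketch below fits a degree-1 truncated spline by least squares and picks the knot with the smallest GCV score on synthetic data. The function names, the single-knot search, and the toy data are assumptions for illustration, not the authors' code or data.

```python
import numpy as np

def truncated_spline_basis(x, knots, degree=1):
    """Design matrix for a truncated polynomial spline: 1, x, ..., x^d, (x-k)_+^d."""
    cols = [x**p for p in range(degree + 1)]                     # polynomial part
    cols += [np.maximum(x - k, 0.0)**degree for k in knots]      # truncated terms
    return np.column_stack(cols)

def gcv(x, y, knots, degree=1):
    """Generalized cross validation score for one knot configuration."""
    X = truncated_spline_basis(x, knots, degree)
    H = X @ np.linalg.pinv(X.T @ X) @ X.T      # hat matrix H = X (X'X)^-1 X'
    resid = y - H @ y
    n = len(y)
    return (np.sum(resid**2) / n) / (1 - np.trace(H) / n)**2

# toy data with a kink at x = 5: the optimal knot should sit near the kink
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 10, 80))
y = np.where(x < 5, x, 5 + 0.2 * (x - 5)) + rng.normal(0, 0.1, 80)
candidates = [2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
best = min(candidates, key=lambda k: gcv(x, y, [k]))
print(best)
```

In practice the paper searches over knot combinations for four predictors (hence the 5-5-5-4 result); the same GCV criterion just ranges over a larger grid.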
Procedia PDF Downloads 339
23955 Impact of Protean Career Attitude on Career Success with the Mediating Effect of Career Insight
Authors: Prabhashini Wijewantha
Abstract:
This study examines the impact of employees' protean career attitude on their career success, and the mediating effect of career insight on that relationship. Career success is defined as the accomplishment of desirable work-related outcomes at any point in a person's work experience over time, and it comprises two sub-variables: career satisfaction and perceived employability. Protean career attitude was measured using the eight items of the Self-Directedness subscale of the Protean Career Attitude scale developed by Briscoe and Hall, whereas career satisfaction was measured by the three-item scale developed by Martins, Eddleston, and Veiga. Perceived employability was also evaluated using three items, and career insight was measured using fourteen items adapted from De Vos and Soens. Data were collected from a sample of 300 mid-career executives in Sri Lanka using a survey strategy, and were analyzed with SPSS and AMOS version 20.0. A preliminary analysis screened the data and ensured reliability and validity. Next, a simple regression analysis tested the direct impact of protean career attitude on career success, and the hypothesis was supported. Baron and Kenny's four-step, three-regression approach to mediator testing was used to assess the mediating effect of career insight on the above relationship, and a partial mediation was supported by the data. Finally, theoretical and practical implications are discussed.
Keywords: career success, career insight, mid career MBAs, protean career attitude
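The Baron and Kenny procedure mentioned above reduces to three regressions. The sketch below runs them on synthetic data built to exhibit partial mediation; the variable names and the data-generating coefficients are illustrative assumptions, not the study's measurements.

```python
import numpy as np

def ols_slope(y, X):
    """OLS with intercept; returns coefficients (intercept first)."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

rng = np.random.default_rng(1)
n = 300
protean = rng.normal(size=n)                                   # X: protean career attitude
insight = 0.6 * protean + rng.normal(size=n)                   # M: career insight
success = 0.3 * protean + 0.5 * insight + rng.normal(size=n)   # Y: career success

c = ols_slope(success, protean)[1]                 # step 1: X -> Y (total effect c)
a = ols_slope(insight, protean)[1]                 # step 2: X -> M (path a)
beta = ols_slope(success, np.column_stack([protean, insight]))
c_prime, b = beta[1], beta[2]                      # step 3: X + M -> Y (c' and b)

# partial mediation: c' shrinks relative to c but stays nonzero
print(round(c, 2), round(c_prime, 2), round(b, 2))
```

With these generating values the direct effect c' (about 0.3) is smaller than the total effect c (about 0.6), the pattern the abstract reports as partial mediation.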
Procedia PDF Downloads 360
23954 Optimal Allocation of Distributed Generation Sources for Loss Reduction and Voltage Profile Improvement by Using Particle Swarm Optimization
Authors: Muhammad Zaheer Babar, Amer Kashif, Muhammad Rizwan Javed
Abstract:
Nowadays, distributed generation integration is one of the best ways to meet increasing load demand. Optimal allocation of distributed generation plays a vital role in reducing system losses and improving the voltage profile. In this paper, a metaheuristic technique based on multi-objective particle swarm optimization is proposed for allocating DG in order to reduce power losses and improve the voltage profile. Few control parameters are needed in this algorithm, and a modification is made to the search space of PSO. The effectiveness of the proposed technique is tested on the IEEE 33-bus test system, for both single-DG and multiple-DG scenarios. The proposed method is more effective than other metaheuristic techniques and gives better results for system losses and voltage profile.
Keywords: Distributed generation (DG), Multi Objective Particle Swarm Optimization (MOPSO), particle swarm optimization (PSO), IEEE standard Test System
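To make the PSO mechanics concrete, here is a minimal single-objective PSO sketch. The paper's method is multi-objective and evaluates power loss via load flow on the IEEE 33-bus system; this sketch only illustrates the particle update rule, with a stand-in quadratic objective.

```python
import numpy as np

def pso(objective, dim, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal PSO: velocity update pulls particles toward personal and global bests."""
    rng = np.random.default_rng(2)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)          # keep particles inside the search space
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

# stand-in objective: a real run would evaluate losses from a power-flow solution
loss = lambda x: np.sum((x - 3.0)**2)
best_x, best_f = pso(loss, dim=2, bounds=(0.0, 10.0))
print(best_x, best_f)
```

In the DG-allocation setting, `x` would encode candidate bus locations and DG sizes, and the objective would combine loss and voltage-deviation terms.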
Procedia PDF Downloads 454
23953 An Analysis of Oil Price Changes and Other Factors Affecting Iranian Food Basket: A Panel Data Method
Authors: Niloofar Ashktorab, Negar Ashktorab
Abstract:
Oil exports fund nearly half of Iran's government expenditures, and for many years other countries have imposed various sanctions against Iran. Sanctions that primarily target Iran's key energy sector have harmed Iran's economy, although the strategic effects of sanctions may diminish as Iran adjusts to them economically. In this study, we evaluate the impact of the oil price and of sanctions against Iran on food commodity prices using a panel data method. We find that the food commodity prices, the oil price, and the real exchange rate are stationary. The results show a positive effect of oil price changes, the real exchange rate, and sanctions on food commodity prices.
Keywords: oil price, food basket, sanctions, panel data, Iran
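A common workhorse for panel data like this is the fixed-effects (within) estimator: demean each variable within each panel unit, then run OLS. The sketch below recovers a known oil-price coefficient from synthetic panel data; the unit structure and coefficients are illustrative assumptions, not the paper's dataset.

```python
import numpy as np

def within_estimator(y, X, groups):
    """Fixed-effects (within) estimator: demean y and X per panel unit, then OLS."""
    yd = y.astype(float).copy()
    Xd = X.astype(float).copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= yd[m].mean()
        Xd[m] -= Xd[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(Xd, yd, rcond=None)
    return beta

# toy panel: food price driven by oil price plus a unit-specific fixed effect
rng = np.random.default_rng(3)
units = np.repeat(np.arange(5), 20)            # 5 panel units, 20 periods each
oil = rng.normal(size=100)
fixed_effects = units * 2.0                    # unit-specific intercepts
food = 0.8 * oil + fixed_effects + rng.normal(0, 0.1, 100)
beta = within_estimator(food, oil[:, None], units)
print(round(float(beta[0]), 2))
```

Demeaning removes the unit intercepts, so the slope estimate lands close to the true 0.8 even though the fixed effects differ sharply across units.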
Procedia PDF Downloads 356
23952 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology
Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy
Abstract:
Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems is vital for automatically or semi-automatically creating documentation for software lacking sufficient records; it aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency, which stems from the need to handle the extensive and complex code of legacy systems efficiently. This paper proposes a method for semi-automatic legacy system redocumentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses the current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines the challenges in legacy system redocumentation and suggests a redocumentation method based on parallel processing and ontology for improved efficiency and effectiveness.
Keywords: legacy systems, redocumentation, big data analysis, parallel processing
Procedia PDF Downloads 46
23951 An Investigation of Surface Water Quality in an Industrial Area Using Integrated Approaches
Authors: Priti Saha, Biswajit Paul
Abstract:
Rapid urbanization and industrialization have increased the pollution load in surface water bodies, yet these water bodies are a major source of water for drinking, irrigation, industrial activities, and fishery. Water quality assessment is therefore of paramount importance to evaluate their suitability for all these purposes. This study evaluates the surface water quality of an industrial city in eastern India by integrating interdisciplinary techniques. A multi-purpose Water Quality Index (WQI) assessed the suitability of forty-eight sampling locations for drinking, irrigation, and fishery: 8.33% have excellent water quality (WQI: 0-25) for fishery, while 10.42%, 20.83%, and 45.83% have good quality (WQI: 25-50) for drinking, irrigation, and fishery, respectively. The industrial water quality was assessed through the Ryznar Stability Index (RSI), which showed that only 6.25% of sampling locations are neither corrosive nor scale-forming (RSI: 6.2-6.8). Integrating these statistical analyses with a geographic information system (GIS) enables spatial assessment, identifying the regions where the water quality is suitable for use in drinking, irrigation, fishery, and industrial activities. This research demonstrates the effectiveness of statistical and GIS techniques for water quality assessment.
Keywords: surface water, water quality assessment, water quality index, spatial assessment
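A common way to compute a WQI of the kind used above is the weighted-arithmetic form: a quality rating per parameter, scaled between its ideal value and permissible limit, then averaged with relative weights. The parameter values, limits, and weights below are purely illustrative, not the study's measurements.

```python
def water_quality_index(measured, standards, ideals, weights):
    """Weighted-arithmetic WQI: per-parameter quality rating, then weighted mean."""
    total_w = sum(weights.values())
    wqi = 0.0
    for p in measured:
        # quality rating: 0 at the ideal value, 100 at the permissible limit
        q = 100.0 * (measured[p] - ideals[p]) / (standards[p] - ideals[p])
        wqi += weights[p] * q
    return wqi / total_w

# illustrative values only (not the paper's parameters or standards)
measured = {"pH": 7.8, "DO": 6.0, "BOD": 2.5}
standards = {"pH": 8.5, "DO": 5.0, "BOD": 5.0}   # permissible limits
ideals = {"pH": 7.0, "DO": 14.6, "BOD": 0.0}     # ideal values
weights = {"pH": 0.22, "DO": 0.37, "BOD": 0.41}
print(round(water_quality_index(measured, standards, ideals, weights), 1))  # → 65.4
```

A lower score means better quality on this scale, which is how ranges like 0-25 (excellent) and 25-50 (good) in the abstract are read.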
Procedia PDF Downloads 180
23950 Knowledge Management Best Practice Model in Higher Learning Institution: A Systematic Literature Review
Authors: Ismail Halijah, Abdullah Rusli
Abstract:
Introduction: This systematic literature review aims to identify the Knowledge Management Best Practice components of a Knowledge Management Model for the Higher Learning Institution environment. Study design: Systematic literature review. Methods: A systematic literature review of Knowledge Management Best Practice was conducted to identify and define the components of Best Practice across Knowledge Management models. Results: This review of published conference and journal articles shows that the components of Best Practice in Knowledge Management are basically divided into two aspects: the soft aspect and the hard aspect. The lack of a combination of these two aspects into an integrated model prevents Knowledge Management Best Practice from operating at full throttle. Evidence from the literature shows that this lack of integration leads to the immaturity of Higher Learning Institutions (HLIs) in implementing Knowledge Management Systems. Conclusion: The first step of identifying the attributes that measure the Knowledge Management Best Practice components across the models in the literature will lead to a definition of the Knowledge Management Best Practice components for the higher learning environment.
Keywords: knowledge management, knowledge management system, knowledge management best practice, knowledge management higher learning institution
Procedia PDF Downloads 592
23949 Armenian Refugees in Early 20th C Japan: Quantitative Analysis on Their Number Based on Japanese Historical Data with the Comparison of a Foreign Historical Data
Authors: Meline Mesropyan
Abstract:
At the beginning of the 20th century, Japan served as a transit point for Armenian refugees fleeing the 1915 Genocide. However, research on Armenian refugees in Japan is sparse, and the Armenian Diaspora has never taken root in Japan. Consequently, Japan has not been considered a relevant research site for studying Armenian refugees. The primary objective of this study is to shed light on the number of Armenian refugees who passed through Japan between 1915 and 1930. Quantitative analyses will be conducted based on newly uncovered Japanese archival documents. Subsequently, the Japanese data will be compared to American immigration data to estimate the potential number of refugees in Japan during that period. This under-researched area is relevant to both the Armenian Diaspora and refugee studies in Japan. By clarifying the number of refugees, this study aims to enhance understanding of Japan's treatment of refugees and the extent of humanitarian efforts conducted by organizations and individuals in Japan, contributing to the broader field of historical refugee studies.
Keywords: Armenian genocide, Armenian refugees, Japanese statistics, number of refugees
Procedia PDF Downloads 57
23948 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis
Authors: Gon Park
Abstract:
Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan lacks detailed analyses of green infrastructure that incorporate land-cover information across many structural classes. This study maps the green infrastructure networks of Seoul to complement those plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map, applying a force-directed algorithm, weighted values, and binary relationships, with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
Keywords: cadastral data, green Infrastructure, network analysis, parcel data
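The hub-and-link ranking described above can be sketched with a simple weighted-degree centrality: each green area scores the sum of the weights of its links. The hub names and weights below are hypothetical; a real run would derive them from the parcel and land-use layers.

```python
from collections import defaultdict

def rank_hubs(links):
    """Rank hubs by weighted degree: the sum of the weights of incident links."""
    score = defaultdict(float)
    for a, b, w in links:
        score[a] += w
        score[b] += w
    return sorted(score, key=score.get, reverse=True)

# hypothetical hubs and link weights (e.g. inverse distance between green areas)
links = [
    ("park_A", "park_B", 0.9),
    ("park_A", "river_C", 0.4),
    ("park_B", "river_C", 0.7),
    ("park_B", "forest_D", 0.2),
]
print(rank_hubs(links))  # park_B ranks first (total weight 1.8)
```

The study's analysis adds distance and betweenness-style centrality metrics on top of this, but the ranking principle (score hubs by their network connections) is the same.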
Procedia PDF Downloads 206
23947 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format, pre-processing it with standard techniques, feeding the processed data into the proposed algorithm, and analysing the result. Algorithms used in satellite imagery classification include U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification, with the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50
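The core idea behind atrous (dilated) convolution is easiest to see in one dimension: inserting zeros between kernel taps enlarges the receptive field without adding parameters. This minimal numpy sketch is an illustration of that mechanism, not DeepLabv3 itself.

```python
import numpy as np

def atrous_conv1d(signal, kernel, rate):
    """1-D atrous convolution: sample the input every `rate` steps per kernel tap,
    enlarging the receptive field without adding weights."""
    k = len(kernel)
    span = (k - 1) * rate + 1                 # effective kernel span on the input
    out = np.zeros(len(signal) - span + 1)    # valid-mode output
    for i in range(len(out)):
        out[i] = sum(kernel[j] * signal[i + j * rate] for j in range(k))
    return out

x = np.arange(10, dtype=float)
k = np.array([1.0, -1.0])                     # finite-difference kernel
print(atrous_conv1d(x, k, rate=1))            # nine lag-1 differences, each -1.0
print(atrous_conv1d(x, k, rate=3))            # seven lag-3 differences, each -3.0
```

DeepLabv3 applies the 2-D analogue at several rates in parallel (atrous spatial pyramid pooling), so one layer sees context at multiple scales at once.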
Procedia PDF Downloads 139
23946 The Effect of CPU Location in Total Immersion of Microelectronics
Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson
Abstract:
Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead; energy use can thus be reduced by improving cooling efficiency. Both air and liquid can be used as cooling media for the data centre. Traditional data centre cooling systems use air; however, liquid is recognised as a promising method that can handle more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger, and full immersion of the microelectronics. This study quantifies the improvements in heat transfer for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection in the water pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink at the bottom of the microelectronics enclosure.
Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures
Procedia PDF Downloads 272
23945 A Macroeconomic Analysis of Defense Industry: Comparisons, Trends and Improvements in Brazil and in the World
Authors: J. Fajardo, J. Guerra, E. Gonzales
Abstract:
This paper outlines a study of Brazil's industrial base of defense (IDB), using a bibliographic research method combined with an analysis of macroeconomic data from several publicly available data platforms. It begins with a brief study of Brazilian national industry, including analyses of productivity, income, output, and jobs. Next, the research presents a study of the defense industry in Brazil, introducing the main national companies operating in the aeronautical, army, and naval branches. After establishing the main features of the Brazilian defense industry, data on the productivity of the defense industries of the main countries and of competitors to Brazilian companies were analyzed, in order to frame major Brazilian cases within a comparative analysis. Methodologically, bibliographic research and the exploration of historical data series were used to analyze information, identify trends, and make comparisons over time. The research concludes with the main trends for the development of the Brazilian defense industry, comparing the current situation with the perspectives of several countries.
Keywords: economics of defence, industry, trends, market
Procedia PDF Downloads 155
23944 Delineating Subsurface Linear Features and Faults Under Sedimentary Cover in the Bahira Basin Using Integrated Gravity and Magnetic Data
Authors: M. Lghoul, N. El Goumi, M. Guernouche
Abstract:
In order to predict the structural and tectonic framework of the Bahira basin and to build a 3D geological model of the basin, an integrated multidisciplinary study was conducted using gravity, magnetic, and geological data. The objective of the current study is to delineate the subsurface features, faults, and geological limits by analysing airborne magnetic and gravity data of the Bahira basin. To achieve this goal, several enhancement techniques were applied to the magnetic and gravity data: power spectral analysis, reduction to the pole (RTP), upward continuation, analytic signal, tilt derivative, total horizontal derivative, 3D Euler deconvolution, and source parameter imaging. The major lineament/fault trends are NE-SW, NW-SE, ENE-WSW, and WNW-ESE. The 3D Euler deconvolution analysis highlighted a number of fault trends, mainly in the ENE-WSW and WNW-ESE directions. The depth to the top of the basement sources in the study area ranges from 200 m, in the southern and northern parts of the Bahira basin, to 5000 m in the eastern part of the basin.
Keywords: magnetic, gravity, structural trend, depth to basement
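Of the filters listed, the tilt derivative has a compact definition: TDR = arctan(dT/dz / sqrt((dT/dx)^2 + (dT/dy)^2)). The sketch below computes it spectrally (the vertical derivative via a |k| filter in the wavenumber domain) on a synthetic anomaly grid; it is an illustration only, not the processing chain used in the study.

```python
import numpy as np

def tilt_derivative(T, dx=1.0, dy=1.0):
    """Tilt derivative of a gridded potential field:
    TDR = arctan(dT/dz / total horizontal derivative),
    with the vertical derivative computed as |k| * F(T) in the Fourier domain."""
    ny, nx = T.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, dy)
    KX, KY = np.meshgrid(kx, ky)
    K = np.sqrt(KX**2 + KY**2)
    F = np.fft.fft2(T)
    dTdz = np.real(np.fft.ifft2(K * F))          # vertical derivative
    dTdx = np.real(np.fft.ifft2(1j * KX * F))    # horizontal derivatives
    dTdy = np.real(np.fft.ifft2(1j * KY * F))
    thdr = np.sqrt(dTdx**2 + dTdy**2)
    return np.arctan2(dTdz, thdr)

# synthetic anomaly: a smooth Gaussian "source" on a 64 x 64 grid
x = np.linspace(-50, 50, 64)
X, Y = np.meshgrid(x, x)
T = np.exp(-(X**2 + Y**2) / 200.0)
tdr = tilt_derivative(T)
print(tdr.min(), tdr.max())
```

The tilt is bounded to (-pi/2, pi/2] regardless of anomaly amplitude, which is why it is popular for delineating the edges of both shallow and deep sources.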
Procedia PDF Downloads 132
23943 Copyright Clearance for Artificial Intelligence Training Data: Challenges and Solutions
Authors: Erva Akin
Abstract:
The use of copyrighted material for machine learning purposes is a challenging issue in the field of artificial intelligence (AI). While machine learning algorithms require large amounts of data to train and to improve their accuracy and creativity, the use of copyrighted material without permission from the authors may infringe their intellectual property rights. In order to overcome the copyright hurdles to data sharing, access, and re-use, the use of copyrighted material for machine learning purposes may be considered permissible under certain circumstances. For example, if the copyright holder has given permission to use the data through a licensing agreement, then use for machine learning purposes may be lawful. It is also argued that copying for non-expressive purposes that do not involve conveying expressive elements to the public, such as automated data extraction, should not be seen as infringing: the focus of such 'copy-reliant technologies' is on understanding language rules, styles, and syntax, and no creative ideas are being used. However, the non-expressive use defense sits within the framework of the fair use doctrine, which allows the use of copyrighted material for research or educational purposes. Questions arise because the fair use doctrine is not available in EU law; instead, the InfoSoc Directive provides a rigid system of exclusive rights with a list of exceptions and limitations. One could only argue that non-expressive uses of copyrighted material for machine learning purposes do not constitute a 'reproduction' in the first place. Nevertheless, the use of machine learning with copyrighted material is difficult because EU copyright law applies to the mere use of the works. Two solutions can be proposed to address the problem of copyright clearance for AI training data.
The first is to introduce a broad exception for text and data mining, either mandatory or limited to commercial and scientific purposes. The second is that copyright laws should permit the reproduction of works for non-expressive purposes, which opens the door to discussions regarding the transposition of the fair use principle from the US into EU law. Both solutions aim to provide more space for AI developers to operate and to encourage greater freedom, which could lead to more rapid innovation in the field. The Data Governance Act presents a significant opportunity to advance these debates. Finally, issues concerning the balance of general public interests and legitimate private interests in machine learning training data must be addressed. In my opinion, it is crucial that robot-creation output should fall into the public domain. Machines depend on human creativity, innovation, and expression. To encourage technological advancement and innovation, freedom of expression and business operation must be prioritised.
Keywords: artificial intelligence, copyright, data governance, machine learning
Procedia PDF Downloads 83
23942 Learning from Dendrites: Improving the Point Neuron Model
Authors: Alexander Vandesompele, Joni Dambre
Abstract:
The diversity in dendritic arborization, as first illustrated by Santiago Ramon y Cajal, has always suggested a role for dendrites in the functionality of neurons. In the past decades, thanks to new recording techniques and optical stimulation methods, it has become clear that dendrites are not merely passive electrical components: they are observed to integrate inputs in a non-linear fashion and to participate actively in computations. Regardless, in simulations of neural networks, dendritic structure and functionality are often overlooked. Especially in a machine learning context, when designing artificial neural networks, point neuron models such as the leaky integrate-and-fire (LIF) model are dominant. These models mimic the integration of inputs at the neuron soma and ignore the existence of dendrites. In this work, the LIF point neuron model is extended with a simple form of dendritic computation. This gives the LIF neuron an increased capacity to discriminate spatiotemporal input sequences, a dendritic functionality observed in another study. Simulations of the spiking neurons are performed using the BindsNET framework. In the common LIF model, incoming synapses are independent. Here, we introduce a dependency between incoming synapses such that the post-synaptic impact of a spike is determined not only by the weight of the synapse but also by the activity of other synapses. This is a form of short-term plasticity in which synapses are potentiated or depressed by the preceding activity of neighbouring synapses, and a straightforward way to prevent inputs from simply summing linearly at the soma. To implement this, each pair of synapses on a neuron is assigned a variable representing the synaptic relation; this variable determines the magnitude of the short-term plasticity. These variables can be chosen randomly or, more interestingly, can be learned using a form of Hebbian learning.
We use spike-timing-dependent plasticity (STDP), commonly used to learn synaptic strength magnitudes. If all neurons in a layer receive the same input, they tend to learn the same weights through STDP; adding inhibitory connections between the neurons creates a winner-take-all (WTA) network, which causes the different neurons to learn different input sequences. To illustrate the impact of the proposed dendritic mechanism, even without learning, we attach five input neurons to two output neurons. One output neuron is a regular LIF neuron; the other is a LIF neuron with dendritic relationships. The five input neurons are first allowed to fire in a particular order; the membrane potentials are then reset, and the five input neurons are fired in the reversed order. As the regular LIF neuron linearly integrates its inputs at the soma, its membrane potential response to both sequences is similar in magnitude. In the other output neuron, due to the dendritic mechanism, the membrane potential response differs between the two sequences. Hence, the dendritic mechanism improves the neuron's capacity for discriminating spatiotemporal sequences. Dendritic computations improve LIF neurons even if the relationships between synapses are established randomly. Ideally, however, a learning rule is used to improve the dendritic relationships based on input data. Just as it is possible to learn synaptic strength with STDP to make a neuron more sensitive to its input, it is possible to learn dendritic relationships with STDP to make the neuron more sensitive to spatiotemporal input sequences. Feeding structured data to a WTA network with dendritic computation leads to a significantly higher number of discriminated input patterns. Without the dendritic computation, output neurons are less specific and may, for instance, be activated by a sequence in reverse order.
Keywords: dendritic computation, spiking neural networks, point neuron model
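The order-sensitivity described above can be sketched with a toy two-synapse LIF neuron whose spike impact is scaled by the recent activity traces of the other synapses. The class below is an illustrative stand-in under assumed dynamics, not the BindsNET implementation from the study.

```python
import numpy as np

class DendriticLIF:
    """Toy LIF neuron: a spike's impact is modulated by the preceding
    activity of neighbouring synapses via a pairwise relation matrix."""
    def __init__(self, n_syn, relation, tau=20.0, trace_tau=5.0):
        self.w = np.ones(n_syn)          # synaptic weights
        self.rel = relation              # (n_syn, n_syn) pairwise modulation
        self.tau, self.trace_tau = tau, trace_tau
        self.v = 0.0                     # membrane potential
        self.trace = np.zeros(n_syn)     # recent activity per synapse

    def step(self, spikes):
        """spikes: 0/1 vector of presynaptic spikes for this timestep."""
        mod = 1.0 + self.rel @ self.trace            # neighbour-driven modulation
        self.v += float(np.sum(spikes * self.w * mod))
        self.v *= np.exp(-1.0 / self.tau)            # membrane leak
        self.trace *= np.exp(-1.0 / self.trace_tau)  # trace decay
        self.trace += spikes
        return self.v

# synapse 1 is potentiated by prior activity of synapse 0, but not vice versa
rel = np.array([[0.0, 0.0], [0.5, 0.0]])
fwd = DendriticLIF(2, rel)
fwd.step(np.array([1.0, 0.0]))
v_fwd = fwd.step(np.array([0.0, 1.0]))   # sequence 0 -> 1
rev = DendriticLIF(2, rel)
rev.step(np.array([0.0, 1.0]))
v_rev = rev.step(np.array([1.0, 0.0]))   # sequence 1 -> 0
print(v_fwd > v_rev)  # True: the response depends on the input order
```

A plain LIF neuron (relation matrix all zeros) would respond identically to both orders, which is exactly the limitation the dendritic mechanism addresses.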
Procedia PDF Downloads 133
23941 Potency of Minapolitan Area Development to Enhance Gross Domestic Product and Prosperity in Indonesia
Authors: Shobrina Silmi Qori Tarlita, Fariz Kukuh Harwinda
Abstract:
Indonesia has 81,000 kilometers of coastline and a territory that is 70% water, and is known as a country with huge potential in the fisheries sector, which could support more than 50% of gross domestic product. However, according to Department of Marine and Fisheries data, the fisheries sector contributed only 20% of total GDP in 1998, and the sharpest decline in fisheries-sector income occurred in 2009. These conditions arise from several factors. First, there is a lack of an integrated working platform for fisheries and marine management in areas productive enough to raise the country's economic returns year after year. Second, labor demand in companies large and small depends on natural conditions, leaving many people unemployed when the weather or other natural conditions are unfavorable for fisheries and marine operations, especially aquaculture and capture fisheries. Third, many fishermen, especially in Indonesia, treat fishing as a side job to meet their own needs, although they are on average poor. Another major problem is the lack of sustainable development programs to stabilize the productivity of fisheries and marine resources, such as protecting the environments that serve as fish nursery grounds and migration channels; productivity therefore remains low, even though Indonesia's population has grown for years and needs more food resources to meet its nutritional demands. The development of the Minapolitan Area is one alternative solution: a better setting for aquaculturists as well as fishermen, focused on systematic business efforts in fisheries and marine management.
A Minapolitan is an integrated area that gathers those who focus their business on the fisheries sector, so that it can trigger fishery activity in areas under intensive Minapolitan management. Minapolitan is therefore expected to reinforce sustainable development by increasing the productivity of capture fisheries as well as aquaculture, and it is also expected to increase GDP, raise incomes, and bring prosperity. Against this background, this paper explains the Minapolitan Area concept and the design of reinforcing Minapolitan Areas through zonation of fishery and marine exploitation areas of both high and low productivity. Hopefully, this solution will help answer the economic and social issues of declining food resources, especially fishery and marine resources.
Keywords: Minapolitan, fisheries, economy, Indonesia
Procedia PDF Downloads 463
23940 The BETA Module in Action: An Empirical Study on Enhancing Entrepreneurial Skills through Kearney's and Bloom's Guiding Principles
Authors: Yen Yen Tan, Lynn Lam, Cynthia Lam, Angela Koh, Edwin Seng
Abstract:
Entrepreneurial education plays a crucial role in nurturing future innovators and change-makers. Over time, significant progress has been made in refining instructional approaches to develop the necessary skills among learners effectively. Two highly valuable frameworks, Kearney's "4 Principles of Entrepreneurial Pedagogy" and Bloom's "Three Domains of Learning," serve as guiding principles in entrepreneurial education. Kearney's principles align with experiential and student-centric learning, which are crucial for cultivating an entrepreneurial mindset. The potential synergies between these frameworks hold great promise for enhancing entrepreneurial acumen among students. However, despite this potential, their integration remains largely unexplored. This study aims to bridge this gap by building upon the Business Essentials through Action (BETA) module and investigating its contributions to nurturing the entrepreneurial mindset. This study employs a quasi-experimental mixed-methods approach, combining quantitative and qualitative elements to ensure comprehensive and insightful data. A cohort of 235 students participated, with 118 enrolled in the BETA module and 117 in a traditional curriculum. Their Personal Entrepreneurial Competencies (PECs) were assessed before admission (pre-Y1) and one year into the course (post-Y1) using a comprehensive 55-item PEC questionnaire, enabling measurement of critical traits such as opportunity-seeking, persistence, and risk-taking. Rigorous computations of individual entrepreneurial competencies and overall PEC scores were performed, including a correction factor to mitigate potential self-assessment bias. The orchestration of Kearney's principles and Bloom's domains within the BETA module necessitates a granular examination. Here, qualitative revelations surface, courtesy of structured interviews aligned with contemporary research methodologies. 
These interviews trace the transformative journey undertaken by students. The study also examines the BETA module's influence on students' entrepreneurial competencies from the perspective of faculty members: focus group discussions with six dedicated lecturers capture their perceptions, experiences, and reflections on the pedagogical practices embedded within the BETA module. Preliminary findings from ongoing data analysis indicate promising results, showing a substantial improvement in entrepreneurial skills among students participating in the BETA module. This study promises not only to elevate students' entrepreneurial competencies but also to clarify the broader applicability of Kearney's principles and Bloom's domains. The interplay of quantitative analyses, which provide precise competency metrics, and qualitative findings, which trace the nuanced narratives of students' transformative journeys, yields a holistic understanding of this educational endeavour. Through a rigorous quasi-experimental mixed-methods approach, this research aims to establish the BETA module's effectiveness in fostering entrepreneurial acumen among students at Singapore Polytechnic, thereby contributing valuable insights to the broader discourse on educational methodologies.
Keywords: entrepreneurial education, experiential learning, pedagogical frameworks, innovative competencies
Procedia PDF Downloads 64
23939 Biosorption of Phenol onto Water Hyacinth Activated Carbon: Kinetics and Isotherm Study
Authors: Manoj Kumar Mahapatra, Arvind Kumar
Abstract:
Batch adsorption experiments were carried out for the removal of phenol from aqueous solution using water hyacinth activated carbon (WHAC) as an adsorbent. The sorption kinetics were analysed using pseudo-first-order and pseudo-second-order models, and the sorption data were observed to fit the pseudo-second-order model very well over the entire sorption time. The experimental data were analyzed with the Langmuir and Freundlich isotherm models. Equilibrium data fitted the Freundlich model well, while a maximum biosorption capacity of 31.45 mg/g was estimated using the Langmuir model. An adsorption intensity of 3.7975 indicates a favorable adsorption condition.
Keywords: adsorption, isotherm, kinetics, phenol
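The Langmuir capacity quoted above (31.45 mg/g) is the kind of parameter obtained by fitting the isotherm to equilibrium data. As a sketch of how such a fit works, the linearized Langmuir form Ce/qe = Ce/qm + 1/(KL·qm) can be fitted by ordinary least squares; the equilibrium data below are synthetic (generated from assumed qm = 31.45 mg/g, KL = 0.8 L/mg), not the WHAC measurements from the paper.

```python
# Linearized Langmuir fit: Ce/qe = Ce/qm + 1/(KL*qm).
# Data below are synthetic, generated from assumed qm and KL values.

def linfit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
        / sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

qm_true, kl_true = 31.45, 0.8
ce = [1.0, 2.0, 5.0, 10.0, 20.0, 40.0]                        # mg/L at equilibrium
qe = [qm_true * kl_true * c / (1 + kl_true * c) for c in ce]  # mg/g adsorbed

a, b = linfit(ce, [c / q for c, q in zip(ce, qe)])
qm_est = 1 / b     # slope of the linearized form is 1/qm
kl_est = b / a     # intercept is 1/(KL*qm)
```

On noise-free data the fit recovers the generating parameters exactly; with real measurements the same regression yields the reported capacity estimate.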
Procedia PDF Downloads 446
23938 Strategies by a Teaching Assistant to Support the Classroom Talk of a Child with Communication and Interaction Difficulties in Italy: A Case for Promoting Social Scaffolding Training
Authors: Lorenzo Ciletti, Ed Baines, Matt Somerville
Abstract:
Internationally, support staff with limited training (teaching assistants (TAs)) have played a critical role in the education of children with special educational needs and/or disabilities (SEND). Researchers have notably illustrated that TAs support children's classroom tasks while teachers manage the whole class. Rarely have researchers investigated TAs' support for children's participation in whole-class or peer-group talk, despite this type of “social support” playing a significant role in children's whole-class integration and engagement with the classroom curriculum and learning. Social support seems particularly crucial for a large proportion of children with SEND, namely those with communication and interaction difficulties (e.g., autism spectrum conditions and speech impairments). This study explored TA practice and, in particular, TA social support in a rarely examined context (Italy). The Italian case was also selected because it provides TAs, known nationally as “support teachers,” with the most comprehensive training worldwide, thus potentially echoing (effective) nuanced practice internationally. Twelve hours of video recordings of a single TA and a child with communication and interaction difficulties (CID) were made. The video data were converted into frequencies of multidimensional TA support strategies, including TA social support and pedagogical assistance. TA-pupil talk oriented to the child's participation in classroom talk was also analysed into thematic patterns. These multi-method analyses were informed by social scaffolding principles: in particular, the extent to which the TA designs instruction contingently on the child's communication and interaction difficulties, and how their social support fosters the child's greatest possible responsibility in dealing with whole-class or peer-group talk by supplying the least help. The findings showed that the TA rarely supported the group or whole-class participation of the child with CID.
When doing so, the TA seemed to tightly control the content and timing of the child's contributions to classroom talk by (a) interrupting the teacher's whole-class or group conversation to start an interaction between themselves and the child, (b) reassuring the child about the correctness of their talk in private conversations and prompting them to raise their hand and intervene in the whole-class talk, or (c) stopping the child from contributing to the whole-class or peer-group talk when incorrect. The findings are interpreted in terms of their theoretical relation to scaffolding. They have significant implications for promoting social scaffolding in TA training in Italy and elsewhere.
Keywords: children with communication and interaction difficulties, children with special educational needs and/or disabilities, social scaffolding, teaching assistants, teaching practice, whole-class talk participation
Procedia PDF Downloads 97
23937 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate metrics with in situ observations collected at five estuaries. Satellite images were processed to calculate seven spectral indices (SIs) using Python, and average SI values were calculated per month over 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 in situ parameters (OPs), whose average values were likewise calculated per month over 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlations of 1% to 64%). A Fourier transform analysis of the 7 SIs was then performed: dominant frequencies and amplitudes were extracted for each SI, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0-, 3-, 4-, and 6-month delays), and the ML modelling was repeated. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
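The Fourier step described above — extracting dominant frequencies and amplitudes from a monthly index series — can be sketched as follows. A synthetic NDVI-like series stands in for the real SI data; the 23-year monthly length matches the study, but the signal itself is invented for illustration.

```python
import numpy as np

# Synthetic monthly spectral-index series: 23 years (276 months) with an
# annual cycle, standing in for a real SI time series (illustrative values).
t = np.arange(23 * 12)
si = 0.4 + 0.2 * np.sin(2 * np.pi * t / 12)

# Fourier analysis: dominant frequency and amplitude of the detrended series
spectrum = np.fft.rfft(si - si.mean())
freqs = np.fft.rfftfreq(si.size, d=1.0)           # cycles per month
peak = np.argmax(np.abs(spectrum))
dominant_freq = freqs[peak]                       # annual cycle -> 1/12
amplitude = 2 * np.abs(spectrum[peak]) / si.size  # recovers the 0.2 amplitude
```

The `(dominant_freq, amplitude)` pairs per index are exactly the kind of compact features that can then be fed to an ML regressor against the in situ parameters.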
Procedia PDF Downloads 104
23936 Digital Control Algorithm Based on Delta-Operator for High-Frequency DC-DC Switching Converters
Authors: Renkai Wang, Tingcun Wei
Abstract:
In this paper, a digital control algorithm based on the delta operator is presented for high-frequency digitally-controlled DC-DC switching converters. The stability and control accuracy of the DC-DC switching converters are improved by using the delta-operator-based digital control algorithm without increasing the hardware circuit scale. The design method for a voltage compensator in the delta domain using PID (Proportional-Integral-Derivative) control is given, and simulation results based on the Simulink platform are provided, which verify the theoretical analysis very well. It can be concluded that the presented delta-operator-based control algorithm has better stability, better control accuracy, and easier hardware implementation than existing z-operator-based control algorithms, and can therefore be used for voltage compensator design in high-frequency digitally-controlled DC-DC switching converters.
Keywords: digitally-controlled DC-DC switching converter, digital voltage compensator, delta operator, finite word length, stability
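The delta operator δ = (q − 1)/T replaces the shift operator q, so compensator states update incrementally (x ← x + T·δx) and the coefficients stay close to their continuous-time values as the sampling period T shrinks, which is the source of the improved finite-word-length behavior at high switching frequencies. A minimal sketch of a delta-form PID voltage compensator driving a toy first-order converter stage follows; all gains and plant values are illustrative assumptions, not taken from the paper.

```python
# Minimal delta-form PID sketch: states update as x <- x + T * (delta x).
# Gains (kp, ki, kd) and the first-order converter model are illustrative.
def delta_pid(kp, ki, kd, T):
    integ, e_prev = 0.0, 0.0
    def step(e):
        nonlocal integ, e_prev
        integ += T * e                 # delta-domain integrator state update
        deriv = (e - e_prev) / T       # delta-domain differentiator
        e_prev = e
        return kp * e + ki * integ + kd * deriv
    return step

T, tau = 1e-5, 1e-3                    # sample period, plant time constant (s)
ctrl = delta_pid(kp=2.0, ki=2000.0, kd=0.0, T=T)
v, v_ref = 0.0, 1.8                    # output voltage, reference voltage
for _ in range(20000):                 # simulate 0.2 s of closed-loop operation
    u = ctrl(v_ref - v)
    v += T * (u - v) / tau             # toy first-order converter stage
```

With these (assumed) gains the closed loop is stable and the integrator drives the steady-state error to zero, which is the behavior a delta-domain compensator is meant to preserve at small T.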
Procedia PDF Downloads 412
23935 Agricultural Water Consumption Estimation in the Helmand Basin
Authors: Mahdi Akbari, Ali Torabi Haghighi
Abstract:
The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the greatest (>8500 km²) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes has caused dust storms in the region, with huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased from 4 to 1.9 km³ downstream due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. Croplands in this basin have doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam in the basin). Using PML V2 Actual Evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimate the water consumed (CW) for farming. We found that CW increased from 2.5 to over 7.5 km³ between 2002 and 2017 in this basin. Also, the annual average Potential Evapotranspiration (PET) of the basin has had a negative trend in recent years, although the AET over croplands has an increasing trend. Using remote sensing data, this research compensates for the lack of data in the studied area and highlights the anthropogenic activities upstream that led to the lakes' desiccation downstream.
Keywords: Afghanistan-Iran transboundary basin, Iran-Afghanistan water treaty, water use, lake desiccation
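The two quantitative steps above — thresholding NDVI to map cropland and converting AET over that cropland into consumed water via irrigation efficiency — can be sketched on toy numbers. The 0.35-0.4 threshold and the ≈0.3 efficiency come from the abstract; the grid values, pixel size, and AET depth are invented for illustration.

```python
# Toy NDVI grid and water-consumption estimate. The 0.35-0.4 NDVI threshold
# and the ~0.3 irrigation efficiency come from the abstract; the grid values,
# pixel size, and AET depth are invented.
ndvi = [[0.10, 0.45],
        [0.38, 0.20]]
THRESHOLD = 0.375                      # midpoint of the suggested 0.35-0.4 range

crop_mask = [[v >= THRESHOLD for v in row] for row in ndvi]
n_crop = sum(cell for row in crop_mask for cell in row)

pixel_km2 = 0.0009                     # one 30 m Landsat pixel, in km^2
aet_mm = 900.0                         # toy annual AET over cropland, in mm
efficiency = 0.3                       # irrigation efficiency from the abstract

# consumed water (km^3) = AET depth (km) * cropland area (km^2) / efficiency
cw_km3 = (aet_mm * 1e-6) * (n_crop * pixel_km2) / efficiency
```

Scaled from two pixels to the basin's full cropland mask, the same arithmetic yields the 2.5-7.5 km³ figures reported above.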
Procedia PDF Downloads 130
23934 Attachment Systems and Psychotherapy: An Internal Secure Caregiver to Heal and Protect the Parts of Our Clients: InCorporer Method
Authors: Julien Baillet
Abstract:
In light of 30 years of scientific research, the InCorporer method was created in 2019 as a new approach to healing traumatic, developmental, and dissociative injuries. Following natural nervous system functions, InCorporer aims to heal, develop, and update the different defensive mammalian subsystems: fight, flight, freeze, feign death, cry for help, and energy regulation. The dimensions taken into account are to (i) heal the traumatic injuries that are still bleeding, (ii) develop the systems that never received the security, attention, and affection they needed, and (iii) update the parts that stayed stuck in the past, unaware for too long that they are now out of danger. Through the Present Part and its caregiving skills, the InCorporer method enables a balanced, soothed, and collaborative personality system. To be as integrative as possible, the InCorporer method has been designed according to several fields of research, such as structural dissociation theory, attachment theory, and information processing theory. In this paper, the author presents how the internal caregiver is developed and trained to heal the different parts/subsystems of our clients through mindful attention and reflex movement integration.
Keywords: PTSD, attachment, dissociation, part work
Procedia PDF Downloads 77
23933 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks
Authors: Sulemana Ibrahim
Abstract:
Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimizing supply chains, and enhancing food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to foresee crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies intend to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security. The proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. 
The holistic approach converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks
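The supply-chain step above names linear programming as the optimization tool. A minimal sketch of the idea is a two-farm, two-market transportation problem solved with `scipy.optimize.linprog`; all supplies, demands, and unit costs below are invented for illustration.

```python
from scipy.optimize import linprog

# Toy transportation problem: ship produce from farms F1, F2 to markets M1, M2
# at minimum cost. Supplies, demands, and unit costs are invented.
# Decision variables: x = [F1->M1, F1->M2, F2->M1, F2->M2] (tons shipped).
cost = [4, 6, 5, 3]
A_eq = [[1, 1, 0, 0],   # F1 ships all of its 30 t
        [0, 0, 1, 1],   # F2 ships all of its 40 t
        [1, 0, 1, 0],   # M1 receives exactly 35 t
        [0, 1, 0, 1]]   # M2 receives exactly 35 t
b_eq = [30, 40, 35, 35]

res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
# optimum: F1->M1 = 30, F2->M1 = 5, F2->M2 = 35, total cost 250
```

Real deployments replace the toy cost matrix with transport distances, spoilage rates, and storage costs, but the LP structure — supply and demand as equality constraints, shipments as non-negative variables — is the same.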
Procedia PDF Downloads 62
23932 A Statistical Approach to Classification of Agricultural Regions
Authors: Hasan Vural
Abstract:
Turkey is a favorable country for producing a great variety of agricultural products because of its diverse geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This classification into seven regions has traditionally been used for data collection and publication, especially in relation to agricultural production. Afterwards, nine agricultural regions were considered. Recently, the governmental body responsible for data collection and dissemination (Turkish Institute of Statistics, TIS) has used 12 classes, comprising 11 sub-regions and Istanbul province. This study aims to evaluate these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data, grouped into 11 sub-regions, were evaluated by cluster analysis and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it is more meaningful to consider three main and eight sub-agricultural regions throughout the country.
Keywords: agricultural region, factorial analysis, cluster analysis
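The cluster-analysis step can be illustrated with a small hand-rolled k-means on toy crop-share profiles. The six profiles and three seed centers below are invented; the study's actual panel covers the acreage of ten crops across 11 sub-regions.

```python
# Hand-rolled k-means on toy crop-share profiles for six hypothetical
# sub-regions (shares of two crops each). Profiles and seed centers are
# invented for illustration.
profiles = {"R1": (0.90, 0.10), "R2": (0.85, 0.15),
            "R3": (0.50, 0.50), "R4": (0.45, 0.55),
            "R5": (0.10, 0.90), "R6": (0.15, 0.85)}

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        groups = {i: [] for i in range(len(centers))}
        for p in points:
            # assign each point to its nearest center (squared distance)
            nearest = min(range(len(centers)),
                          key=lambda c: sum((a - b) ** 2
                                            for a, b in zip(p, centers[c])))
            groups[nearest].append(p)
        # move each center to the mean of its assigned points
        centers = [tuple(sum(axis) / len(g) for axis in zip(*g)) if g
                   else centers[i] for i, g in groups.items()]
    return centers, groups

centers, groups = kmeans(list(profiles.values()),
                         [(1.0, 0.0), (0.5, 0.5), (0.0, 1.0)])
```

On the toy data the six sub-regions fall into three clusters of two, mirroring how similar cropping profiles would merge sub-regions into fewer agricultural regions.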
Procedia PDF Downloads 416
23931 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Different literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given adequate attention in the academic literature. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are considered to be regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? Time-series analysis is required to detect abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real plant: an office at the Grenoble Institute of Technology equipped with 30 sensors.
Keywords: building system, time series, diagnosis, outliers, delay, data gap
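The automatic-thresholding idea — derive, per sensor, a delay threshold from that sensor's own inter-sample statistics and flag longer delays as gaps — can be sketched as follows. The median + 3·IQR rule and the timestamps are illustrative assumptions, not necessarily the paper's exact rule.

```python
def gap_threshold(timestamps, k=3.0):
    """Auto threshold on inter-sample delay: median + k * IQR (an assumed
    robust rule; the paper's exact thresholding rule may differ)."""
    deltas = sorted(b - a for a, b in zip(timestamps, timestamps[1:]))
    n = len(deltas)
    median = deltas[n // 2]                      # upper-median approximation
    iqr = deltas[(3 * n) // 4] - deltas[n // 4]  # interquartile range
    return median + k * iqr

def find_gaps(timestamps, threshold):
    """Return (start, end) pairs whose inter-sample delay exceeds threshold."""
    return [(a, b) for a, b in zip(timestamps, timestamps[1:]) if b - a > threshold]

# toy sensor timestamps (minutes): ~10-minute sampling with one long outage
ts = [0, 10, 21, 30, 41, 50, 200, 210, 220]
thr = gap_threshold(ts)
gaps = find_gaps(ts, thr)
```

Because the threshold is computed per sensor, slow and fast sensors in the same heterogeneous grid each get a delay bound matched to their own sampling behavior.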
Procedia PDF Downloads 245
23930 Unique Interprofessional Mental Health Education Model: A Pre/Post Survey
Authors: Michele L. Tilstra, Tiffany J. Peets
Abstract:
Interprofessional collaboration in behavioral healthcare education is increasingly recognized for its value in training students to address diverse client needs. While interprofessional education (IPE) is well-documented in occupational therapy education to address physical health, limited research exists on collaboration with counselors to address mental health concerns and the psychosocial needs of individuals receiving care. Counseling education literature primarily examines the collaboration of counseling students with psychiatrists, psychologists, social workers, and marriage and family therapists. This pretest/posttest survey research study explored changes in attitudes toward interprofessional teams among 56 Master of Occupational Therapy (MOT) (n = 42) and Counseling and Human Development (CHD) (n = 14) students participating in the Counselors and Occupational Therapists Professionally Engaged in the Community (COPE) program. The COPE program was designed to strengthen the behavioral health workforce in high-need and high-demand areas. Students accepted into the COPE program were divided into small MOT/CHD groups to complete multiple interprofessional multicultural learning modules using videos, case studies, and online discussion board posts. The online modules encouraged reflection on various behavioral healthcare roles, benefits of team-based care, cultural humility, current mental health challenges, personal biases, power imbalances, and advocacy for underserved populations. Using the Student Perceptions of Interprofessional Clinical Education-Revision 2 (SPICE-R2) scale, students completed pretest and posttest surveys using a 5-point Likert scale (Strongly Agree = 5 to Strongly Disagree = 1) to evaluate their attitudes toward interprofessional teamwork and collaboration.
The SPICE-R2 measures three different factors: interprofessional teamwork and team-based practice (Team), roles/responsibilities for collaborative practice (Roles), and patient outcomes from collaborative practice (Outcomes). The mean total score for all students improved from 4.25 (pretest) to 4.43 (posttest); Team changed from 4.66 to 4.58, Roles from 3.88 to 4.30, and Outcomes from 4.08 to 4.36. A paired t-test analysis of the total mean scores resulted in a t-statistic of 2.54, which exceeded both one-tail and two-tail critical values, indicating statistical significance (p = .001). When the factors of the SPICE-R2 were analyzed separately, only Roles (t = 4.08, p = .0001) and Outcomes (t = 3.13, p = .002) were statistically significant. The item ‘I understand the roles of other health professionals’ showed the most improvement, from a mean score of 3.76 (pretest) to 4.46 (posttest) across all students. The significant improvement in students' attitudes toward interprofessional teams suggests that the unique integration of OT and CHD students in the COPE program effectively develops a better understanding of the collaborative roles necessary for holistic client care. These results support the importance of IPE through structured, engaging interprofessional experiences. Such experiences are essential for enhancing students' readiness for collaborative practice and align with accreditation standards requiring interprofessional education in OT and CHD programs to prepare practitioners for team-based care. The findings contribute to the growing body of evidence supporting the integration of IPE into behavioral healthcare curricula to improve holistic client care and encourage students to engage in collaborative practice across healthcare settings.
Keywords: behavioral healthcare, counseling education, interprofessional education, mental health education, occupational therapy education
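The paired t-test used above compares each student's pretest score against their own posttest score. A minimal sketch of the computation on invented 5-point-scale scores (not the COPE data) follows.

```python
import math

def paired_t(pre, post):
    """Paired t-statistic: mean within-subject difference over its
    standard error, with n-1 degrees of freedom."""
    d = [b - a for a, b in zip(pre, post)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# invented pre/post scores on a 5-point SPICE-R2-style scale (not the study data)
pre  = [4.0, 4.2, 3.8, 4.1, 4.3, 3.9]
post = [4.4, 4.5, 4.0, 4.3, 4.6, 4.2]
t_stat = paired_t(pre, post)
```

The resulting t is compared against the critical value for n − 1 degrees of freedom (2.571 for df = 5, two-tailed α = .05), exactly as the study does for its total and factor scores.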
Procedia PDF Downloads 39
23929 Artificial Reproduction System and Imbalanced Dataset: A Mendelian Classification
Authors: Anita Kushwaha
Abstract:
We propose a new evolutionary computational model called the Artificial Reproduction System, which is based on the complex process of meiotic reproduction occurring between male and female cells of living organisms. The Artificial Reproduction System is an attempt at a new computational intelligence approach inspired by theoretical reproduction mechanisms and observed reproduction functions, principles, and mechanisms. A reproductive organism is programmed by genes and can be viewed as an automaton, mapping and reducing so as to create copies of those genes in its offspring. In the Artificial Reproduction System, the binding mechanism between male and female cells is studied, parameters are chosen, a network is constructed, and a feedback system for self-regularization is established. The model then applies Mendel's laws of inheritance and allele-allele associations, and can be used to perform analysis of imbalanced, multivariate, multiclass, and big data. In the experimental study, the Artificial Reproduction System is compared with state-of-the-art classifiers such as SVM, radial basis function networks, neural networks, and k-nearest neighbours on several benchmark datasets, and the comparison results indicate good performance.
Keywords: bio-inspired computation, nature-inspired computation, natural computing, data mining
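The ARS model itself is not specified here in enough algorithmic detail to reproduce, but one of the baselines it is compared against, k-nearest neighbours, is easy to sketch — including the class-imbalance issue the paper targets. All data points below are invented.

```python
def knn_predict(train, query, k=3):
    """Toy k-nearest-neighbour classifier (one of the baselines named above):
    majority vote among the k training points closest to the query."""
    ranked = sorted(train,
                    key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], query)))
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)

# toy imbalanced two-class data: 6 majority-class points, 2 minority-class
train = [((0.0, 0.0), "maj"), ((0.0, 1.0), "maj"), ((1.0, 0.0), "maj"),
         ((1.0, 1.0), "maj"), ((0.5, 0.5), "maj"), ((0.2, 0.8), "maj"),
         ((5.0, 5.0), "min"), ((5.0, 6.0), "min")]
```

With only two minority examples, any k > 3 would let the majority class outvote the minority even near its own cluster, which is precisely the imbalance effect such benchmark comparisons probe.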
Procedia PDF Downloads 272
23928 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packets Delivery and Delay for Different Applications
Authors: Omojokun Gabriel Aju
Abstract:
A communication network is a process of exchanging data between two or more devices via some form of transmission medium using communication protocols. The data may take the form of text, images, audio, video, or numbers, which can be grouped into FTP, e-mail, HTTP, VoIP, or video applications. Such data exchange is effective only if the data are delivered accurately within a specified time. Some senders will not really mind when the data actually reaches the receiving device, as long as its receipt is acknowledged by the receiver; for other senders, the time the data takes to reach the receiver can be very important, as any delay could cause serious problems or, in some cases, render the data useless. Whether data remain valid after a delay therefore depends on the type of data (information). It is thus imperative for a network device (such as a router) to be able to differentiate between packets that are time-sensitive and those that are not when they pass through the same network. This is where queuing disciplines come into play: they manage network resources when a network is designed to service widely varying types of traffic, allocating the available resources according to the configured policies. As part of its resource allocation mechanisms, a router within the network must implement some queuing discipline that governs how packets (data) are buffered while waiting to be transmitted. The queuing discipline regulates how packets are buffered and controls their transmission by determining which packets get higher or lower priority and which packets are dropped. The queuing discipline thereby controls packet latency by determining how long a packet can wait before being transmitted or dropped.
The common queuing disciplines are first-in-first-out queuing (FIFO), priority queuing (PQ), and weighted-fair queuing (WFQ). This paper critically evaluates and analyses, using the Optimized Network Evaluation Tool (OPNET) Modeller, Version 14.5, the effects of these three queuing disciplines on the performance of five different applications (FTP, HTTP, e-mail, voice, and video) within specified parameters, using packets sent, packets received, and transmission delay as performance metrics. The paper finally suggests some ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
Keywords: applications, first-in-first-out queuing (FIFO), Optimized Network Evaluation Tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)
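The difference between FIFO and PQ service order can be sketched in a few lines. The packet mix and the priority mapping (voice highest, FTP lowest) are illustrative assumptions; WFQ would additionally interleave classes in proportion to configured weights rather than strictly by priority.

```python
import heapq

# Toy packet trace: (arrival_index, application, priority);
# lower priority number = served first under PQ. Values are illustrative.
packets = [(0, "ftp", 3), (1, "voice", 0), (2, "http", 2),
           (3, "voice", 0), (4, "video", 1)]

# FIFO: serve strictly in arrival order, regardless of application class
fifo_order = [name for _, name, _ in sorted(packets)]

# PQ: always serve the highest-priority buffered packet first
# (ties broken by arrival order)
pq = [(prio, arrival, name) for arrival, name, prio in packets]
heapq.heapify(pq)
pq_order = [heapq.heappop(pq)[2] for _ in range(len(pq))]
```

Under FIFO the delay-sensitive voice packets wait behind bulk FTP traffic; under PQ they jump the queue, which is exactly the latency trade-off the OPNET simulations measure.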
Procedia PDF Downloads 358
23927 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud
Authors: N. Nalini, Bhanu Prakash Gopularam
Abstract:
The term data security refers to the degree of resistance or protection given to information against unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data are moved from organizational premises to untrusted public networks, and the attack surface has consequently increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based public key cryptography (ID-PKC) overcomes this problem by using publicly identifiable information to generate keys, and works well with decentralized systems: users can exchange information securely without having to manage any trust information. Another advantage is that access control information (such as a role-based access control policy) can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and has support for public key infrastructure for its services. In this paper, we explain the OpenStack security architecture and evaluate its PKI infrastructure with respect to data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and explain the key measures for safeguarding data against security attacks.
The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard, and secures communication to the other cloud services.
Keywords: data confidentiality, identity-based cryptography, secure communication, OpenStack Keystone, token scoping
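As a concept sketch only: in ID-PKC, a trusted private key generator (PKG) holds a master secret and derives each user's private key from a publicly identifiable string, so access-control attributes can ride inside the identity itself. The toy below mimics just that Extract step with an HMAC; real ID-PKC, including the JPBC-based schemes the paper uses, relies on bilinear pairings, and nothing here is cryptographically equivalent to them.

```python
import hashlib
import hmac

# Toy illustration of the PKG's "Extract" step in ID-PKC: a private key is
# derived from a public identity string by a trusted generator holding a
# master secret. This HMAC construction is NOT real identity-based crypto
# (pairing-based schemes like those in JPBC work very differently); it only
# shows how role-based access-control attributes can be embedded in the
# identity string itself.
MASTER_SECRET = b"pkg-master-secret"   # held only by the key generator (toy)

def extract_key(identity: str) -> bytes:
    """Derive a deterministic 32-byte key from a public identity string."""
    return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

# the identity string carries the role policy, as described in the abstract
alice_key = extract_key("alice@example.org|role=storage-admin|expires=2025-12")
```

Because the key is a deterministic function of the identity, revoking or changing a role means issuing a new identity string, which is the usual operational caveat of identity-derived keys.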
Procedia PDF Downloads 384