Search results for: outputs
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 462

102 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System

Authors: Hassan Qandil

Abstract:

Using a statistical algorithm incorporated in MATLAB, four types of non-imaging Fresnel lenses are designed: spot-flat, linear-flat, dome-shaped and semi-cylindrical-shaped. The optimization employs a statistical ray-tracing methodology of the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the Poly-methyl-methacrylate (PMMA) prisms, the main target is to maximize the ray intensity on the receiver's aperture, thereby achieving higher heat flux values. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens' and the receiver's apertures while setting conditions as per the Dallas-TX weather data. Once the lenses' characterization is finalized, receivers are designed based on the optimized lens aperture size. Several cavity shapes, including triangular, arc-shaped and trapezoidal, are tested while coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for an enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL while coupled with the optimized lens. A lab-scale prototype for the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including that of a photovoltaic-thermal cogeneration system and a solar furnace system. Finally, some future research work is pointed out, including the coupling of the collector-receiver system with an end-user power generator, and the use of a multi-layered genetic algorithm for comparative studies.
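
As an illustration of the statistical ray-tracing idea described above — a minimal sketch, not the authors' MATLAB algorithm — the following Monte Carlo hit-count estimates the fraction of incident rays reaching a receiver aperture for an idealised flat linear Fresnel lens; the lens width, receiver width, focal lengths and angular spread are assumed values:

```python
import numpy as np

def receiver_hit_fraction(lens_width, receiver_width, focal_length,
                          angular_spread_rad, n_rays=100_000, rng=None):
    """Estimate the fraction of rays striking the receiver aperture for an
    idealised flat linear Fresnel lens (2D section).  Each ray enters at a
    random position across the lens and is aimed at the focal point, with a
    random angular error lumping together the solar disc and chromatic
    aberration."""
    rng = rng or np.random.default_rng(0)
    x0 = rng.uniform(-lens_width / 2, lens_width / 2, n_rays)   # entry point on the lens
    aim = np.arctan2(-x0, focal_length)                         # ideal deflection toward the focus
    err = rng.normal(0.0, angular_spread_rad, n_rays)           # lumped angular error
    x_f = x0 + focal_length * np.tan(aim + err)                 # landing position at the focal plane
    return np.mean(np.abs(x_f) <= receiver_width / 2)

if __name__ == "__main__":
    for f in (0.5, 0.75, 1.0):   # candidate focal lengths [m]
        frac = receiver_hit_fraction(lens_width=1.0, receiver_width=0.05,
                                     focal_length=f, angular_spread_rad=0.01)
        print(f"focal length {f:.2f} m -> {frac:.1%} of rays on the receiver")
```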

Keywords: COMSOL, concentrator, energy, fresnel, optics, renewable, solar

Procedia PDF Downloads 152
101 Inversion of PROSPECT+SAIL Model for Estimating Vegetation Parameters from Hyperspectral Measurements with Application to Drought-Induced Impacts Detection

Authors: Bagher Bayat, Wouter Verhoef, Behnaz Arabi, Christiaan Van der Tol

Abstract:

The aim of this study was to follow the canopy reflectance patterns in response to soil water deficit and to detect trends of change in biophysical and biochemical parameters of grass (Poa pratensis species). We used visual interpretation, imaging spectroscopy and radiative transfer model inversion to monitor the gradual manifestation of water stress effects in a laboratory setting. Plots of 21 cm x 14.5 cm surface area with Poa pratensis plants that formed a closed canopy were subjected to water stress for 50 days. On a regular weekly schedule, canopy reflectance was measured. In addition, Leaf Area Index (LAI), Chlorophyll (a+b) content (Cab) and Leaf Water Content (Cw) were measured at regular time intervals. The 1-D bidirectional canopy reflectance model SAIL, coupled with the leaf optical properties model PROSPECT, was inverted using hyperspectral measurements by means of an iterative optimization method to retrieve vegetation biophysical and biochemical parameters. The relationships between the retrieved LAI, Cab, Cw, and Cs (senescent material) and soil moisture content were established for two separate groups: stressed and non-stressed. To differentiate the water stress condition from the non-stressed condition, a threshold was defined based on the laboratory-produced Soil Water Characteristic (SWC) curve. All parameters retrieved by model inversion using canopy spectral data showed good correlation with soil water content in the water stress condition. These parameters co-varied with soil moisture content under the stress condition (Chl: R² = 0.91, Cw: R² = 0.97, Cs: R² = 0.88 and LAI: R² = 0.48) at the canopy level. To validate the results, the relationship between the vegetation parameters measured in the laboratory and soil moisture content was established. The results were fully in agreement with the modeling outputs and confirmed the results produced by radiative transfer model inversion and spectroscopy. Since water stress changes all parts of the spectrum, we concluded that analysis of the reflectance spectrum in the VIS-NIR-MIR region is a promising tool for monitoring water stress impacts on vegetation.
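
To make the inversion step concrete, here is a minimal sketch of retrieving (LAI, Cab, Cw) by iterative least-squares fitting against a toy forward model; the real PROSPECT+SAIL coupling is not reproduced, and the placeholder spectral shapes exist only to show the optimization loop:

```python
import numpy as np
from scipy.optimize import least_squares

def forward_model(params, wavelengths):
    """Toy stand-in for PROSPECT+SAIL: maps (LAI, Cab, Cw) to a reflectance
    spectrum with a green peak, a NIR plateau and a water absorption dip."""
    lai, cab, cw = params
    green_peak = np.exp(-((wavelengths - 550) / 40.0) ** 2) / (1.0 + 0.05 * cab)
    nir_plateau = (1.0 - np.exp(-0.5 * lai)) * (wavelengths > 700)
    water_dip = 1.0 - 0.3 * cw * np.exp(-((wavelengths - 1450) / 60.0) ** 2)
    return (0.05 + 0.1 * green_peak + 0.4 * nir_plateau) * water_dip

def invert(measured, wavelengths, x0=(2.0, 40.0, 0.01)):
    """Retrieve (LAI, Cab, Cw) by iteratively minimising the spectral residual."""
    res = least_squares(
        lambda p: forward_model(p, wavelengths) - measured,
        x0=x0,
        bounds=([0.1, 5.0, 0.001], [8.0, 90.0, 0.06]),
    )
    return res.x

if __name__ == "__main__":
    wl = np.arange(400, 2401, 10.0)
    true_params = np.array([3.0, 55.0, 0.02])
    measured = forward_model(true_params, wl) + np.random.default_rng(1).normal(0, 0.001, wl.size)
    print("true   LAI, Cab, Cw:", true_params)
    print("retrieved          :", np.round(invert(measured, wl), 3))
```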

Keywords: hyperspectral remote sensing, model inversion, vegetation responses, water stress

Procedia PDF Downloads 223
100 Defence Ethics: A Performance Measurement Framework for the Defence Ethics Program

Authors: Allyson Dale, Max Hlywa

Abstract:

The Canadian public expects the highest moral standards from Canadian Armed Forces (CAF) members and Department of National Defence (DND) employees. The Chief, Professional Conduct and Culture (CPCC) stood up in April 2021 with the mission of ensuring that the defence culture and members’ conduct are aligned with the ethical principles and values that the organization aspires towards. The Defence Ethics Program (DEP), which stood up in 1997, is a values-based ethics program for individuals and organizations within the DND/CAF and now falls under CPCC. The DEP is divided into five key functional areas, including policy, communications, collaboration, training and education, and advice and guidance. The main focus of the DEP is to foster an ethical culture within defence so that members and organizations perform to the highest ethical standards. The measurement of organizational ethics is often complex and challenging. In order to monitor whether the DEP is achieving its intended outcomes, a performance measurement framework (PMF) was developed using the Director General Military Personnel Research and Analysis (DGMPRA) PMF development process. This evidence-based process is based on subject-matter expertise from the defence team. The goal of this presentation is to describe each stage of the DGMPRA PMF development process and to present and discuss the products of the DEP PMF (e.g., logic model). Specifically, first, a strategic framework was developed to provide a high-level overview of the strategic objectives, mission, and vision of the DEP. Next, Key Performance Questions were created based on the objectives in the strategic framework. A logic model detailing the activities, outputs (what is produced by the program activities), and intended outcomes of the program was developed to demonstrate how the program works. Finally, Key Performance Indicators were developed based on both the intended outcomes in the logic model and the Key Performance Questions in order to monitor program effectiveness. The Key Performance Indicators measure aspects of organizational ethics such as ethical conduct and decision-making, DEP collaborations, and knowledge and awareness of the Defence Ethics Code while leveraging ethics-related items from multiple DGMPRA surveys where appropriate.

Keywords: defence ethics, ethical culture, organizational performance, performance measurement framework

Procedia PDF Downloads 102
99 Thermal Evaluation of Printed Circuit Board Design Options and Voids in Solder Interface by a Simulation Tool

Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles

Abstract:

Quad Flat No-Lead (QFN) packages have become very popular for tuners, converters and audio amplifiers, among other applications needing efficient power dissipation in small footprints. Since semiconductor junction temperature (TJ) is a critical parameter for product quality, and to ensure that the die temperature does not exceed the maximum allowable TJ, a thermal analysis conducted in an early development phase is essential to avoid repeated re-design cycles with large losses in cost and time. A simulation tool capable of estimating the die temperature of components with QFN packages was developed. It allows establishing a non-empirical way to define an acceptance criterion for the amount of voids in the solder interface between the exposed pad and the Printed Circuit Board (PCB), to be applied during the industrialization process, and to evaluate the impact of PCB design parameters. Targeting the PCB layout designer as the end user of the application, a user-friendly graphical interface (GUI) was implemented, allowing the user to introduce design parameters in a convenient and secure way while hiding all the complexity of the finite element simulation process. This cost-effective tool makes the simulation process transparent and provides useful outputs within an acceptable time, so it can be adopted by PCB designers, preventing potential risks during the design stage and making the product economically efficient by not oversizing it. This article gathers relevant information related to the design and implementation of the developed tool, presenting a parametric study conducted with it. The simulation tool was experimentally validated using a Thermal Test Chip (TTC) in an open-cavity QFN, in order to measure the junction temperature (TJ) directly on the die under controlled and known conditions. The article provides a short overview of standard thermal solutions and their impact in exposed-pad packages (i.e. QFN), accurately describes the methods and techniques that the system designer should use to achieve optimum thermal performance, and demonstrates the effect of system-level constraints on the thermal performance of the design.
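
For intuition only, here is a first-order, lumped series-resistance sketch of how solder voids raise junction temperature — far simpler than the finite-element tool described above, and all resistance, power and limit values below are assumptions for illustration:

```python
def junction_temperature(power_w, t_ambient_c, theta_jc, theta_solder_nominal,
                         theta_board, void_fraction):
    """First-order series thermal-resistance estimate of die junction temperature.
    Voids in the solder interface reduce the conducting area, so the interface
    resistance is scaled by 1/(1 - void_fraction)."""
    theta_solder = theta_solder_nominal / max(1.0 - void_fraction, 1e-6)
    theta_ja = theta_jc + theta_solder + theta_board   # junction-to-ambient [K/W]
    return t_ambient_c + power_w * theta_ja

if __name__ == "__main__":
    T_J_MAX = 125.0   # assumed allowable junction temperature [C]
    for voids in (0.0, 0.2, 0.4, 0.6):
        tj = junction_temperature(power_w=3.0, t_ambient_c=55.0, theta_jc=2.0,
                                  theta_solder_nominal=1.5, theta_board=18.0,
                                  void_fraction=voids)
        flag = "OK" if tj <= T_J_MAX else "exceeds TJ(max)"
        print(f"void fraction {voids:.0%}: TJ ~ {tj:.1f} C ({flag})")
```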

Keywords: QFN packages, exposed pads, junction temperature, thermal management and measurements

Procedia PDF Downloads 253
98 A Geosynchronous Orbit Synthetic Aperture Radar Simulator for Moving Ship Targets

Authors: Linjie Zhang, Baifen Ren, Xi Zhang, Genwang Liu

Abstract:

Ship detection is of great significance for both military and civilian applications. Synthetic aperture radar (SAR), with its all-day, all-weather, ultra-long-range characteristics, has been used widely. In view of the low temporal resolution of low-orbit SAR and the need for SAR data with high temporal resolution, geosynchronous-orbit (GEO) SAR is getting more and more attention. Since GEO SAR has a short revisit period and a large coverage area, it is expected to be well suited to monitoring marine ship targets. However, the height of the orbit increases the integration time by almost two orders of magnitude, so for moving marine vessels the utility and efficacy of GEO SAR are still uncertain. This paper examines the feasibility of GEO SAR by presenting a GEO SAR simulator for moving ships. The presented simulator is a geometry-based radar imaging simulator, which focuses on geometric quality rather than radiometric fidelity. Inputs of the simulator are a 3D ship model (.obj format, produced by most 3D design software, such as 3D Max), the ship's velocity, and the parameters of the satellite orbit and SAR platform. Its outputs are simulated GEO SAR raw signal data and the SAR image. The simulation process is accomplished in the following four steps. (1) Reading the 3D model, including the ship rotation (pitch, yaw, and roll) and velocity (speed and direction) parameters, and extracting the information of those small primitives (triangles) which are visible from the SAR platform. (2) Computing the radar scattering from the ship with the physical optics (PO) method. In this step, the vessel is sliced into many small rectangular primitives along the azimuth, and the radiometric calculation of each primitive is carried out separately. Since this simulator only focuses on the complex structure of ships, only single-bounce and double-bounce reflections are considered. (3) Generating the raw data with GEO SAR signal modeling. Since the usual 'stop-and-go' model is not valid for GEO SAR, the range model has to be reconsidered. (4) Finally, generating the GEO SAR image with an improved Range-Doppler method. Numerical simulations of a fishing boat and a cargo ship are given. GEO SAR images for different postures, velocities, satellite orbits, and SAR platforms are simulated. By analyzing these simulated results, the effectiveness of GEO SAR for the detection of moving marine vessels is evaluated.
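
For intuition on steps (3)-(4), the azimuth-only sketch below generates the slow-time phase history of a single point scatterer and compresses it with a reference matched to a stationary target, showing how a ship's radial velocity displaces the compressed response; the wavelength, integration time, PRF and acceleration values are assumptions, and this is not the authors' simulator:

```python
import numpy as np

WAVELENGTH = 0.24    # assumed L-band wavelength [m]
T_INT = 300.0        # assumed GEO integration time [s]
PRF = 50.0           # assumed pulse repetition frequency [Hz]

t = np.arange(-T_INT / 2, T_INT / 2, 1.0 / PRF)   # slow time
R0 = 36_000e3                                     # slant range [m]
A_RES = 2.0e-3                                    # assumed residual range acceleration [m/s^2]

def range_history(v_ship_radial):
    """Slant-range history of one point scatterer; the ship's radial velocity
    adds a linear term on top of the platform's residual quadratic term."""
    return R0 + v_ship_radial * t + 0.5 * A_RES * t**2

def azimuth_compress(v_ship_radial):
    raw = np.exp(-1j * 4 * np.pi * range_history(v_ship_radial) / WAVELENGTH)
    ref = np.exp(-1j * 4 * np.pi * range_history(0.0) / WAVELENGTH)  # stationary-target reference
    spec = np.abs(np.fft.fftshift(np.fft.fft(raw * np.conj(ref))))
    return spec / spec.max()

for v in (0.0, 0.5, 2.0):   # ship radial velocity [m/s]
    focused = azimuth_compress(v)
    shift_bins = int(np.argmax(focused)) - focused.size // 2
    print(f"v_radial = {v:.1f} m/s -> compressed peak displaced by {shift_bins} Doppler bins")
```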

Keywords: GEO SAR, radar, simulation, ship

Procedia PDF Downloads 175
97 End-User Tools to Empower and Raise Awareness of Behavioural Change towards Energy Efficiency

Authors: G. Calleja-Rodriguez, N. Jimenez-Redondo, J. J. Peralta Escalante

Abstract:

This research work aims at developing a solution to exploit the potential energy savings related to occupant behaviour, estimated at between 5 and 30% according to existing studies. For that purpose, the following methodology has been followed: 1) literature review and gap analysis, 2) definition of the concept and functional requirements, 3) evaluation and feedback by experts. As a result, the concept has been defined for a tool-box that implements continuous behaviour-change interventions, named engagement methods, based on increasing energy literacy, increasing energy visibility, using bonus systems, etc. These engagement methods are deployed through a set of ICT tools: Building Automation and Control System (BACS) add-on services installed in buildings and user apps installed on smartphones, smart TVs or dashboards. The tool-box, called eTEACHER, identifies energy conservation measures (ECMs) based on behavioural energy change through a what-if analysis that collects information about the building and its users (comfort feedback, behaviour, etc.) and carries out cost-effectiveness calculations to provide outputs such as efficient control settings of building systems. This information is processed and shown in an attractive way as tailored advice to the energy end-users. eTEACHER's goal is therefore to change the behaviour of a building's energy users towards energy efficiency, comfort and better health conditions by deploying customized ICT-based interventions that take into account the building typology (schools, residential, offices, health care centres, etc.) and the user profile (occupants, owners, facility managers, employers, etc.), as well as cultural and demographic factors. One of the main findings of this work is that technological interventions for behavioural change commonly fail to consult, train and support users regarding the technological changes, leading to poor performance in practice. As a conclusion, a strong need has been identified to carry out social studies to identify relevant behavioural issues and effective pro-environmental behaviour-change strategies.
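
As a small illustration of the kind of what-if, cost-effectiveness calculation mentioned above, the sketch below ranks a few energy conservation measures by simple payback; the measure names, savings shares, costs and building consumption are invented assumptions, not eTEACHER data:

```python
# Minimal what-if sketch for ranking energy conservation measures (ECMs) by simple payback.
ECMS = [
    # (name, estimated saving as a share of annual use, implementation cost in EUR) -- illustrative
    ("night set-back of heating set-points", 0.08, 150.0),
    ("visibility dashboard + bonus scheme",  0.05, 900.0),
    ("daylight-based lighting control",      0.12, 2500.0),
]

ANNUAL_USE_KWH = 120_000.0   # assumed annual building consumption
PRICE_EUR_KWH = 0.25         # assumed energy price

def payback_years(saving_share, cost):
    annual_saving_eur = saving_share * ANNUAL_USE_KWH * PRICE_EUR_KWH
    return cost / annual_saving_eur

for name, share, cost in sorted(ECMS, key=lambda e: payback_years(e[1], e[2])):
    print(f"{name}: saves ~{share:.0%} ({share * ANNUAL_USE_KWH:,.0f} kWh/yr), "
          f"payback ~ {payback_years(share, cost):.1f} years")
```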

Keywords: energy saving, behavioral change, building users, engagement methods, energy conservation measures

Procedia PDF Downloads 170
96 An Assessment of the Impacts of Agro-Ecological Practices towards the Improvement of Crop Health and Yield Capacity: A Case of Mopani District, Limpopo, South Africa

Authors: Tshilidzi C. Manyanya, Nthaduleni S. Nethengwe, Edmore Kori

Abstract:

The UNFCCC, FAO, GCF, IPCC and other global structures advocate for agro-ecology to address food security and sovereignty. However, most of the expected outcomes concerning agro-ecology have not been empirically tested for universal application. Crop health is theorised to increase on agro-ecological farms and to decrease on conventional farms. Increased crop health means increased carbon sequestration and thus less CO2 in the atmosphere. This is in line with the view that global warming is anthropogenically enhanced through GHG emissions. Agro-ecology mainly affects crop health, soil carbon content and yield on the cultivated land. Economic sustainability is directly related to yield capacity, which is theorised to increase by 3-10% within 3-10 years of agro-ecological implementation. This study aimed to empirically assess the practicality and validity of these assumptions. The study utilised mainly GIS and RS techniques to assess the effectiveness of agro-ecology in crop health improvement from satellite images. The assessment involved a longitudinal study (2013-2015) assessing the changes that occur after a farm retrofits from conventional agriculture to agro-ecology. The assumptions guided the objectives of the study. For each objective, an agro-ecological farm was compared with a conventional farm under the same climatic conditions and occupying the same general location. Crop health was assessed using satellite images analysed through ArcGIS and Erdas. This entailed the production of NDVI and reclassified outputs of the farm area. The NDVI ranges of the entire study period were then compared in a stacked histogram for each farm to assess trends. Yield capacity was calculated based on the production records acquired from the farmers and plotted in a stacked bar graph as percentages of the total for each farm. The results of the study showed decreasing crop health trends over 80% of the conventional farms and an increase over 80% of the organic farms. Yield capacity showed similar patterns to those of crop health. The study thus showed that agro-ecology is an effective strategy for crop health improvement and yield increase.
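
A minimal sketch of the NDVI computation and re-classification step is shown below with synthetic reflectance values; the study's actual scenes and the ArcGIS/Erdas workflow are not reproduced, and the class bins and yearly vigour values are assumptions:

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index per pixel; bands as float arrays."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def summarise(nir, red, year):
    v = ndvi(nir, red)
    # Re-classify into broad vigour bins of the kind used for stacked trend histograms.
    bins = np.array([-1.0, 0.2, 0.4, 0.6, 1.0])
    counts, _ = np.histogram(v, bins=bins)
    share = counts / v.size
    print(f"{year}: mean NDVI {v.mean():.2f}, class shares {np.round(share, 2)}")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    for year, vigor in ((2013, 0.45), (2014, 0.50), (2015, 0.55)):   # synthetic farm scenes
        red = rng.uniform(0.05, 0.15, (100, 100))
        nir = red * (1 + vigor) / (1 - vigor)    # back out NIR so NDVI centres on `vigor`
        summarise(nir, red, year)
```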

Keywords: agro-ecosystem, conventional farm, dialectical, sustainability

Procedia PDF Downloads 215
95 Cluster Analysis and Benchmarking for Performance Optimization of a Pyrochlore Processing Unit

Authors: Ana C. R. P. Ferreira, Adriano H. P. Pereira

Abstract:

Given the frequent variation of mineral properties throughout the Araxá pyrochlore deposit, high variability in quality and performance is expected, even when good homogenization work has been carried out before feeding the processing plants. These results could be improved and standardized if the blend composition parameters that most influence the processing route are determined, the types of raw material are then grouped by them, and finally a reference set of operational settings is presented for each group. Associating the physical and chemical parameters of a unit operation with a benchmark, or even an optimal reference of metallurgical recovery and product quality, results in lower production costs, better use of the mineral resource, and greater stability in the subsequent processes of the production chain that uses the mineral of interest. Conducting a comprehensive exploratory data analysis to identify which characteristics of the ore are most relevant to the process route, combined with the use of Machine Learning algorithms for grouping the raw material (ore) and associating these groups with reference variables in the process benchmark, is a reasonable alternative for the standardization and improvement of mineral processing units. Clustering methods based on Decision Trees and K-Means were employed, associated with algorithms based on benchmarking theory, with criteria defined by the process team in order to reference the best adjustments for processing the ore piles of each cluster. A clean user interface was created to obtain the outputs of the created algorithm. The results were measured through the average time to adjust and stabilize the process after a new pile of homogenized ore enters the plant, as well as the average time needed to achieve the best processing result. Direct gains in the metallurgical recovery of the process were also measured. The results were promising, with a reduction in the adjustment and stabilization time when starting the processing of a new ore pile, as well as in the time to reach the benchmark. Also noteworthy are the gains in metallurgical recovery, which reflect a significant saving in ore consumption and a consequent reduction in production costs, hence a more rational use of the tailings dams and life optimization of the mineral deposit.
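
An illustrative sketch of the clustering-plus-benchmark idea using scikit-learn K-Means on synthetic blend parameters is given below; the column names, values and the top-decile benchmark rule are invented for the example and are not the plant's assay data or criteria:

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Synthetic ore-pile blend parameters (illustrative column names and values).
rng = np.random.default_rng(7)
piles = pd.DataFrame({
    "Nb2O5_grade": rng.normal(2.5, 0.4, 200),
    "P2O5":        rng.normal(4.0, 1.0, 200),
    "Fe2O3":       rng.normal(30.0, 5.0, 200),
    "BaSO4":       rng.normal(12.0, 3.0, 200),
    "recovery":    rng.normal(60.0, 8.0, 200),   # historical metallurgical recovery [%]
})

features = ["Nb2O5_grade", "P2O5", "Fe2O3", "BaSO4"]
X = StandardScaler().fit_transform(piles[features])
piles["cluster"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)

# Benchmark per cluster: here, the recovery achieved by the top decile of piles
# in that cluster serves as the operational reference.
for cid, group in piles.groupby("cluster"):
    benchmark = group["recovery"].quantile(0.9)
    print(f"cluster {cid}: {len(group)} piles, benchmark recovery {benchmark:.1f}%")
```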

Keywords: mineral clustering, machine learning, process optimization, pyrochlore processing

Procedia PDF Downloads 143
94 Urban Logistics Dynamics: A User-Centric Approach to Traffic Modelling and Kinetic Parameter Analysis

Authors: Emilienne Lardy, Eric Ballot, Mariam Lafkihi

Abstract:

Efficient urban logistics requires a comprehensive understanding of traffic dynamics, particularly as it pertains to kinetic parameters influencing energy consumption and trip duration estimations. While real-time traffic information is increasingly accessible, current high-precision forecasting services embedded in route planning often function as opaque 'black boxes' for users. These services, typically relying on AI-processed counting data, fall short in accommodating open design parameters essential for management studies, notably within Supply Chain Management. This work revisits the modelling of traffic conditions in the context of city logistics, emphasizing its significance from the user’s point of view, with two focuses. Firstly, the focus is not on the vehicle flow but on the vehicles themselves and the impact of the traffic conditions on their driving behaviour. This means opening the range of studied indicators beyond vehicle speed, to describe extensively the kinetic and dynamic aspects of the driving behaviour. To achieve this, we leverage the Art. Kinema parameters, which are designed to characterize driving cycles. Secondly, this study examines how the driving context (i.e., factors exogenous to the traffic flow) determines the mentioned driving behaviour. Specifically, we explore how accurately the kinetic behaviour of a vehicle can be predicted based on a limited set of exogenous factors, such as time, day, road type, orientation, slope, and weather conditions. To answer this question, statistical analysis was conducted on real-world driving data, which includes high-frequency measurements of vehicle speed. A Factor Analysis and a Generalized Linear Model have been established to link kinetic parameters with independent categorical contextual variables. The results include an assessment of the adjustment quality and the robustness of the models, as well as an overview of the model’s outputs.
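
A hedged sketch of the two statistical steps — a factor analysis of correlated kinematic indicators, followed by a GLM on categorical driving-context variables — is given below on synthetic trip data; variable names and effect sizes are assumptions, not the study's measurements:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from sklearn.decomposition import FactorAnalysis

# Synthetic stand-in for high-frequency speed recordings aggregated per trip.
rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "road_type": rng.choice(["urban", "ring", "highway"], n),
    "peak_hour": rng.choice([0, 1], n),
    "rain":      rng.choice([0, 1], n, p=[0.8, 0.2]),
})
base = df["road_type"].map({"urban": 22.0, "ring": 45.0, "highway": 80.0})
df["mean_speed"] = base - 6 * df["peak_hour"] - 3 * df["rain"] + rng.normal(0, 4, n)
df["rpa"] = 0.30 - 0.002 * base + 0.05 * df["peak_hour"] + rng.normal(0, 0.03, n)  # relative positive acceleration
df["idle_share"] = np.clip(0.25 - 0.002 * base + 0.08 * df["peak_hour"] + rng.normal(0, 0.05, n), 0, 1)

# Step 1: factor analysis compresses the correlated kinematic indicators into one score.
kin = df[["mean_speed", "rpa", "idle_share"]]
fa = FactorAnalysis(n_components=1, random_state=0)
df["kinetic_factor"] = fa.fit_transform((kin - kin.mean()) / kin.std()).ravel()

# Step 2: a GLM links the factor score to the categorical driving context.
model = smf.glm("kinetic_factor ~ C(road_type) + C(peak_hour) + C(rain)",
                data=df, family=sm.families.Gaussian()).fit()
print(model.summary())
print("pseudo R^2:", round(1 - model.deviance / model.null_deviance, 3))
```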

Keywords: factor analysis, generalised linear model, real world driving data, traffic congestion, urban logistics, vehicle kinematics

Procedia PDF Downloads 63
93 Establishing a Communication Framework in Response to the COVID-19 Pandemic in a Tertiary Government Hospital in the Philippines

Authors: Nicole Marella G. Tan, Al Joseph R. Molina, Raisa Celine R. Rosete, Soraya Elisse E. Escandor, Blythe N. Ke, Veronica Marie E. Ramos, Apolinario Ericson B. Berberabe, Jose Jonas D. del Rosario, Regina Pascua-Berba, Eileen Liesl A. Cubillan, Winlove P. Mojica

Abstract:

Emergency risk and health communications play a vital role in any pandemic response. However, the Philippine General Hospital (PGH) lacked a system of information delivery that could effectively fulfill the hospital’s communication needs as a COVID-19 referral hospital. This study aimed to describe the establishment of a communication framework for information dissemination within a tertiary government hospital during the COVID-19 pandemic and to evaluate the perceived usefulness of its outputs. This is a mixed quantitative-qualitative study with two phases. Phase 1 documented the formation and responsibilities of the Information Education Communication (IEC) Committee. Phase 2 evaluated its output and outcomes through a hospital-wide survey of 528 healthcare workers (HCWs) using a pre-tested questionnaire. In-depth explanations were obtained from five focus group discussions (FGDs) amongst various HCW subgroups. Descriptive analysis was done using STATA 16 while qualitative data were synthesized thematically. Communication practices in PGH were loosely structured at the beginning of the pandemic until the establishment of the IEC Committee. The IEC Committee was well-represented by concerned stakeholders. Nine types of infographics tackled different aspects of the hospital’s health operations after thorough inputs from concerned offices. Internal and external feedback mechanisms ensured accurate infographics. The majority of the survey respondents (98.67%) perceived these as useful in their work or daily lives. FGD participants cited the relevance of infographics to their occupations, suggested improvements, and hoped that these efforts would be continued in the future. Sustainability and comprehensive reach were the main concerns in this undertaking. The PGH COVID-19 IEC framework was developed through trial and testing as there were no existing formal structures to communicate health risks and to properly direct the HCWs in the chaotic time of a pandemic. It is a continuously evolving framework which is perceived as useful by HCWs and is hoped to be sustained in the future.

Keywords: COVID-19, pandemic, health communication, infographics, social media

Procedia PDF Downloads 123
92 Development of a New Margarine with Added Date Seed Oil: Characteristics and Chemical Composition of Date Seed Oil

Authors: Hamitri-Guerfi Fatiha, Madani Khodir, Hadjal Samir, Kati Djamel, Youyou Ahcene

Abstract:

Date palm (Phoenix dactylifera) is a principal fruit crop grown in many regions of the world, resulting in a surplus production of dates. Algeria is considered to be one of the date-producing countries. Date seeds (pits) have been a problem for the date industry as a waste stream. However, finding a way to profit from the pits would benefit date farmers substantially. This work concentrated on the valorization of date seed oils. A preliminary study was carried out on three varieties (soft, half-soft, and dry), and the dry variety was selected. The work thus concerns the valorization of the seed oil of the dry variety 'Mech-Degla' through its incorporation into a food formulation: table margarine. Lipid extraction was carried out by hot Soxhlet extraction; the extracts obtained are rich in fat, with yields of 13.21±0.21%. The antioxidant activity of the extracted oils was studied by the DPPH test, together with the polyphenol content and the anti-radical activity. The fatty acid analysis was performed by gas chromatography (GC). The recovered fat contents are therefore considerable and of interest. A 'BIO' margarine formulation was developed at industrial scale by adding the 'Mech-Degla' date seed extracts in order to substitute a synthetic additive. The physicochemical characteristics of the elaborated margarines prove to be in conformity with the standards set by the Algerian companies. The elaborated margarine has an acceptable colour, a brilliant and homogeneous appearance, is plastic and easy to spread, has the required solid fat content (SFC) index, and melts easily in the mouth. Moreover, the oxidative stability was evaluated by the Rancimat test. The results showed that the margarine enriched with date seed oil proved more resistant to oxidation than the margarine without extract, and stability improved further when both extracts were incorporated simultaneously. In conclusion, considering the polyphenol content found in the two extracts (aqueous and oily), we encourage the scientific community to take note of the riches of our country, especially those of the south, namely dates and their by-products (pits).

Keywords: antioxidant activity, date seed oil, quality characteristics, margarine

Procedia PDF Downloads 413
91 Data Model to Predict Customized Skin Care Products Using Biosensors

Authors: Ashi Gautam, Isha Shukla, Akhil Seghal

Abstract:

Biosensors are analytical devices that use a biological sensing element to detect and measure a specific chemical substance or biomolecule in a sample. These devices are widely used in various fields, including medical diagnostics, environmental monitoring, and food analysis, due to their high specificity, sensitivity, and selectivity. In this research paper, a machine learning model is proposed for predicting the suitability of skin care products based on biosensor readings. The proposed model takes in features extracted from biosensor readings, such as biomarker concentration, skin hydration level, inflammation presence, sensitivity, and free radicals, and outputs the most appropriate skin care product for an individual. This model is trained on a dataset of biosensor readings and corresponding skin care product information. The model's performance is evaluated using several metrics, including accuracy, precision, recall, and F1 score. The aim of this research is to develop a personalised skin care product recommendation system using biosensor data. By leveraging the power of machine learning, the proposed model can accurately predict the most suitable skin care product for an individual based on their biosensor readings. This is particularly useful in the skin care industry, where personalised recommendations can lead to better outcomes for consumers. The developed model is based on supervised learning, which means that it is trained on a labeled dataset of biosensor readings and corresponding skin care product information. The model uses these labeled data to learn patterns and relationships between the biosensor readings and skin care products. Once trained, the model can predict the most suitable skin care product for an individual based on their biosensor readings. The results of this study show that the proposed machine learning model can accurately predict the most appropriate skin care product for an individual based on their biosensor readings. The evaluation metrics used in this study demonstrate the effectiveness of the model in predicting skin care products. This model has significant potential for practical use in the skin care industry for personalised skin care product recommendations. The proposed machine learning model for predicting the suitability of skin care products based on biosensor readings is a promising development in the skin care industry. The model's ability to accurately predict the most appropriate skin care product for an individual based on their biosensor readings can lead to better outcomes for consumers. Further research can be done to improve the model's accuracy and effectiveness.
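
A minimal sketch of the supervised-learning pipeline described above is given below, with synthetic biosensor features and invented product labels; the real dataset, labelling rule and model are not reproduced:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Synthetic biosensor readings (feature order mirrors the abstract; values and
# product names are invented for illustration only).
rng = np.random.default_rng(0)
n = 1200
X = np.column_stack([
    rng.normal(50, 15, n),     # biomarker concentration
    rng.uniform(0, 100, n),    # skin hydration level
    rng.integers(0, 2, n),     # inflammation presence
    rng.uniform(0, 10, n),     # sensitivity score
    rng.normal(5, 2, n),       # free-radical index
])
# Simple labelling rule standing in for the labelled product catalogue.
y = np.where(X[:, 1] < 35, "intense moisturiser",
    np.where(X[:, 2] == 1, "soothing barrier cream", "light daily lotion"))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=1, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=1).fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))   # accuracy, precision, recall, F1
```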

Keywords: biosensors, data model, machine learning, skin care

Procedia PDF Downloads 95
90 Reconstruction of Performance-Based Budgeting in Indonesian Local Government: Application of Soft Systems Methodology in Producing Guideline for Policy Implementation

Authors: Deddi Nordiawan

Abstract:

Effective public policy creation requires a strong budget system, both in terms of design and implementation. Performance-based budgeting is an evolutionary approach with two substantial characteristics: first, the strong integration between budgeting and planning, and second, its existence as guidance so that all activities and expenditures refer to measurable performance targets. There are four processes in government that should be followed in order to make the budget performance-based. These four processes consist of the preparation of a vision according to a bold aspiration, the formulation of outcomes, the determination of outputs based on an analysis of organizational resources, and the formulation of a Value Creation Map that contains a series of programs and activities. This is consistent with the logic model concept, which holds that budget performance should be placed within a relational framework of resources, activities, outputs, outcomes and impacts. Through the issuance of Law 17/2003 regarding State Finance, local governments in Indonesia have to implement performance-based budgets. The central government then issued Government Regulation 58/2005, which contains detailed guidelines on how to prepare local government budgets. After a decade, the implementation of performance budgeting in local governments still does not fully meet expectations, even though the guidance is complete, socialization is routinely performed, and trainings have also been carried out at all levels. Accordingly, this study views the practice of performance-based budgeting in local governments as a problematic situation. This condition must be approached with a systems approach that allows solutions from many points of view. Based on the fact that the budgeting infrastructure is already settled, the study then treats the situation as a complexity. Therefore, the intervention needs to be made in the area of the human activity system. Using Soft Systems Methodology, this research will reconstruct the performance-based budgeting process at local governments in the area of the human activity system. Through conceptual models, this study will invite all actors (central government, local governments, and the parliament) to dialogue and formulate interventions in human activity systems that are systemically desirable and culturally feasible. The result will direct the central government in revising the guidance for the local government budgeting process, as well as serve as a reference for building the capacity-building strategy.

Keywords: soft systems methodology, performance-based budgeting, Indonesia, public policy

Procedia PDF Downloads 252
89 Conservation of Sea Turtle in Cox’s Bazar-Teknaf Peninsula and Sonadia Island Ecologically Critical Area (ECA) of Bangladesh

Authors: Pronob Kumar Mozumder, M. Nazrul Islam, M. Abdur Rob Mollah

Abstract:

This study was conducted in the Cox’s Bazar-Teknaf Peninsula and Sonadia Island Ecologically Critical Areas during the period from October 2011 to June 2013. Six species of marine turtle are found in the Indian Ocean. Among them, the olive ridley (Lepidochelys olivacea) is listed as endangered in the IUCN Red List of Threatened Species. Marine turtle populations in the Indian Ocean have been depleted through long-term exploitation of eggs and adults, incidental capture (fisheries bycatch) and many other sources of mortality. The specific objective of the study was to conserve sea turtles, especially the olive ridley (Lepidochelys olivacea), with a view to contributing towards the protection of the species from extinction and to facilitating the hatching of eggs by providing protection to turtle eggs or nests through ex-situ conservation efforts. In order to achieve the desired outputs, a total of five turtle hatcheries were established at the Pechardwip, Khurermukh, Hazompara, Bodormokam, and Sonadia Eastpara sites. In total, 31,853 eggs were collected from 260 nests and transferred to the five hatcheries. The number of eggs per nest varied from 38 to 190, with an average clutch size of 122 eggs per nest. Hatching of eggs took place from January to June, with a peak in April. Sea turtle eggs were incubated by metabolic heat and the heat of the sun. The incubation period of turtle eggs in the Cox’s Bazar-Teknaf Peninsula and Sonadia Island ECAs extended from 54 to 75 days depending on the month, with an average of 66 days. During the study period the temperature in the ECAs varied between 10.5-34.5°C. A total of 27,937 turtle hatchlings were produced from the five hatcheries, and all hatchlings produced were released into the sea. Hatching rates varied from 74-98% depending on the location and month, with an average of 88%. Sea turtles spend the majority of their lives in the sea, only emerging on beaches to nest. Despite the intense conservation efforts on the beaches, some populations have still declined to the edge of extinction. Proper conservation and awareness measures should therefore be taken to prevent turtle extinction.

Keywords: conservation of sea turtle, Bangladesh, ecologically critical area, ECA, Lepidochelys olivacea

Procedia PDF Downloads 512
88 Simulation of Optimum Sculling Angle for Adaptive Rowing

Authors: Pornthep Rachnavy

Abstract:

The purpose of this paper is twofold. First, we believe that there is a significant relationship between sculling angle and sculling style in adaptive rowing. Second, we introduce a methodology, namely simulation, to identify the effectiveness of adaptive rowing. For our study we simulate the arms-only single scull of adaptive rowing. The method for rowing the 1000 meters fastest was investigated by studying the sculling angle using simulation modeling. A simulation model of the rowing system was developed using the Matlab software package, based on equations of motion that include many variables affecting boat motion, such as oar length, blade velocity and sculling style. The boat speed, power and energy consumption of the system were computed. This simulation model can predict the force acting on the boat. The optimum sculling angle was determined by computer simulation. Inputs to the model are the sculling style of each rower and the sculling angle. The output of the model is the boat velocity over 1000 meters. The present study suggests that an optimum sculling angle exists and depends on the sculling style. The optimum angles for blade entry and release with respect to the perpendicular through the pin for the first style are -57.00 and 22.0 degrees. The optimum angles for blade entry and release with respect to the perpendicular through the pin for the second style are -57.00 and 22.0 degrees. The optimum angles for blade entry and release with respect to the perpendicular through the pin for the third style are -51.57 and 28.65 degrees. The optimum angles for blade entry and release with respect to the perpendicular through the pin for the fourth style are -45.84 and 34.38 degrees. A theoretical simulation for rowing has been developed and presented. The results suggest that it may be advantageous for rowers to select sculling angles appropriate to their sculling styles. The optimum sculling angles of a rower depend on the sculling style used by that rower. The findings of this paper can be summarized in three points: 1. There is an optimum sculling angle in the arms-only single scull of adaptive rowing. 2. The optimum sculling angles depend on the sculling styles. 3. Computer simulation of rowing can identify opportunities for improving rowing performance by utilizing the kinematic description of rowing. The freedom to explore alternatives in speed, thrust and timing with the computer simulation will provide the coach with a tool for systematic assessment of rowing technique. In addition, the ability to use the computer to examine the very complex movements during rowing will help both the rower and the coach to conceptualize the components of movements that may have been previously unclear or even undefined.
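
To illustrate the angle-sweep idea only, the toy steady-state model below computes 1000 m time as a function of entry and release angle; it is a drastic simplification of the Matlab equations of motion, all parameter values are assumptions, and the resulting angles are not the study's results:

```python
import numpy as np

OMEGA = 2.0      # assumed oar angular speed during the drive [rad/s]
RATE = 36 / 60   # assumed stroke rate [strokes/s]
F0 = 300.0       # assumed blade-force limit [N]
P_MAX = 250.0    # assumed sustainable rower power [W]
L_OAR = 2.0      # assumed outboard lever arm [m]
K_DRAG = 3.5     # assumed hull drag coefficient [N s^2/m^2]

def time_1000m(entry_deg, release_deg):
    """Angles are measured from the perpendicular through the pin; only the
    F*cos(theta) component of the blade force propels the hull, and the force
    is capped either by strength or by the rower's mean power budget."""
    a, b = np.radians(entry_deg), np.radians(release_deg)
    arc = b - a
    force = min(F0, P_MAX / (L_OAR * arc * RATE))
    impulse = force * (np.sin(b) - np.sin(a)) / OMEGA   # per-stroke impulse along the boat axis
    mean_propulsion = impulse * RATE
    v = np.sqrt(mean_propulsion / K_DRAG)               # steady speed where drag balances propulsion
    return 1000.0 / v

candidates = [(time_1000m(e, r), e, r)
              for e in range(-70, -9, 5) for r in range(10, 51, 5)]
t, e, r = min(candidates)
print(f"fastest 1000 m in this toy model: {t:.0f} s at entry {e} deg, release {r} deg")
```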

Keywords: simulation, sculling, adaptive, rowing

Procedia PDF Downloads 464
87 Integrating Participatory Action and Arts-Based Research: A Methodology for Investigating Generative AI in Elementary Art Education

Authors: Jihane Mossalim

Abstract:

This study proposes a methodological framework that combines Participatory Action Research (PAR) with Arts-Based Research (ABR) to explore the potential of generative AI in elementary art education. By integrating PAR, this framework emphasizes elementary school students’ active participation as co-researchers, engaging with AI technologies and reflecting on their creative journeys. PAR’s iterative cycles of planning, action, observation, and reflection provide a solid structure for involving children in the research process, ensuring that the study is inclusive and reflective of the children’s perspectives. Arts-Based Research, on the other hand, allows for the exploration of AI not just as a tool but as a medium of creative expression. ABR’s emphasis on visual, performative, and creative outputs complements PAR’s inclusive approach, offering a dynamic and flexible way of studying the intersection of technology and art in educational contexts. This combination is particularly valuable as it encourages students to express their ideas and emotions through art, making the learning process more engaging and personally meaningful. Despite the recognized benefits of both PAR and ABR, there remains a notable gap in research that applies these methodologies in combination with elementary school students, particularly in the context of emerging technologies like generative AI. Addressing this gap is crucial, as integrating these approaches can lead to more inclusive and innovative educational practices that cater to the diverse needs of young learners. This chapter seeks to demonstrate how integrating PAR and ABR can empower young learners, giving them a voice in the research process while enriching their creative and critical thinking skills. This chapter will develop a methodology that integrates both theoretical and practical aspects of PAR and ABR, highlighting the challenges and opportunities that emerge when these approaches are integrated. It will also discuss how to adapt these methods for research in the elementary art education, providing a foundation for future inquiry. Further, the chapter will focus on situating these methodological developments in relation to a study that seeks to understand the potential of generative AI in fostering creativity, collaboration, and critical thinking among young learners. Ultimately, this work aims to provide a pioneering example that inspires further exploration and development of educational practices in the digital age.

Keywords: participatory action research, arts-based research, generative AI, elementary art education

Procedia PDF Downloads 24
86 The Contribution of Boards to Company Performance via Strategic Management

Authors: Peter Crow

Abstract:

Boards and directors have been subjects of much scholarly research and public interest over several decades, more so since the succession of high profile company failures of the early 2000s. An array of research outputs including information, correlations, descriptions, models, hypotheses and theories have been reported. While some of this research has shed light on aspects of the board–performance relationship and on board tasks and behaviours, the nature and characteristics of the supposed board–performance relationship remain undetermined. That satisfactory explanations of how boards influence company performance have yet to emerge is a significant blind spot. Yet the board is ultimately responsible for company performance, in accordance with the wishes of shareholders. The aim of this paper is to explore corporate governance and board practice through the lens of strategic management, and to take tentative steps towards a new conception of corporate governance. The findings of a recent longitudinal multiple-case study designed to explore the board’s involvement in strategic management are reported. Qualitative and quantitative data was collected from two quasi-public large companies in New Zealand including from first-hand observations of boards in session, semi-structured interviews with chief executives and chairmen and the inspection of company and board documentation. A synthetic timeline framework was used to collate the financial, board structure, board activity and decision-making data, in order to provide a holistic perspective. Decision sequences were identified, and realist techniques of abduction and retroduction were iteratively applied to analyse the multi-year data set. Using several models previously proposed in the literature as a guide, conjectures were formed, tested and refined—the culmination of which was a provisional model of how boards can influence performance via strategic management. The model builds on both existing theoretical perspectives and theoretical models proposed in the corporate governance and strategic management literature. This paper seeks to add to the understanding of how boards can make meaningful contributions to value creation via strategic management, and to comment on the qualities of directors, social interactions in boardrooms and other circumstances within which influence might be possible given the highly contingent relationship between board activity and business performance outcomes.

Keywords: board practice, case study, corporate governance, strategic management

Procedia PDF Downloads 224
85 Investigating Effects of Vehicle Speed and Road PSDs on Response of a 35-Ton Heavy Commercial Vehicle (HCV) Using Mathematical Modelling

Authors: Amal G. Kurian

Abstract:

The use of mathematical modeling has seen a considerable boost in recent times with the development of many advanced algorithms and modeling capabilities. The advantage this method has over other methods is that it is much closer to standard physics theories and thus represents a better theoretical model. It takes less solving time and allows various parameters to be changed for optimization, which is a big advantage, especially in the automotive industry. This thesis work focuses on a thorough investigation of the effects of vehicle speed and road roughness on the ride and structural dynamic responses of a heavy commercial vehicle. Since commercial vehicles are kept in operation continuously for long periods of time, it is important to study the effects of various physical conditions on the vehicle and its user. For this purpose, various experimental as well as simulation methodologies are adopted, ranging from experimental transfer path analysis to various road scenario simulations. To effectively investigate and eliminate several causes of unwanted responses, an efficient and robust technique is needed. Carrying forward this motivation, the present work focuses on the development of a mathematical model of a 4-axle heavy commercial vehicle (HCV) capable of calculating the responses of the vehicle for different road PSD inputs and vehicle speeds. Outputs from the model will include response transfer functions and PSDs, as well as the wheel forces experienced. A MATLAB code will be developed to implement the objectives in a robust and flexible manner, which can be exploited further in a study of responses due to various suspension parameters, loading conditions as well as vehicle dimensions. The thesis work resulted in quantifying the effect of various physical conditions on the ride comfort of the vehicle. An increase in discomfort is seen with increasing velocity; the road profile also has a considerable effect on the comfort of the driver. Details of the dominant modes at each frequency are analysed and discussed in the work. The reduction in ride height, i.e. the deflection of tire and suspension with loading, along with the load on each axle, is analysed, and it is seen that the front axle supports a greater portion of the vehicle weight, while more of the payload weight is carried by the fourth and third axles. The deflection of the vehicle is seen to be well inside acceptable limits.
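
For intuition, the single-corner sketch below shows how an ISO 8608-type road displacement PSD and the vehicle speed map to an RMS body acceleration; the per-corner mass, stiffness and damping are assumed values, and the paper's full 4-axle model is far richer than this one degree of freedom:

```python
import numpy as np

# Assumed per-corner sprung mass [kg], suspension stiffness [N/m] and damping [Ns/m].
M, K, C = 4500.0, 4.0e5, 3.0e4

def rms_body_acceleration(speed_mps, Gd_n0=64e-6, n0=0.1):
    """Gd_n0: road displacement PSD at spatial frequency n0 [m^3] (ISO 8608 class)."""
    f = np.linspace(0.1, 30.0, 3000)                    # temporal frequency [Hz]
    w = 2 * np.pi * f
    # |sprung-mass acceleration / road displacement| for base excitation through the suspension.
    h_acc = w**2 * np.sqrt(K**2 + (C * w)**2) / np.sqrt((K - M * w**2)**2 + (C * w)**2)
    n = f / speed_mps                                   # spatial frequency [cycles/m]
    s_road = Gd_n0 * (n / n0) ** -2 / speed_mps         # temporal displacement PSD [m^2/Hz]
    return np.sqrt(np.sum(h_acc**2 * s_road) * (f[1] - f[0]))

for v_kmh in (40, 60, 80):
    for name, gd in (("smooth road (class A)", 16e-6), ("average road (class C)", 256e-6)):
        a_rms = rms_body_acceleration(v_kmh / 3.6, Gd_n0=gd)
        print(f"{v_kmh} km/h, {name}: RMS body acceleration ~ {a_rms:.2f} m/s^2")
```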

Keywords: mathematical modeling, HCV, suspension, ride analysis

Procedia PDF Downloads 255
84 KPI and Tool for the Evaluation of Competency in Warehouse Management for Furniture Business

Authors: Kritchakhris Na-Wattanaprasert

Abstract:

The objective of this research is to design and develop a prototype of a key performance indicator system that is suitable for warehouse management in a case study and its user requirements. In this study, we design a prototype key performance indicator (KPI) system for a warehouse case study in the furniture business, following a methodology whose steps are: identifying the scope of the research and studying related papers, gathering the necessary data and user requirements, developing key performance indicators based on the balanced scorecard, designing the program and database for the key performance indicators, coding the program and setting the database relationships, and finally testing and debugging each module. This study uses the Balanced Scorecard (BSC) for selecting and grouping the key performance indicators. Microsoft SQL Server 2010 is used to create the system database. In regard to the visual programming language, Microsoft Visual C# 2010 is chosen as the graphical user interface development tool. The system consists of six main menus: login, main data, financial perspective, customer perspective, internal process perspective, and learning and growth perspective. Each menu consists of key performance indicator forms. Each form contains a data import section, a data input section, a data search and edit section, and a report section. The system generates outputs in five main reports: the KPI detail report, the KPI summary report, the KPI graph report, the benchmarking summary report and the benchmarking graph report. The user selects the report conditions and the time period. As the system has been developed and tested, it was found to be one way of judging the extent to which warehouse objectives have been achieved. Moreover, it encourages the warehouse function to proceed with more efficiency. The system can be adjusted appropriately in order to be useful for other industries. To increase the usefulness of the key performance indicator system, the recommendations for further development are as follows: the warehouse should periodically review the target values and set more suitable targets as the situation fluctuates in the future; the warehouse should likewise periodically review the key performance indicators and set more suitable indicators as the situation fluctuates, in order to increase competitiveness and take advantage of new opportunities.
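
As a minimal illustration of how one internal-process KPI might be computed and checked against a target, the sketch below uses invented order records and an assumed target; the actual tool computes its indicators from the SQL Server database through the C# interface rather than from in-memory lists:

```python
from dataclasses import dataclass

@dataclass
class OrderRecord:
    on_time: bool
    complete: bool
    undamaged: bool
    documented: bool

def perfect_order_rate(orders):
    """Share of orders meeting all four conditions, in percent."""
    perfect = sum(o.on_time and o.complete and o.undamaged and o.documented for o in orders)
    return 100.0 * perfect / len(orders)

# Illustrative data: 182 perfect orders, 12 incomplete, 6 late.
orders = ([OrderRecord(True, True, True, True)] * 182 +
          [OrderRecord(True, False, True, True)] * 12 +
          [OrderRecord(False, True, True, True)] * 6)

kpi = perfect_order_rate(orders)
target = 95.0   # assumed target value, to be reviewed periodically as the abstract recommends
print(f"perfect order rate: {kpi:.1f}% (target {target}%) -> {'met' if kpi >= target else 'not met'}")
```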

Keywords: key performance indicator, warehouse management, warehouse operation, logistics management

Procedia PDF Downloads 430
83 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, uncertainty in the input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature. Different uncertainty representation approaches result in different outputs, and some of the approaches might result in a better estimation of the system response than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties in them, using the responses (outputs) of the given computational model. We use two different methodologies to approach the problem. In the first methodology we use sampling-based uncertainty propagation with first order error analysis. In the other approach we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC’s subproblem A is developed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) An aleatory uncertainty modeled as a random variable. It has a fixed functional form and known coefficients. This uncertainty cannot be reduced. (ii) An epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval. This uncertainty is reducible. (iii) A parameter might be aleatory, but sufficient data might not be available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box, in which the physical parameter carries an aleatory uncertainty but the parameters prescribing its mathematical model are subject to epistemic uncertainties. Each of the parameters of the random variable is an unknown element of a known interval. This uncertainty is reducible. From the study, it is observed that, due to practical limitations or computational expense, the sampling is not exhaustive in the sampling-based methodology. That is why the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary. This is achieved in this study by using PBO.
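
The small numerical illustration below contrasts the two ideas: bounding the 95th percentile of a toy response by coarse sampling over the epistemic box versus optimizing the percentile directly over that box. The model, intervals and percentile are assumptions, not the NASA challenge problem:

```python
import numpy as np
from scipy.optimize import differential_evolution

def response(a, e):
    """Toy model output with one aleatory input a and one epistemic input e."""
    return a**2 + 3.0 * e * a + e

def percentile_95(mu, e, n=20_000):
    """95th percentile of the output when a ~ Normal(mu, 1); mu (a distribution
    parameter, i.e. a distributional p-box) and e are the interval-valued
    epistemic quantities.  A fixed seed gives common random numbers, so the
    objective is effectively deterministic for the optimizer."""
    a = np.random.default_rng(123).normal(mu, 1.0, n)
    return np.percentile(response(a, e), 95)

MU_BOUNDS, E_BOUNDS = (1.0, 3.0), (-1.0, 1.0)

# Approach 1: non-exhaustive sampling over the epistemic box (may miss the true extremes).
rng = np.random.default_rng(0)
samples = [percentile_95(mu, e)
           for mu, e in zip(rng.uniform(*MU_BOUNDS, 25), rng.uniform(*E_BOUNDS, 25))]
print(f"sampling-based bounds on P95:     [{min(samples):.2f}, {max(samples):.2f}]")

# Approach 2: percentile-based optimization over the same epistemic box.
lo = differential_evolution(lambda x: percentile_95(*x), [MU_BOUNDS, E_BOUNDS], seed=1, maxiter=30)
hi = differential_evolution(lambda x: -percentile_95(*x), [MU_BOUNDS, E_BOUNDS], seed=1, maxiter=30)
print(f"optimization-based bounds on P95: [{lo.fun:.2f}, {-hi.fun:.2f}]")
```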

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 238
82 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings

Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian

Abstract:

Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emission. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaption of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform’s capability for real-time situational awareness and cyber-physical control of the HVAC in the flexible research platforms within the Oak Ridge National Laboratory (ORNL) main campus. Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.
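
A hedged sketch of the web-service layer of such a platform is given below using Flask; the endpoint names, payload fields and the placeholder control rule are illustrative assumptions and not the ORNL platform's actual API or model predictive controller:

```python
from datetime import datetime, timezone
from flask import Flask, jsonify, request

app = Flask(__name__)
latest = {}   # in-memory store standing in for the platform's time-series database

@app.route("/sensors/<zone>", methods=["POST"])
def ingest(zone):
    """Accept an IoT/HVAC sensor reading for a zone, e.g. {"temp_c": 23.4, "occupied": true}."""
    reading = request.get_json(force=True)
    reading["received_at"] = datetime.now(timezone.utc).isoformat()
    latest[zone] = reading
    return jsonify({"status": "stored", "zone": zone})

@app.route("/setpoint/<zone>", methods=["GET"])
def setpoint(zone):
    """Return a cooling set-point for the zone from a placeholder optimisation step."""
    reading = latest.get(zone)
    if reading is None:
        return jsonify({"error": "no data for zone"}), 404
    target = 22.0 if reading.get("occupied") else 26.0   # stand-in for the MPC/optimization output
    error = target - reading.get("temp_c", target)
    return jsonify({"zone": zone, "cooling_setpoint_c": target, "temp_error_c": round(error, 2)})

if __name__ == "__main__":
    app.run(port=8080, debug=True)
```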

Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM

Procedia PDF Downloads 107
81 Micro-Milling Process Development of Advanced Materials

Authors: M. A. Hafiz, P. T. Matevenga

Abstract:

Micro-level machining of metals is a developing field which has been shown to be a promising approach to produce features on parts in the range of a few to a few hundred microns with acceptable machining quality. It is known that the mechanics (i.e. the material removal mechanism) of micro-machining and conventional machining differ significantly due to the scaling effects associated with tool geometry, tool material and workpiece material characteristics. Shape memory alloys (SMAs) are metal alloys which display two exceptional properties, pseudoelasticity and the shape memory effect (SME). Nickel-titanium (NiTi) alloys are one of these unique metal alloys. NiTi alloys are known to be difficult-to-cut materials, specifically with conventional machining techniques, due to their particular properties. Their high ductility, high degree of strain hardening, and unusual stress-strain behaviour are the main properties responsible for their poor machinability in terms of tool wear and workpiece quality. The motivation of this research work was to address the challenges and issues of micro-machining combined with those of machining NiTi alloys, which can affect the desired performance level of the machining outputs. To explore the significance of a range of cutting conditions for surface roughness and tool wear, machining tests were conducted on NiTi. The influence of different cutting conditions and cutting tools on surface and sub-surface deformation in the workpiece was investigated. A design-of-experiments strategy (L9 array) was applied to determine the key process variables. The dominant cutting parameters were determined by analysis of variance. The findings showed that feed rate was the dominant factor for surface roughness, whereas depth of cut was the dominant factor as far as tool wear was concerned. The lowest surface roughness was achieved at a feed rate equal to the cutting edge radius, whereas the lowest flank wear was observed at the lowest depth of cut. Repeated machining trials have yet to be carried out in order to observe tool life, sub-surface deformation and strain-induced hardening, which are also expected to be amongst the critical issues in micro-machining of NiTi. The machining performance using different cutting fluids and strategies has yet to be studied.
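
To show the mechanics of an L9 main-effects analysis of the kind described above (here for the surface-roughness response only), the sketch below uses a standard three-factor, three-level L9 orthogonal array with synthetic Ra values chosen so that feed rate dominates, in line with the stated finding; none of the numbers are the study's measurements:

```python
import pandas as pd

# Standard L9 orthogonal array: three factors at three levels (coded 1-3).
l9 = pd.DataFrame({
    "cutting_speed": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "feed_rate":     [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "depth_of_cut":  [1, 2, 3, 2, 3, 1, 3, 1, 2],
    "Ra_um":         [0.22, 0.35, 0.55, 0.20, 0.38, 0.52, 0.25, 0.36, 0.58],  # synthetic responses
})

effects = {}
for factor in ("cutting_speed", "feed_rate", "depth_of_cut"):
    level_means = l9.groupby(factor)["Ra_um"].mean()
    effects[factor] = level_means.max() - level_means.min()   # main-effect range (delta)
    print(f"{factor}: level means = {level_means.round(3).to_dict()}, delta = {effects[factor]:.3f}")

dominant = max(effects, key=effects.get)
print(f"dominant factor for surface roughness: {dominant}")
```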

Keywords: nickel titanium, micro-machining, surface roughness, machinability

Procedia PDF Downloads 339
80 The Impact of Climate Change on Typical Material Degradation Criteria over Timurid Historical Heritage

Authors: Hamed Hedayatnia, Nathan Van Den Bossche

Abstract:

Understanding the ways in which climate change accelerates or slows down the process of material deterioration is the first step towards assessing adaptive approaches for the conservation of historical heritage. Analysis of climate change effects on degradation risk parameters, such as freeze-thaw cycles and wind erosion, is also essential when considering mitigating actions. Given the vulnerability of cultural heritage to climate change, the impact of this phenomenon on material degradation criteria was studied, with a focus on brick masonry walls of Timurid heritage located in Iran. The Timurids were the final great dynasty to emerge from the Central Asian steppe. Through their patronage, the eastern Islamic lands, especially Mashhad in northeastern Iran and Herat, became a prominent cultural center. Goharshad Mosque is located in Mashhad, in Razavi Khorasan Province, Iran. It was built in 1418 CE by order of Empress Goharshad, the wife of Shah Rukh of the Timurid dynasty. Choosing an appropriate regional climate model was the first step. The outputs of two different climate models, ALARO-0 and REMO, were analyzed to determine which model is better adapted to the area. To validate the quality of the models, model data were compared with observations in four different climate zones in Iran over a period of 30 years. The impacts of the projected climate change were evaluated until 2100. To determine the material specification of Timurid bricks, standard brick samples from a Timurid mosque were studied. Tests determining the water absorption coefficient, diffusion properties, real density, and total porosity were performed to characterize the brick masonry walls, as required for running HAM simulations. Results from the analysis showed that the threatening factors differ between climate zones, but the most significant factors across Iran are extreme temperature increase and erosion. In the north-western region of Iran, one of the key factors is wind erosion. In the north, rainfall erosion and mold growth risk are the key factors. In the north-eastern part, in which our case study is located, the important parameter is wind erosion.
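
As a concrete example of one degradation criterion named above, the sketch below counts freeze-thaw cycles from an hourly surface-temperature series of the kind a regional climate model or HAM simulation would provide. The synthetic temperature series and the 0 °C crossing threshold are assumptions for illustration only.

```python
# Minimal sketch of one degradation indicator from the abstract: counting
# freeze-thaw cycles in an hourly surface-temperature series (e.g. taken from
# ALARO-0/REMO output or a HAM simulation). The series below is synthetic.
import numpy as np

hours = np.arange(24 * 365)
temp_c = (12
          + 15 * np.sin(2 * np.pi * hours / (24 * 365))   # annual cycle
          + 6 * np.sin(2 * np.pi * hours / 24))            # diurnal cycle

frozen = temp_c < 0.0
# A freeze-thaw cycle is counted each time the surface crosses from frozen to thawed.
cycles = int(np.sum(frozen[:-1] & ~frozen[1:]))
print("freeze-thaw cycles per year:", cycles)
```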

Keywords: brick, climate change, degradation criteria, heritage, Timurid period

Procedia PDF Downloads 118
79 Assessing the Effectiveness of Warehousing Facility Management: The Case of Mantrac Ghana Limited

Authors: Kuhorfah Emmanuel Mawuli

Abstract:

Generally, for firms to enhance the operational efficiency of their logistics, it is imperative to assess the logistics function. The cost of logistics conventionally represents a key consideration in the pricing decisions of firms, which suggests that cost efficiency in logistics can go a long way toward improving margins. Warehousing, which is a key part of logistics operations, has the potential to influence operational efficiency in logistics management as well as customer value, but this potential has often not been recognized. There is, however, a paucity of research evaluating the efficiency of warehouses or examining potential barriers to effective warehousing management, so knowledge on how to address these obstacles remains limited. For warehousing management to become profitable, the economic inputs and outputs of the entire warehouse operation must be integrated, balanced, and managed, something that many firms tend to ignore. Management of warehousing is not solely related to storage functions. Instead, effective warehousing management requires such practices as maximum possible mechanization and automation of operations, optimal use of space and capacity of storage facilities, organization through "continuous flow" of goods, a planned system of storage operations, and safety of goods. For example, the utilization of warehouse floor space is an important metric, as it offers a good way to evaluate storage operations and the number of items picked per hour. In the setting of Mantrac Ghana, little knowledge regarding warehouse management exists, and the researcher has personally observed many gaps in the management of warehouse facilities in the case organization. It is important, therefore, to assess the warehouse facility management of the case company with the objective of identifying weaknesses for improvement. The study employs an in-depth qualitative research approach using interviews as the mode of data collection. Respondents in the study mainly comprised warehouse facility managers in the case company. A total of 10 participants were selected for the study using a purposive sampling strategy. Results emanating from the study demonstrate limited warehousing effectiveness in the case company. Findings further reveal that the major barriers to effective warehousing facility management comprise poor layout, poor picking optimization, labour costs, and inaccurate orders; policy implications of the study findings are finally outlined.

Keywords: assessing, warehousing, facility, management

Procedia PDF Downloads 64
78 The Processing of Implicit Stereotypes in Contexts of Reading, Using Eye-Tracking and Self-Paced Reading Tasks

Authors: Magali Mari, Misha Muller

Abstract:

The present study’s objectives were to determine how diverse implicit stereotypes affect the processing of written information and linguistic inferential processes, such as presupposition accommodation. When reading a text, one constructs a representation of the described situation, which is then updated according to new information and to stereotypes inscribed within society. If new information contradicts stereotypical expectations, the representation must be corrected, resulting in longer reading times. A similar process occurs in cases of linguistic inferential processes like presupposition accommodation. Presupposition accommodation is traditionally regarded as fast, automatic processing of background information (e.g., ‘Mary stopped eating meat’ is quickly processed as implying that Mary used to eat meat). However, very few accounts have investigated whether this process is likely to be influenced by domains of social cognition, such as implicit stereotypes. To study the effects of implicit stereotypes on presupposition accommodation, adults were recorded while they read sentences in French, combining two methods: an eye-tracking task and a classic self-paced reading task (where participants read sentence segments at their own pace by pressing a computer key). In one condition, presuppositions were activated with the French definite articles ‘le/la/les,’ whereas in the other condition, the French indefinite articles ‘un/une/des’ were used, triggering no presupposition. Using a definite article presupposes that the object has already been mentioned and is thus part of background information, whereas using an indefinite article is understood as the introduction of new information. Two types of stereotypes were examined in order to enlarge the scope of stereotypes traditionally analyzed. Study 1 investigated gender stereotypes linked to professional occupations to replicate previous findings. Study 2 focused on nationality-related stereotypes (e.g., ‘the French are seducers’ versus ‘the Japanese are seducers’) to determine whether the effects of implicit stereotypes on reading are generalizable to other types of implicit stereotypes. The results show that reading is influenced by both types of implicit stereotypes; in both studies, reading pace slowed when a counter-stereotype was presented. However, presupposition accommodation did not affect participants’ processing of information. Altogether, these results show that (a) implicit stereotypes affect the processing of written information, regardless of the type of stereotype presented, and (b) implicit stereotypes prevail over the superficial linguistic treatment of presuppositions, which suggests faster processing of social information than of linguistic information.

Keywords: eye-tracking, implicit stereotypes, reading, social cognition

Procedia PDF Downloads 196
77 Water Supply and Demand Analysis for Ranchi City under Climate Change Using Water Evaluation and Planning System Model

Authors: Pappu Kumar, Ajai Singh, Anshuman Singh

Abstract:

Different water user sectors, such as rural and urban supply, mining, subsistence and commercial irrigated agriculture, commercial forestry, industry, and power generation, are present in the Subarnarekha River Basin catchment and Ranchi city. There is an inequity issue in access to water. Rural development, the construction of new power generation plants, population growth, currently unmet water demand, the consideration of environmental flows, and the revitalization of small-scale irrigation schemes are going to increase water demands in almost all of the water-stressed catchment. The WEAP model was developed by the Stockholm Environment Institute (SEI) to enable evaluation of planning and management issues associated with water resources development. The WEAP model can be used for both urban and rural areas and can address a wide range of issues, including sectoral demand analyses, water conservation, water rights and allocation priorities, river flow simulation, reservoir operation, ecosystem requirements and project cost-benefit analyses. The model is a tool for integrated water resource management and planning, covering forecasts of water demand, supply, inflows, outflows, water use, reuse, water quality, priority areas and hydropower generation. In the present study, efforts have been made to assess the utility of the WEAP model for water supply and demand analysis for Ranchi city. Detailed work was carried out to ascertain whether the WEAP model could be used to generate different water-requirement scenarios that could support future water planning. Water supplied to Ranchi city is mostly contributed by the study river, the Hatiya reservoir and groundwater. Data were collected from various sources, including PHE Ranchi, the 2011 census, the Doranda reservoir and the meteorology department. The collected and generated data were given as input to the WEAP model. The model generated discharge trends for the study river up to 2050 and, at the same time, generated scenarios calculating demand and supply for the future. The model outputs predict a water requirement of 12 million litres. The results will help in drafting future policies regarding water supply and demand under changing climatic scenarios.
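
To make the scenario logic concrete, the sketch below shows the kind of demand-versus-supply arithmetic that WEAP automates for an urban node: projecting demand from population growth and per-capita use and comparing it with available supply. All numbers are illustrative assumptions, not the study's inputs or its reported projection.

```python
# Back-of-the-envelope sketch of scenario-based urban water demand projection.
# Base population, growth rate, per-capita demand and supply are assumptions.
POP_2011 = 1.07e6          # assumed base population (persons)
GROWTH = 0.022             # assumed annual growth rate
LPCD = 135                 # assumed demand, litres per capita per day
SUPPLY_MLD = 200.0         # assumed available supply, million litres per day

for year in (2020, 2035, 2050):
    pop = POP_2011 * (1 + GROWTH) ** (year - 2011)
    demand_mld = pop * LPCD / 1e6                 # million litres per day
    unmet = max(0.0, demand_mld - SUPPLY_MLD)
    print(f"{year}: demand {demand_mld:6.1f} MLD, unmet {unmet:5.1f} MLD")
```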

Keywords: WEAP model, water demand analysis, Ranchi, scenarios

Procedia PDF Downloads 417
76 The Algerian Experience in Developing Higher Education in the Country in Light of Modern Technology: Challenges and Prospects

Authors: Mohammed Messaoudi

Abstract:

The higher education sector in Algeria has witnessed a remarkable transformation in recent years, with institutions integrating into the modern technological environment and harnessing appropriate mechanisms to raise the levels of education and training. Observers and stakeholders agree that the Algerian university must enter this field, especially given the efforts to employ modern technology in the sector and to encourage investment in it, in addition to the state's keenness to benefit from modern technology and to mobilize energies in a context of many aspirations and challenges, by opening up to the new digital environment and keeping pace with international universities. Higher education is one of the engines of societal development and a vital field for the transfer of knowledge and scientific expertise. The university sits at the top of the comprehensive educational system across disciplines, integrating three basic axes that establish a sound educational process (teaching, research, and efficient, relevant outputs) according to a clear strategy that monitors the advancement of academic work and develops its future directions. The Algerian university is a service institution seeking optimal mechanisms to keep pace with the changes of the times: it has become necessary for the university to enter the technological space, ensure the quality of its education and achieve the required empowerment by dedicating a structure that matches the sector's challenges, amid unremitting efforts to develop capabilities. The sector has sought to harness information and communication technology and to achieve a transformation toward what is called higher education technology. The conceptual framework of information and communication technology in Algerian higher education institutions is determined by organizational factors, institutional factors, professor characteristics, student characteristics and the outcomes of the educational process, with a relentless pursuit of positive interaction between these axes as the basic components on which the success of higher education and the achievement of its goals depend.

Keywords: information and communication technology, Algerian university, scientific and cognitive development, challenges

Procedia PDF Downloads 81
75 Efficiency and Equity in Italian Secondary School

Authors: Giorgia Zotti

Abstract:

This research comprehensively investigates the multifaceted interplay among school performance, individual backgrounds, and regional disparities within the landscape of Italian secondary education. Leveraging data gleaned from the INVALSI 2021-2022 database, the analysis scrutinizes two fundamental distributions of educational achievement: standardized Invalsi test scores and official grades in Italian and Mathematics, focusing specifically on final-year secondary school students in Italy. The study initially employs Data Envelopment Analysis (DEA) to assess school performance. This involves constructing a production function encompassing inputs (hours spent at school) and outputs (Invalsi scores in Italian and Mathematics, along with official grades in Italian and Mathematics). The DEA approach is applied in both of its versions, traditional and conditional; the latter incorporates environmental variables such as school type, size, demographics, technological resources, and socio-economic indicators. Additionally, the analysis delves into regional disparities by leveraging the Theil index, providing insights into disparities within and between regions. Moreover, within the framework of inequality-of-opportunity theory, the study quantifies the inequality of opportunity in students' educational achievements. The methodology applied is the parametric approach in its ex-ante version, considering diverse circumstances such as parental education and occupation, gender, school region, birthplace, and language spoken at home. A Shapley decomposition is then applied to understand how much each circumstance affects the outcomes. The outcomes of this investigation unveil pivotal determinants of school performance, notably the influence of school type (Liceo) and socioeconomic status. The research reveals regional disparities, elucidating instances where specific schools outperform others in official grades relative to Invalsi scores, shedding light on the intricate nature of regional educational inequalities. Furthermore, it highlights a heightened inequality of opportunity within the distribution of Invalsi test scores compared with official grades, underscoring pronounced disparities at the student level. This analysis provides insights for policymakers, educators, and stakeholders, fostering a nuanced understanding of the complexities within Italian secondary education.
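
For readers unfamiliar with the efficiency step, the sketch below solves an output-oriented CCR DEA programme for a handful of hypothetical schools, with hours at school as the input and test scores and grades as outputs, mirroring the production function described above. It uses toy data and omits the conditional (environmental-variable) extension used in the study.

```python
# Hedged sketch of output-oriented CCR DEA, solved as a linear programme per
# school (DMU). Toy values only; not the study's data or its conditional DEA.
import numpy as np
from scipy.optimize import linprog

X = np.array([[30.0], [32.0], [28.0], [31.0]])                      # input: weekly hours, 4 schools
Y = np.array([[62, 6.4], [70, 7.1], [55, 6.0], [74, 6.8]], float)   # outputs: Invalsi score, grade

n, m, s = X.shape[0], X.shape[1], Y.shape[1]
for o in range(n):
    c = np.zeros(1 + n); c[0] = -1.0                      # maximise phi (minimise -phi)
    A_in = np.hstack([np.zeros((m, 1)), X.T])             # sum_j lam_j * x_ij <= x_io
    A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])        # phi*y_ro - sum_j lam_j * y_rj <= 0
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([X[o], np.zeros(s)]), method="highs")
    print(f"school {o}: phi = {-res.fun:.3f}  (phi == 1 means efficient)")
```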

Keywords: inequality, education, efficiency, DEA approach

Procedia PDF Downloads 75
74 The Grand Egyptian Museum as a Cultural Interface

Authors: Mahmoud Moawad Mohamed Osman

Abstract:

The Egyptian civilization was and still is an inspiration for many human civilizations and modern sciences, and for this reason there remains a passion for ancient Egypt. Owing to the breadth and abundance of the outputs of the ancient Egyptian civilization, many museums have been established to display and demonstrate its splendor, among them the Grand Egyptian Museum (Egypt's gift to the whole world). The idea of establishing the Grand Egyptian Museum emerged in the 1990s, and in 2002 the foundation stone was laid for the museum project, built in a privileged location overlooking the eternal pyramids of Giza. The Egyptian state then announced, under the auspices of the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the International Union of Architects, an international architectural competition for the best design for the museum. The winning design, submitted by Heneghan Peng Architects of Ireland, was based on rays of the sun extending from the tops of the three pyramids and meeting to form a conical mass: the Grand Egyptian Museum. Construction began in May 2005, when the site was paved and prepared; in 2006, the largest antiquities restoration center in the Middle East was established, dedicated to the restoration, preservation, maintenance and rehabilitation of the antiquities scheduled for display in the museum halls, and it opened in 2010. The museum building, which covers more than 300,000 square meters, was completed in 2021 and includes a number of exhibition halls, each larger than many existing museums in Egypt and around the world. The museum is considered one of the most important and greatest achievements of modern Egypt. It was created to be an integrated global civilizational, cultural and entertainment edifice and the first destination for everyone interested in ancient Egyptian heritage, as the largest museum in the world telling the story of ancient Egyptian civilization. It contains a large number of distinctive and unique artifacts, including the treasures of the golden king Tutankhamun, displayed in their entirety for the first time since the discovery of his tomb in November 1922, in addition to the collection of Queen Hetepheres, the mother of King Khufu, builder of the Great Pyramid at Giza, as well as the Museum of King Khufu's Boats and various archaeological collections from the pre-dynastic era through the Greek and Roman eras.

Keywords: Grand Egyptian Museum, Egyptian civilization, education, museology

Procedia PDF Downloads 44
73 Exploration of Building Information Modelling Software to Develop Modular Coordination Design Tool for Architects

Authors: Muhammad Khairi bin Sulaiman

Abstract:

The utilization of Building Information Modelling (BIM) in the construction industry has given designers in the Architecture, Engineering and Construction (AEC) industry the opportunity to move from conventional manual drafting to a way of working that creates alternative designs quickly and produces more accurate, reliable and consistent outputs. Using BIM software, designers can create digital content that manipulates data through BIM's parametric model. With BIM software, more alternative designs can be created quickly and design problems can be explored further to produce a better design faster than with conventional design methods. Generally, BIM is used as a documentation mechanism, and its capabilities as a design tool have not been fully explored and utilised. Modular Coordination (MC) design is encouraged as a sustainable design practice, since it reduces material wastage through standard dimensioning, pre-fabrication, and repetitive, modular construction and components. However, MC design involves a complex set of rules and dimensions, so a tool is needed to make the process easier. Since BIM parameters can easily be manipulated to follow MC rules and dimensioning, the integration of BIM software with MC design is proposed for architects during the design stage. With such a tool, the acceptance and effective application of MC design should improve. Consequently, this study will analyse and explore the function and customization of BIM objects and the capability of BIM software to expedite the application of MC design during the design stage for architects. With this application, architects will be able to create building models and locate objects within reference modular grids that adhere to MC rules and dimensions. The parametric modeling capabilities of BIM will also act as a visual tool that further enhances the automation of the three-dimensional space planning process. (Method) The study first analyses and explores the parametric modeling capabilities of rule-based BIM objects, which are then used to customize a reference grid within the rules and dimensioning of MC. Eventually, the approach will further enhance the architect's overall design process and enable architects to automate complex modeling that was nearly impossible before. A prototype based on a residential quarter will be modeled, and a set of reference grids guided by specific MC rules and dimensions will be used to develop a variety of space planning configurations. In use, the tool will expedite the design process and encourage the adoption of MC design in the construction industry.
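
A minimal sketch of the rule such a tool would enforce is shown below: snapping design dimensions to a modular reference grid. The 100 mm basic module and the 3M planning module reflect common modular-coordination practice and are assumptions here; the room dimension is a placeholder and no actual BIM API is invoked.

```python
# Minimal sketch of modular-coordination grid logic: snapping a dimension to
# the nearest multiple of a module and generating reference grid positions.
# Module sizes are assumptions from common MC practice, not the study's tool.
M = 100           # basic module, mm
PLANNING = 3 * M  # horizontal planning module, mm

def snap(length_mm: float, module: int = PLANNING) -> int:
    """Round a dimension to the nearest multiple of the chosen module."""
    return int(round(length_mm / module)) * module

def grid_lines(span_mm: int, module: int = PLANNING) -> list[int]:
    """Reference grid positions across a snapped span."""
    return list(range(0, span_mm + 1, module))

room = snap(3460)                     # placeholder dimension -> 3600 mm, a modular span
print("snapped span:", room, "mm")
print("grid lines:", grid_lines(room))
```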

Keywords: building information modeling, modular coordination, space planning, customization, BIM application, MC space planning

Procedia PDF Downloads 83