Search results for: voyage related operational energy Efficiency measures
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24448

2008 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms

Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson

Abstract:

This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total-field magnetometer arrays. Our research has focused on the development of a vertically integrated suite of platforms, all utilizing common data acquisition, data processing, and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed from either a hydrodynamic bottom-following wing towed from a surface vessel or a towed floating platform for shallow-water settings. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system providing immediate access to data and metadata for remote processing, analysis, and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment. 
Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.
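As a hedged illustration of the dipole modeling mentioned above, the sketch below computes the total-field anomaly of a point magnetic dipole, the forward model that underlies dipole-based classification; the function names and geometry are illustrative, not taken from the paper.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability, T*m/A

def dipole_field(r_obs, r_dip, m):
    """Magnetic flux density (tesla) at r_obs due to a point dipole of moment m (A*m^2) at r_dip."""
    r = np.asarray(r_obs, dtype=float) - np.asarray(r_dip, dtype=float)
    d = np.linalg.norm(r)
    rhat = r / d
    return MU0 / (4.0 * np.pi * d ** 3) * (3.0 * np.dot(m, rhat) * rhat - m)

def total_field_anomaly(r_obs, r_dip, m, b_dir):
    """Anomaly seen by a total-field magnetometer: projection of the
    dipole field onto the (unit) ambient-field direction b_dir."""
    return float(np.dot(dipole_field(r_obs, r_dip, m), b_dir))
```

The characteristic 1/r³ fall-off of this model is what makes stable deployment close to expected targets so important.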

Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection

Procedia PDF Downloads 453
2007 Brand Equity Tourism Destinations: An Application in Wine Regions Comparing Visitors' and Managers' Perspectives

Authors: M. Gomez, A. Molina

Abstract:

The concept of brand equity in the wine tourism area is an interesting topic for exploring the factors that determine it. The aim of this study is to address this gap by investigating wine tourism destination brand equity and understanding the impact that the denomination of origin (DO) brand image and the destination image have on brand equity. Managing and monitoring the branding of wine tourism destinations is crucial to attract tourist arrivals. The multiplicity of stakeholders involved in the branding process calls for research that, unlike previous studies, adopts a broader perspective and incorporates both an internal and an external perspective. Therefore, this gap has been addressed by comparing managers' and visitors' approaches to wine tourism destination brand equity. A survey questionnaire was used for data collection. The hypotheses were tested using winery managers and winery visitors, each holding a different position relative to the wine tourism destination brand equity. All the interviews were conducted face-to-face. The survey instrument included several scales related to DO brand image, destination image, and wine tourism destination brand equity. All items were measured on seven-point Likert scales. Partial least squares was used to analyze the accuracy of the scales and the structural model, and multi-group analysis was used to identify differences in the path coefficients and to test the hypotheses. The results show that the positive influence of DO brand image on wine tourism destination brand equity is stronger for winery managers than for visitors, although the differences between the two groups are not significant. However, there are significant differences in the positive effect of destination brand image on both wine tourism destination brand equity and DO brand image. The results of this study are important for consultants, practitioners, and policy makers. 
The gap between managers and visitors calls for campaigns to enhance the image that visitors hold and, thus, increase tourist arrivals. Events such as wine gatherings and gastronomic symposiums held at universities and culinary schools, as well as participation in business meetings, can enhance these perceptions and, in turn, the added value (brand equity) of the wine tourism destinations. The images of destinations and DOs can help strengthen the brand equity of the wine tourism destinations, especially for visitors. Thus, the development and reinforcement of favorable, strong, and unique destination associations and DO associations are important to increase that value. Joint campaigns are advisable to enhance the images of destinations and DOs and, as a consequence, the value of the wine tourism destination brand.

Keywords: brand equity, managers, visitors, wine tourism

Procedia PDF Downloads 120
2006 The Effects of Addition of Chloride Ions on the Properties of ZnO Nanostructures Grown by Electrochemical Deposition

Authors: L. Mentar, O. Baka, A. Azizi

Abstract:

Zinc oxide, a wide-band-gap semiconductor material, especially in nanostructured form, has potential large-area applications in electronics, sensors, photovoltaic cells, photonics, optical devices, and optoelectronics, due to its unique electrical, optical, and surface properties. The feasibility of ZnO for these applications stems from the successful synthesis of diverse ZnO nanostructures, including nanorings, nanobows, nanohelixes, nanosprings, nanobelts, nanotubes, nanopropellers, nanodisks, and nanocombs, by different methods. Among the various synthesis methods, electrochemical deposition represents a simple and inexpensive solution-based route to semiconductor nanostructures. In this study, the electrodeposition method was used to produce zinc oxide (ZnO) nanostructures from a chloride bath on fluorine-doped tin oxide (FTO)-coated conducting glass substrates serving as the transparent conducting oxide (TCO). We present a systematic study of the effects of chloride anion concentration on the properties of ZnO. The influence of KCl concentration on the electrodeposition process and on the morphological, structural, and optical properties of ZnO nanostructures was examined. The electrochemical deposition of ZnO nanostructures was investigated using conventional electrochemical measurements (cyclic voltammetry and Mott-Schottky), scanning electron microscopy (SEM), and X-ray diffraction (XRD). The electrodeposition potentials of ZnO were determined using cyclic voltammetry. From the Mott-Schottky measurements, the flat-band potential and the donor density of the ZnO nanostructures were determined. SEM images show that the size and morphology of the nanostructures depend greatly on the KCl concentration. The morphology of the ZnO nanostructures is determined by the combined action of [Zn(NO3)2] and [Cl-]. Very neat hexagonal grains are observed for the nanostructures deposited at 0.1 M KCl. 
XRD studies revealed that all deposited films were polycrystalline, with the wurtzite phase. The electrodeposited thin films were found to be preferentially oriented along the (002) plane of the wurtzite structure of ZnO, with the c-axis normal to the substrate surface, for samples at different KCl concentrations. UV-visible spectra showed significant optical transmission (~80%), which decreased at low Cl- concentrations. The energy band gap values were estimated to be between 3.52 and 3.80 eV.
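Band-gap estimates of this kind are commonly obtained from a Tauc plot of UV-visible data. The sketch below is a minimal, assumption-laden version for a direct-gap material such as ZnO; the `tauc_band_gap` helper and its crude choice of linear region are my own, not the authors'.

```python
import numpy as np

def tauc_band_gap(h_nu, alpha):
    """Estimate a direct band gap (eV): fit the upper, linear part of
    (alpha*h_nu)^2 vs photon energy h_nu and extrapolate to zero absorption."""
    y = (alpha * h_nu) ** 2
    mask = y > 0.5 * y.max()           # crude pick of the linear absorption-edge region
    slope, intercept = np.polyfit(h_nu[mask], y[mask], 1)
    return -intercept / slope          # h_nu where the fitted line crosses y = 0
```

Real data would need a more careful choice of the linear region than the fixed 50% cutoff used here.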

Keywords: electrodeposition, ZnO, chloride ions, Mott-Schottky, SEM, XRD

Procedia PDF Downloads 277
2005 Examining the Effects of Exercise and Healthy Diet on Certain Blood Parameter Levels, Oxidative Stress and Anthropometric Measurements in Slightly Overweight Women

Authors: Nezihe Şengün, Ragip Pala

Abstract:

To prevent overweight and obesity, individuals need to consume food and beverages according to their nutritional needs, engage in regular exercise, and regularly monitor their body weight. This study aimed to examine the effects of exercise, diet, or a combined intervention on changes in blood lipid parameters (total cholesterol, LDL cholesterol, HDL cholesterol, and triglycerides) and the level of malondialdehyde (MDA), a marker of oxidative stress, in parallel with the increase in body weight due to poor nutrition and a sedentary lifestyle. The study included a total of 48 female students aged 18-28 years with a BMI between 25.0 and 29.9 kg/m². They were divided into four groups: control (C), exercise (Ex), diet (D), and exercise+diet (Ex+D). Those in the exercise groups performed aerobic exercise at 60-70% intensity (10 minutes warm-up, 30 minutes running, 10 minutes cool-down), while those in the diet groups were provided with a diet program based on energy needs calculated from basal metabolic rate, physical activity level, age, and BMI. The students' body weight, body fat mass, body mass index (BMI), and waist-hip ratios were measured at the beginning (day 1) and end (day 60) of the 8-week intervention period. Their total cholesterol, HDL cholesterol, LDL cholesterol, triglyceride, and MDA levels were evaluated and analyzed, with statistical significance set at p<0.05. As a result, female students in the Ex+D group showed the largest changes in body weight, body fat mass, BMI, and waist-hip ratio, and these differences were statistically significant. Except for the C group, all groups experienced a decrease in total cholesterol, LDL cholesterol, and triglyceride levels and an increase in HDL cholesterol levels. 
The decrease in total cholesterol, LDL cholesterol, and triglyceride levels was statistically significant for those in the D group, and the increase in HDL cholesterol level was statistically significant for those in the Ex+D group (p<0.05). A decrease in MDA level was found in all groups except those in the C group, and this decrease was significantly higher in the Ex group. In conclusion, our study revealed that the most effective way to achieve weight loss is through a combination of exercise and diet. The application of Ex+D is considered to balance blood lipid levels and suppress oxidative stress.
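The recruitment criterion above (BMI 25.0-29.9 kg/m²) is the standard overweight band; a trivial sketch of the screening arithmetic, with thresholds assumed rather than quoted from the paper:

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def is_slightly_overweight(weight_kg, height_m):
    """True when BMI falls in the assumed 25.0-29.9 kg/m^2 recruitment window."""
    return 25.0 <= bmi(weight_kg, height_m) <= 29.9
```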

Keywords: obesity, exercise, diet, body mass index, blood lipids

Procedia PDF Downloads 65
2004 Structuring Highly Iterative Product Development Projects by Using Agile-Indicators

Authors: Guenther Schuh, Michael Riesener, Frederic Diels

Abstract:

Nowadays, manufacturing companies are faced with the challenge of meeting heterogeneous customer requirements in short product life cycles with a variety of product functions. Often, some of the functional requirements remain unknown until late stages of product development. A way to handle these uncertainties is the highly iterative product development (HIP) approach. By structuring the development project as a highly iterative process, this method provides customer-oriented and marketable products. First approaches exist for combined, hybrid models comprising deterministic-normative methods, like the Stage-Gate process, and empirical-adaptive development methods, like Scrum, at the project management level. However, the question of which development scopes can preferably be realized with empirical-adaptive or with deterministic-normative approaches remains almost unconsidered. In this context, a development scope constitutes a self-contained section of the overall development objective. Therefore, this paper focuses on a methodology that deals with the uncertainty of requirements within the early development stages and the corresponding selection of the most appropriate development approach. For this purpose, internal influencing factors, like a company's technology ability, the prototype manufacturability, and the potential solution space, as well as external factors, like market accuracy, relevance, and volatility, are analyzed and combined into an Agile-Indicator. The Agile-Indicator is derived in three steps. First, each internal and external factor is rated in terms of its importance for the overall development task. Second, each requirement is evaluated against every internal and external factor with respect to its suitability for empirical-adaptive development. Finally, the totals for the internal and external sides are combined into the Agile-Indicator. 
Thus, the Agile-Indicator constitutes a company-specific and application-related criterion by which development scopes can be allocated to empirical-adaptive or deterministic-normative approaches. In a last step, this indicator is used for a specific clustering of development scopes by applying the fuzzy c-means (FCM) clustering algorithm. The FCM method determines sub-clusters within functional clusters based on the empirical-adaptive environmental impact of the Agile-Indicator. By means of the methodology presented in this paper, it is possible to classify requirements whose market uptake is uncertain into empirical-adaptive or deterministic-normative development scopes.
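The fuzzy c-means step described above can be sketched in plain NumPy; this is a generic FCM implementation under the usual update rules (fuzzifier m = 2), with the one-dimensional "Agile-Indicator scores" input purely illustrative, not the authors' data:

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=100, seed=0):
    """Cluster 1-D scores x into c fuzzy clusters; returns centers and the
    membership matrix u (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 per point
    for _ in range(iters):
        w = u ** m                                 # fuzzified weights
        centers = (w * x[:, None]).sum(axis=0) / w.sum(axis=0)
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (d ** (2.0 / (m - 1.0)))         # standard FCM membership update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u
```

On Agile-Indicator scores, the two resulting centers would separate predominantly empirical-adaptive from predominantly deterministic-normative scopes.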

Keywords: agile, highly iterative development, agile-indicator, product development

Procedia PDF Downloads 229
2003 Safety Climate Assessment and Its Impact on the Productivity of Construction Enterprises

Authors: Krzysztof J. Czarnocki, F. Silveira, E. Czarnocka, K. Szaniawska

Abstract:

Research background: Problems related to occupational health and declining levels of safety occur commonly in the construction industry. An important factor in occupational safety in the construction industry is scaffold use. All scaffolds used in construction, renovation, and demolition shall be erected, dismantled, and maintained in accordance with safety procedures. Increasing demand for new construction projects is unfortunately still linked to a high level of occupational accidents. Therefore, it is crucial to implement concrete actions when dealing with scaffolds and risk assessment in the construction industry; how the assessment is done, and its reliability, is critical for both construction workers and the regulatory framework. Unfortunately, professionals, who tend to rely heavily on their own experience and knowledge when taking decisions regarding risk assessment, may lack reliability in checking the results of the decisions taken. Purpose of the article: The aim was to indicate crucial parameters that could be modeled with a Risk Assessment Model (RAM) to improve construction enterprise productivity and/or development potential and safety climate. The developed RAM could be of benefit for predicting high-risk construction activities and thus preventing accidents, based on a set of historical accident data. Methodology/Methods: A RAM has been developed for assessing risk levels at various construction process stages, with various work trades impacting different spheres of enterprise activity. This project includes research carried out by teams of researchers on over 60 construction sites in Poland and Portugal, under which over 450 individual research cycles were carried out. The conducted research trials included variable conditions of employee exposure to harmful physical and chemical factors, variable levels of employee stress, and differences in staff behaviors and habits. 
A genetic modeling tool was used for developing the RAM. Findings and value added: Common types of trades, accidents, and accident causes have been explored, in addition to suitable risk assessment methods and criteria. We found that the initial worker stress level is a more direct predictor of the unsafe chain of events leading to an accident than the workload, the concentration of harmful factors at the workplace, or even training frequency and management involvement.

Keywords: safety climate, occupational health, civil engineering, productivity

Procedia PDF Downloads 294
2002 Biotechnology Approach: A Tool of Enhancement of Sticky Mucilage of Pulicaria Incisa (Medicinal Plant) for Wounds Treatment

Authors: Djamila Chabane, Asma Rouane, Karim Arab

Abstract:

Depending on the chemical substances responsible for the pharmacological effects, a future therapeutic drug might be produced by extraction from whole plants or from callus initiated from some of their parts. Optimized callus culture protocols now offer the possibility of using cell culture techniques for vegetative propagation and open the way for further studies on secondary metabolites and drug development. In Algerian traditional medicine, Pulicaria incisa (Asteraceae) is used in the treatment of everyday ailments (stomachache, headache, cold, sore throat, and rheumatic arthralgia). Field findings revealed that many healers use fresh parts (leaves, flowers) of this plant to treat skin wounds. This study aims to evaluate the healing efficiency of an artisanal cream prepared from sticky mucilage isolated from calluses on dermal wounds of animal models. Callus cultures were initiated from reproductive explants (young inflorescences) excised from adult plants, transferred to an MS basal medium supplemented with growth regulators, and maintained in the dark for months. Many callus types were obtained, with various colors and aspects (friable, compact). Several subcultures of calli were performed to enhance mucilage accumulation. After extraction, the mucilage extracts were tested on animal models as follows. The wound healing potential was studied by causing dermal wounds (1 cm diameter) on the dorsolumbar part of Rattus norvegicus; different samples of the cream were applied after hair removal on three rats each, including two controls (one treated with Vaseline and one without any treatment) and two experimental groups (experimental group 1, treated with the reference ointment Madecassol®, and experimental group 2, treated with the callus mucilage cream) for a period of seventeen days. The evolution of the healing activity was estimated by calculating the percentage reduction of the area of wounds treated with all tested compounds compared to the controls, using AutoCAD software. 
The healing percentage of the cream prepared from callus mucilage was 99.79%, compared to 99.76% for Madecassol®. Regarding treatment time, significant healing activity was observed after 17 days, comparable to that of the reference pharmaceutical product, without any wound infection. The healing effect of Madecassol® is effective because it stimulates and regulates the production of collagen, a fibrous matrix essential for wound healing. The mucilage extracts also showed a high capacity to heal the skin without any infection. Based on this pharmacological activity, we suggest using calluses produced by in vitro culture to produce new compounds for skin care and treatment.
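The wound-contraction figures above follow from the standard percentage-reduction formula applied to the measured wound areas; a minimal sketch (the formula is the conventional one, not quoted from the paper):

```python
def healing_percentage(area_day0, area_day_n):
    """Percentage reduction of wound area relative to day 0."""
    return 100.0 * (area_day0 - area_day_n) / area_day0
```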

Keywords: calluses, Pulicaria incisa, mucilage, wounds

Procedia PDF Downloads 112
2001 A Preliminary Randomized Controlled Trial of Pure L-Ascorbic Acid with Using a Needle-Free and Micro-Needle Mesotherapy in Treatment of Anti-Aging Procedure

Authors: M. Zasada, A. Markiewicz, A. Erkiert-Polguj, E. Budzisz

Abstract:

The epidermis is a keratinized stratified squamous epithelium covered by a hydro-lipid barrier. Therefore, active substances should be able to penetrate through this hydro-lipid coating. L-ascorbic acid is a vitamin that plays an important role in stimulating fibroblasts to produce type I collagen and in lightening hyperpigmentation. Vitamin C is a water-soluble antioxidant which protects skin from oxidative damage and rejuvenates photoaged skin. No-needle mesotherapy is a non-invasive rejuvenation technique relying on electric pulses, electroporation, and ultrasound. These physical factors result in deeper penetration of cosmetics. It is important to increase the penetration of L-ascorbic acid, thereby increasing the spectrum of its activity. The aim of the work was to assess the effectiveness of pure L-ascorbic acid in anti-aging therapy using needle-free and micro-needling mesotherapy. The study was performed on a group of 35 healthy volunteers, in accordance with the Declaration of Helsinki of 1964 and with agreement of the Ethics Commission no. RNN/281/16/KE 2017. Women were randomized to a mesotherapy or a control group. The control group applied topically 2.5 ml of serum containing 20% L-ascorbic acid with hydrate from strawberries, every 10 days for a period of 9 weeks. In the mesotherapy group, no-needle mesotherapy was performed on the left half of the face and micro-needling on the right, with the same serum. The pH of the serum was 3.5-4, and the serum was prepared directly prior to the facial treatment. The skin parameters were measured at the beginning and before each treatment. Measurements of the forehead skin were done using a Cutometer® (skin elasticity and firmness), a Corneometer® (skin hydration), and a Mexameter® (skin tone). Photographs were also taken with the Fotomedicus system. Additionally, the volunteers filled in a questionnaire. 
The serum was tested for microbiological purity and for stability after opening of the cosmetic. During the study, all of the volunteers were under the care of a dermatologist. The regular application of the serum improved the skin parameters. Improvements in hydration and elasticity were seen after 4 and 8 weeks, respectively (Corneometer® and Cutometer® results). Moreover, the number of hyperpigmented spots decreased (Mexameter®). After 8 weeks, the volunteers reported that the tested product had smoothing and moisturizing features. Subjective opinions indicated significant improvement in skin color and elasticity. The product containing L-ascorbic acid used with intercellular penetration promoters demonstrates higher anti-aging efficiency than the control. In vivo studies confirmed the effectiveness of the serum and the impact of the active substance on skin firmness and elasticity, the degree of hydration, and skin tone. Mesotherapy with pure L-ascorbic acid provides better diffusion of active substances through the skin.

Keywords: anti-aging, l-ascorbic acid, mesotherapy, promoters

Procedia PDF Downloads 253
2000 Synthesis of Porphyrin-Functionalized Beads for Flow Cytometry

Authors: William E. Bauta, Jennifer Rebeles, Reggie Jacob

Abstract:

Porphyrins are noteworthy in biomedical science for their cancer tissue accumulation and photophysical properties. The preferential accumulation of some porphyrins in cancerous tissue has been known for many years. This, combined with their characteristic photophysical and photochemical properties, including their strong fluorescence and their ability to generate reactive oxygen species in vivo upon laser irradiation, has led to much research into the application of porphyrins as cancer diagnostic and therapeutic agents. Porphyrins have been used as dyes to detect cancer cells both in vivo and, less commonly, in vitro. In one example, human sputum samples from lung cancer patients and patients without the disease were dissociated and stained with the porphyrin TCPP (5,10,15,20-tetrakis-(4-carboxyphenyl)-porphine). Cells were analyzed by flow cytometry. Cancer samples were identified by their higher TCPP fluorescence intensity relative to the no-cancer controls. However, quantitative analysis of fluorescence in cell suspensions stained with multiple fluorophores requires particles stained with each of the individual fluorophores as controls. Fluorescent control particles must be compatible in size with flow cytometer fluidics and have favorable hydrodynamic properties in suspension. They must also display fluorescence comparable to the cells of interest and be stable upon storage. Amine-functionalized spherical polystyrene beads in the 5- to 20-micron diameter range were reacted with TCPP and EDC in aqueous pH 6 buffer overnight to form amide bonds. Beads were isolated by centrifugation and tested by flow cytometry. The 10-micron amine-functionalized beads displayed the best combination of fluorescence intensity and hydrodynamic properties, such as lack of clumping and remaining in suspension during the experiment. These beads were further optimized by varying the stoichiometry of EDC and TCPP relative to the amine. 
The reaction was accompanied by the formation of a TCPP-related particulate, which was removed, after bead centrifugation, using a microfiltration process. The resultant TCPP-functionalized beads were compatible with flow cytometry conditions and displayed a fluorescence comparable to that of stained cells, which allowed their use as fluorescence standards. The beads were stable in refrigerated storage in the dark for more than eight months. This work demonstrates the first preparation of porphyrin-functionalized flow cytometry control beads.

Keywords: tetraaryl porphyrin, polystyrene beads, flow cytometry, peptide coupling

Procedia PDF Downloads 77
1999 Full Mini Nutritional Assessment Questionnaire and the Risk of Malnutrition and Mortality in Elderly, Hospitalized Patients: A Cross-Sectional Study

Authors: Christos E. Lampropoulos, Maria Konsta, Tamta Sirbilatze, Ifigenia Apostolou, Vicky Dradaki, Konstantina Panouria, Irini Dri, Christina Kordali, Vaggelis Lambas, Georgios Mavras

Abstract:

Objectives: The full Mini Nutritional Assessment (MNA) questionnaire is one of the most useful tools in the diagnosis of malnutrition in hospitalized patients, which is related to increased morbidity and mortality. The purpose of our study was to assess the nutritional status of elderly, hospitalized patients and examine the hypothesis that MNA may predict mortality and extension of hospitalization. Methods: One hundred fifty patients (78 men, 72 women, mean age 80±8.2) were included in this cross-sectional study. The following data were taken into account in the analysis: anthropometric and laboratory data, physical activity (International Physical Activity Questionnaire, IPAQ), smoking status, dietary habits, cause and duration of current admission, and medical history (co-morbidities, previous admissions). Primary endpoints were mortality (from admission until 6 months afterwards) and duration of admission. The latter was compared to national guidelines for closed consolidated medical expenses. Logistic regression and linear regression analyses were performed to identify independent predictors of mortality and extended hospitalization, respectively. Results: According to MNA, nutrition was normal in 54/150 (36%) of patients, 46/150 (30.7%) were at risk of malnutrition, and the remaining 50/150 (33.3%) were malnourished. After performing multivariate logistic regression analysis, we found that the odds of death decreased by 20% per each unit increase of full MNA score (OR=0.8, 95% CI 0.74-0.89, p < 0.0001). Patients admitted due to cancer were 23 times more likely to die than those with infection (OR=23, 95% CI 3.8-141.6, p=0.001). Similarly, patients admitted due to stroke were 7 times more likely to die (OR=7, 95% CI 1.4-34.5, p=0.02), while those with all other causes of admission were less likely (OR=0.2, 95% CI 0.06-0.8, p=0.03), compared to patients with infection. 
According to multivariate linear regression analysis, each one-unit increase of full MNA score decreased the admission duration by 0.3 days on average (b: -0.3, 95% CI -0.45 to -0.15, p < 0.0001). Patients admitted due to cancer had on average a 6.8-day longer hospitalization than those admitted for infection (b: 6.8, 95% CI 3.2-10.3, p < 0.0001). Conclusion: Mortality and extension of hospitalization are significantly increased in elderly, malnourished patients. The full MNA score is a useful diagnostic tool for malnutrition.
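The reported odds ratio of 0.8 per MNA unit maps to the underlying logistic coefficient via OR = exp(beta); a back-of-envelope sketch, assuming only that standard relationship (not the study's actual fitted model):

```python
import math

def odds_ratio(beta, delta=1.0):
    """Multiplicative change in the odds of the outcome for a `delta`-unit
    increase in a logistic-regression predictor with coefficient beta."""
    return math.exp(beta * delta)
```

For example, an OR of 0.8 per unit implies that a 5-point rise in MNA multiplies the odds of death by 0.8⁵, roughly a two-thirds reduction.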

Keywords: duration of admission, malnutrition, mini nutritional assessment score, prognostic factors for mortality

Procedia PDF Downloads 301
1998 “I” on the Web: Social Penetration Theory Revised

Authors: Dionysis Panos, Department of Communication and Internet Studies, Cyprus University of Technology

Abstract:

The widespread use of new media, particularly social media, through fixed or mobile devices, has changed in a staggering way our perception of what is "intimate" and "safe" in interpersonal communication and social relationships. The distribution of self- and identity-related information in communication now evolves under new and different conditions and contexts. Consequently, this new framework forces us to rethink processes and mechanisms, such as what "exposure" means in interpersonal communication contexts, how the distinction between the "private" and the "public" nature of information is negotiated online, and how the "audiences" we interact with are understood and constructed. Drawing from an interdisciplinary perspective that combines sociology, communication psychology, media theory, new media, and social networks research, as well as from the empirical findings of a longitudinal comparative study, this work proposes an integrative model for comprehending mechanisms of personal information management in interpersonal communication, applicable to both online (computer-mediated) and offline (face-to-face) communication. The presentation is based on conclusions drawn from a longitudinal qualitative research study with 458 new media users from 24 countries, conducted over almost a decade. The main conclusions include: (1) There is a clear, evidenced shift in users' perception of the degree of "security" and "familiarity" of the Web between the pre- and post-Web 2.0 eras. The role of social media in this shift was catalytic. (2) Basic Web 2.0 applications changed dramatically the nature of the Internet itself, transforming it from a place reserved for "elite users / technical knowledge keepers" into a place of "open sociability" for anyone. (3) Web 2.0 and social media brought about a significant change in the concept of the "audience" we address in interpersonal communication. 
The previous "general and unknown audience" of personal home pages was converted into an "individual and personal" audience chosen by the user under various criteria. (4) The way we negotiate the "private" and "public" nature of personal information has changed in a fundamental way. (5) The distinctive features of the mediated environment of online communication, and the critical changes that have occurred since the advent of Web 2.0, lead to the need to reconsider and update the theoretical models and analysis tools we use in our effort to comprehend the mechanisms of interpersonal communication and personal information management. Therefore, a new model for understanding the way interpersonal communication evolves is proposed here, based on a revision of social penetration theory.

Keywords: new media, interpersonal communication, social penetration theory, communication exposure, private information, public information

Procedia PDF Downloads 350
1997 A Crowdsourced Homeless Data Collection System And Its Econometric Analysis: Strengthening Inclusive Public Administration Policies

Authors: Praniil Nagaraj

Abstract:

This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyzing the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. The 2022 Annual Homeless Assessment Report (AHAR) to Congress highlighted alarming statistics, emphasizing the need for effective decision-making and budget allocation within local planning bodies known as Continuums of Care (CoC). This paper's contributions fall into three main areas. Firstly, a unique method for collecting homeless data is introduced, utilizing a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, is anonymized and securely transmitted to the cloud. It is anticipated that an increasing number of users motivated to contribute to society will adopt the app, thus expanding the data collection effort. Duplicate data is addressed through simple classification methods, and historical data is utilized to fill in missing information. The second contribution of this paper is the description of the data analysis techniques applied to the collected data. By combining the new data with existing information, statistical regression analysis is employed to gain insights into various aspects, such as distinguishing between unsheltered and sheltered homeless populations, as well as examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data is collected in San Francisco, while pre-existing information is drawn from three cities: San Francisco, New York City, and Washington, D.C., facilitating simulations. The third contribution focuses on demonstrating the practical implications of the data processing results.
The challenges faced by key stakeholders, including charitable organizations and local city governments, are taken into consideration. Two case studies are presented as examples. The first case study explores improving the efficiency of food and necessities distribution, as well as medical assistance, driven by charitable organizations. The second case study examines the correlation between micro-geographic budget expenditure by local city governments and homeless information to justify budget allocation and expenditures. The ultimate objective of this endeavor is to enable the continuous enhancement of the quality of life for the underprivileged. It is hoped that through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all.
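As a rough illustration of the regression step described above, the sketch below fits an ordinary-least-squares model of an unsheltered count against an unemployment rate and a housing affordability index. All variable names, coefficients, and data are invented for illustration; they are not drawn from the paper's dataset:

```python
import numpy as np

# Illustrative sketch (not the authors' actual data or model): regress a
# city's unsheltered homeless count on the unemployment rate and a housing
# affordability index, as the abstract describes, using ordinary least squares.
rng = np.random.default_rng(0)
n = 120                                            # hypothetical observations
unemployment = rng.uniform(3.0, 12.0, n)           # percent
affordability = rng.uniform(0.2, 1.0, n)           # higher = more affordable
noise = rng.normal(0.0, 50.0, n)
unsheltered = 200 + 40.0 * unemployment - 300.0 * affordability + noise

# Design matrix with an intercept column; solve the least-squares problem.
X = np.column_stack([np.ones(n), unemployment, affordability])
beta, *_ = np.linalg.lstsq(X, unsheltered, rcond=None)
print(beta)  # [intercept, unemployment effect, affordability effect]
```

On this synthetic data the fitted signs recover the assumed relationships: a positive coefficient on unemployment and a negative one on affordability.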

Keywords: crowdsourcing, homelessness, socio-economic policies, statistical regression

Procedia PDF Downloads 71
1996 The Contact between a Rigid Substrate and a Thick Elastic Layer

Authors: Nicola Menga, Giuseppe Carbone

Abstract:

Although contact mechanics has traditionally focused on the study of contacts between half-spaces, it has recently been pointed out that, in the presence of elastic layers of finite thickness, the results of the contact problem show significant differences in the main contact quantities (e.g. contact area, penetration, mean pressure). Indeed, a wide range of industrial applications call for this kind of study, such as seal leakage prediction or pressure-sensitive coatings for electrical applications. In this work, we focus on the contact between a rigid profile and an elastic layer of thickness h confined under two different configurations: a rigid constraint and an applied uniform pressure. The elastic problem at hand has been formalized following the Green's function method and then solved numerically by means of a matrix inversion. We study different contact conditions, both considering and neglecting adhesive interactions at the interface. This leads to different solution techniques: the equilibrium solution for adhesive contacts is found, in terms of contact area for a given penetration, by making the total free energy of the system stationary; adhesiveless contacts, instead, are addressed by defining an equilibrium criterion, again on the contact area, relying on the fracture-mechanics stress intensity factor KI. In particular, we make KI vanish at the edges of the contact area, as is characteristic of adhesiveless elastic contacts. The results are obtained in terms of contact area, penetration, and mean pressure for both adhesive and adhesiveless contact conditions. As expected, under a uniform applied pressure the layer turns out to be much more compliant than the rigidly constrained one. Indeed, we observed that the peak value of the contact pressure, for both the adhesive and adhesiveless conditions, is much higher in the rigidly constrained configuration than in the case of applied uniform pressure.
Furthermore, we observed that, for small contact areas, both systems behave in the same way, and pull-off occurs at approximately the same contact area and mean contact pressure. This is an expected result, since in this condition the ratio between the layer thickness and the contact area is very high, and both layer configurations recover the half-space behaviour, where the occurrence of pull-off is mainly controlled by the adhesive interactions, which are kept constant among the cases.
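The two equilibrium conditions described above can be written compactly as follows. The notation is ours, introduced for illustration: A is the contact area, δ the penetration, Δγ the work of adhesion, and σ(x) the contact pressure near the contact edge x = a.

```latex
% Adhesive case: contact area from stationarity of the total free energy
\frac{\partial E_{\mathrm{tot}}}{\partial A}\bigg|_{\delta} = 0,
\qquad E_{\mathrm{tot}} = E_{\mathrm{el}} - \Delta\gamma\, A

% Adhesiveless case: no square-root stress singularity at the contact edge,
% i.e. the mode-I stress intensity factor vanishes
K_{I} = \lim_{x \to a^{-}} \sigma(x)\,\sqrt{2\pi\,(a - x)} = 0
```

The first condition balances the elastic energy released against the adhesion energy gained as the contact grows; the second removes the crack-like singularity that adhesion would otherwise sustain at the contact boundary.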

Keywords: contact mechanics, adhesion, friction, thick layer

Procedia PDF Downloads 492
1995 Effects of Feed Forms on Growth Pattern, Behavioural Responses and Fecal Microbial Load of Pigs Fed Diets Supplemented with Saccharomyces cerevisiae Probiotics

Authors: O. A. Adebiyi, A. O. Oni, A. O. K. Adesehinwa, I. O. Adejumo

Abstract:

Over a period of forty-nine (49) days, twenty-four (24) growing pigs (Landrace x Large White) with an average weight of 17 ± 2.1 kg were allocated to four experimental treatments: T1 (dry mash without probiotics), T2 (wet feed without probiotics), T3 (dry mash + Saccharomyces cerevisiae probiotics) and T4 (wet feed + Saccharomyces cerevisiae probiotics), each replicated three times with two pigs per replicate in a completely randomised design. The basal feed (dry feed) was formulated to meet the nutritional requirements of the animals, with a crude protein content of 18.00% and metabolisable energy of 2784.00 kcal/kg ME. Growth pattern, faecal microbial load and behavioural activities (eating, drinking, physical pen interaction and frequency of visiting the drinking troughs) were assessed. Pigs fed dry mash without probiotics (T1) had the highest daily feed intake among the experimental animals (1.10 kg), while pigs on supplemented diets (T3 and T4) had an average daily feed intake of 0.95 kg. However, the feed conversion ratio was significantly (p < 0.05) affected, with pigs on T3 having the lowest value of 6.26 compared to those on T4 (wet feed + Saccharomyces cerevisiae) with a mean of 7.41. Total organism counts varied significantly (p < 0.05), with pigs on T1, T2, T3 and T4 having mean values of 179.50 × 10⁶ cfu, 132.00 × 10⁶ cfu, 32.00 × 10⁶ cfu and 64.50 × 10⁶ cfu respectively. Coliform counts also differed significantly (p < 0.05) among the treatments, with corresponding values of 117.50 × 10⁶ cfu, 49.00 × 10⁶ cfu and 8.00 × 10⁶ cfu for pigs in T1, T2 and T4 respectively. Faecal Saccharomyces cerevisiae counts were significantly lower in pigs fed the supplemented diets than in their counterparts on the unsupplemented diets; this could be due to the inability of the yeast organisms to be voided easily through the faeces. The pigs in T1 spent the most time eating (7.88%), while their counterparts on T3 spent the least time eating.
The corresponding physical pen interaction times, expressed as a percentage of a day, for pigs in T1, T2, T3 and T4 were 6.22%, 5.92%, 4.04% and 4.80% respectively. The behavioural responses exhibited by the T3 pigs show that only a small amount of dry feed supplemented with probiotics is needed for better performance. Water intake increased as a result of the dryness of the feed, with a consequent decrease in pen interaction, and more time was spent resting than engaging in possible vices such as fighting or tail biting. Overall, pigs fed dry feed supplemented with Saccharomyces cerevisiae probiotics (T3) had the best performance and a lower faecal microbial load than wet-fed pigs, whether supplemented with Saccharomyces cerevisiae or not.
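The feed conversion ratio (FCR) reported above is total feed intake divided by total live-weight gain over the trial, so a lower value means better feed utilisation. The sketch below illustrates the arithmetic; the weight-gain figure is a back-calculated assumption for illustration, not a measurement from the study:

```python
def feed_conversion_ratio(daily_feed_kg, days, weight_gain_kg):
    """Return kg of feed consumed per kg of live-weight gain."""
    return (daily_feed_kg * days) / weight_gain_kg

# T3 pigs ate ~0.95 kg/day for 49 days; an FCR of 6.26 implies roughly
# 7.44 kg of gain over the trial (assumed here, not reported in the abstract).
fcr_t3 = feed_conversion_ratio(0.95, 49, 7.44)
print(round(fcr_t3, 2))
```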

Keywords: behaviour, feed forms, feed utilization, growth, microbial

Procedia PDF Downloads 333
1994 The Effectiveness of Blended Learning in Pre-Registration Nurse Education: A Mixed Methods Systematic Review and Meta-Analysis

Authors: Albert Amagyei, Julia Carroll, Amanda R. Amorim Adegboye, Laura Strumidlo, Rosie Kneafsey

Abstract:

Introduction: Classroom-based learning has persisted as the mainstream model of pre-registration nurse education. This model is often rigid, teacher-centered, and unable to support active learning and the practical learning needs of nursing students. Health Education England (HEE), a public body of the Department of Health and Social Care, hypothesises that blended learning (BL) programmes may address health system and nursing profession challenges, such as nursing shortages and a lack of digital expertise, by exploring opportunities for providing predominantly online, remote-access study, which may increase nursing student recruitment by offering alternative pathways into nursing beyond the traditional classroom route. This study will provide evidence for blended learning strategies adopted in nursing education and will examine nursing students' learning experiences with respect to the challenges and opportunities of using blended learning within nursing education. Objective: This review will explore the challenges and opportunities of BL within pre-registration nurse education from the student's perspective. Methods: The search was completed within five databases. Eligible studies were appraised independently by four reviewers. The JBI convergent segregated approach for mixed-methods reviews was used to assess and synthesize the data. The study's protocol has been registered with the International Prospective Register of Systematic Reviews (PROSPERO: CRD42023423532). Results: Twenty-seven (27) studies (21 quantitative and 6 qualitative) were included in the review. The study confirmed that BL positively impacts nursing students' learning outcomes, as demonstrated by the findings of the meta-analysis and meta-synthesis. Conclusion: The review compared BL to traditional learning, simulation, laboratory, and online learning with respect to nursing students' learning and programme outcomes, as well as learning behaviour and experience.
The results show that BL could effectively improve nursing students’ knowledge, academic achievement, critical skills, and clinical performance as well as enhance learner satisfaction and programme retention. The review findings outline that students’ background characteristics, BL design, and format significantly impact the success of the BL nursing programme.
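The meta-analysis step mentioned in the Results can be illustrated with the standard inverse-variance (fixed-effect) pooling of study effect sizes. The effect sizes and standard errors below are invented for illustration; they are not the review's data:

```python
import math

def pool_fixed_effect(effects, std_errors):
    """Inverse-variance weighted mean effect and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical standardised mean differences from three studies.
effects = [0.40, 0.55, 0.30]
ses = [0.10, 0.20, 0.15]
est, se = pool_fixed_effect(effects, ses)
print(round(est, 3), round(se, 3))
```

More precise studies (smaller standard errors) receive larger weights, so the pooled estimate sits closest to the best-measured effects.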

Keywords: nursing student, blended learning, pre-registration nurse education, online learning

Procedia PDF Downloads 33
1993 Patient Experience in a Healthcare Setting: How Patients' Encounters Make for Better Value Co-creation

Authors: Kingsley Agyapong

Abstract:

Research conducted in recent years has delved into the concept of patient-perceived value within the context of co-creation, particularly in the realm of doctor-patient interactions within healthcare settings. However, existing scholarly discourse lacks exploration regarding the emergence of patient-derived value in the co-creation process, specifically within encounters involving patients and stakeholders such as doctors, nurses, pharmacists, and other healthcare professionals. This study aims to fill this gap by elucidating the perspectives of patients regarding the value they derive from their interactions with multiple stakeholders in the delivery of healthcare services. The fieldwork was conducted at a university clinic located in Ghana. Data collection procedures involved conducting 20 individual interviews with key informants on distinct value accrued from co-creation practices and interactions with stakeholders. The key informants consisted of patients receiving care at the university clinic during the Malaria Treatment Process. Three themes emerged from both the existing literature and the empirical data collected. The first theme, labeled as "patient value needs in co-creation," encapsulates elements such as communication effectiveness, interpersonal interaction quality, treatment efficacy, and enhancements to the overall quality of life experienced by patients during their interactions with healthcare professionals. The second theme, designated as "services that enhance patients' experience in value co-creation," pertains to patients' perceptions of services that contribute favourably to co-creation experiences, including initiatives related to health promotion and the provision of various in-house services that patients deem pertinent for augmenting their overall experiences. 
The third theme, titled "Challenges in the co-creation of patients' value," delineates obstacles encountered within the co-creation process, including health professionals' difficulties in effectively following up with patients scheduled for review and prolonged waiting times for healthcare delivery. This study contributes to the understanding of patients' perceptions of value within the co-creation process during their interactions with service providers, particularly healthcare professionals. By gaining deeper insight into this process, healthcare providers can enhance the delivery of patient-centered care, thereby leading to improved healthcare outcomes. The study further offers managerial implications derived from its findings, providing actionable insights for healthcare managers and policymakers aiming to optimize patient value creation in healthcare services. Furthermore, it suggests avenues for future research endeavors within healthcare settings.

Keywords: patient, healthcare, co-creation, malaria

Procedia PDF Downloads 29
1992 User Experience Evaluation on the Usage of Commuter Line Train Ticket Vending Machine

Authors: Faishal Muhammad, Erlinda Muslim, Nadia Faradilla, Sayidul Fikri

Abstract:

To deal with the increasing demand for mass transportation, PT Kereta Commuter Jabodetabek (KCJ) has implemented the Commuter Vending Machine (C-VIM). Against that background, the C-VIM was introduced as a substitute for conventional ticket windows, with the purposes of making the transaction process more efficient and introducing self-service technology to commuter line users. However, this implementation has caused problems and long queues when users are not accustomed to the machine. The objective of this research is to evaluate user experience after using the commuter vending machine, with the goal of analysing the existing user experience problems and achieving a better user experience design. The evaluation was done by giving task scenarios covering the features offered by the machine: daily insured ticket sales, ticket refund, and multi-trip card top-up. Twenty people, divided into two groups of respondents (experienced and inexperienced users), each consisting of 5 males and 5 females, were involved in this research in order to test whether there is a significant difference between the two groups in the measurements. User experience was measured both quantitatively and qualitatively. The quantitative measurement includes user performance metrics such as task success, time on task, errors, efficiency, and learnability. The qualitative measurement includes the System Usability Scale questionnaire (SUS), the Questionnaire for User Interface Satisfaction (QUIS), and retrospective think-aloud (RTA). The usability performance metrics show that 4 out of 5 indicators are significantly different between the two groups, indicating that the inexperienced group has problems when using the C-VIM. Conventional ticket windows also show better usability performance metrics than the C-VIM.
From the data processing, the experienced group gave a SUS score of 62, with an acceptability scale of 'marginal low', a grade scale of 'D', and an adjective rating of 'good', while the inexperienced group gave a SUS score of 51, with an acceptability scale of 'marginal low', a grade scale of 'F', and an adjective rating of 'ok'. Both groups thus gave a low score on the System Usability Scale. The QUIS score of the experienced group was 69.18 and that of the inexperienced group 64.20; an average QUIS score below 70 indicates a problem with the user interface. RTA was done through interview protocols to obtain user experience issues encountered when using the C-VIM. The issues obtained were then sorted using the Pareto concept and diagram. The solution proposed in this research is an interface redesign using an activity relationship chart. This method resulted in a better interface, with an average SUS score of 72.25, an acceptability scale of 'acceptable', a grade scale of 'B', and an adjective rating of 'excellent'. The time-on-task performance metric also shows significantly better times with the new interface design. The results of this study show that the C-VIM does not yet offer good performance and user experience.
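The SUS scores reported above follow the standard scoring rule: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the summed contributions are multiplied by 2.5 to give a 0-100 score. A minimal sketch, with invented example responses:

```python
def sus_score(responses):
    """Score a System Usability Scale questionnaire.

    responses: 10 Likert answers (1-5), item 1 first.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([3] * 10))  # neutral answers on every item -> 50.0
```

A respondent's group-average score is then the mean of these per-respondent scores, which is how the figures of 62, 51, and 72.25 above would be obtained.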

Keywords: activity relationship chart, commuter line vending machine, system usability scale, usability performance metrics, user experience evaluation

Procedia PDF Downloads 247
1991 Locus of Control and Self-Esteem as Predictors of Maternal and Child Healthcare Services Utilization in Nigeria

Authors: Josephine Aikpitanyi, Friday Okonofua, Lorretta Ntoimo, Sandy Tubeuf

Abstract:

Every day, 800 women die from conditions related to pregnancy and childbirth, resulting in an estimated 300,000 maternal deaths worldwide per year. Over 99 percent of all maternal deaths occur in developing countries, with more than half of them occurring in sub-Saharan Africa. Nigeria being the most populous nation in sub-Saharan Africa bears a significant burden of worsening maternal and child health outcomes with a maternal mortality rate of 917 per 100,000 live births and child mortality rate of 117 per 1,000 live births. While several studies have documented that financial barriers disproportionately discourage poor women from seeking needed maternal and child healthcare, other studies have indicated otherwise. Evidence shows that there are instances where health facilities with skilled healthcare providers exist, and yet maternal, and child health outcomes remain abysmally low, indicating the presence of non-cognitive and behavioural factors that may affect the utilization of healthcare services. This study investigated the influence of locus of control and self-esteem on utilization of maternal and child healthcare services in Nigeria. Specifically, it explored the differences in utilization of antenatal care, skilled birth care, postnatal care, and child vaccination by women having an internal and external locus of control and women having high and low self-esteem. We collected information on non-cognitive traits of 1411 randomly selected women, along with information on utilization of the various indicators of maternal and child healthcare. Estimating logistic regression models for various components of healthcare services utilization, we found that women’s internal locus of control was a significant predictor of utilization of antenatal care, skilled birth care, and completion of child vaccination. 
We also found that having high self-esteem was a significant predictor of utilization of antenatal care, postnatal care, and completion of child vaccination after adjusting for other control variables. By improving our understanding of non-cognitive traits as possible barriers to maternal and child healthcare utilization, our findings offer important insights for enhancing participant engagement in intervention programs that are initiated to improve maternal and child health outcomes in low- and middle-income countries.
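The association estimated by the logistic-regression models described above can be illustrated with an odds ratio from a 2x2 table: the odds of using antenatal care for women with an internal locus of control versus an external one. The counts below are invented for illustration and are not from the study's 1,411 respondents:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio from a 2x2 table.

    a, b = exposed group: used / did not use the service
    c, d = unexposed group: used / did not use the service
    """
    return (a * d) / (b * c)

# Hypothetical counts: internal locus 420 used ANC, 180 did not;
# external locus 390 used, 421 did not.
or_anc = odds_ratio(420, 180, 390, 421)
log_or = math.log(or_anc)  # the scale of a logistic-regression coefficient
print(round(or_anc, 2))
```

A log odds ratio greater than zero corresponds to the positive coefficient a logistic regression would report for internal locus of control in this synthetic example.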

Keywords: behavioural economics, health-seeking behaviour, locus of control and self-esteem, maternal and child healthcare, non-cognitive traits, healthcare utilization

Procedia PDF Downloads 147
1990 Experimental Study and Numerical Modelling of Failure of Rocks Typical for Kuzbass Coal Basin

Authors: Mikhail O. Eremin

Abstract:

The present work is devoted to an experimental study and numerical modelling of the failure of rocks typical of the Kuzbass coal basin (Russia). The main goal was to determine the strength and deformation characteristics of the rocks on the basis of uniaxial compression and three-point bending tests, and then to build a mathematical model of the failure process for both types of loading. Depending on their particular physical-mechanical characteristics, typical rocks of the Kuzbass coal basin (sandstones, siltstones, mudstones, etc., of the different series – Kolchuginsk, Tarbagansk, Balohonsk) manifest brittle and quasi-brittle failure. The strength characteristics in both tension and compression were found; other characteristics were obtained from the experiments or taken from literature reviews. On the basis of the obtained characteristics and the structure (obtained from microscopy), mathematical and structural models were built and numerical modelling of failure under the different types of loading was carried out. The effective characteristics and failure character obtained from the modelling correspond to the experiments, and thus the mathematical model was verified. An Instron 1185 machine was used to carry out the experiments. The mathematical model includes the fundamental conservation laws of solid mechanics – mass, momentum, and energy. Each rock has a markedly anisotropic structure; however, each crystallite may be considered isotropic, so the rock as a whole can be modelled as a quasi-isotropic aggregate. This allows Hooke's law to be used inside each crystallite while still explicitly accounting for the rock structure and the stress-strain state under loading. Inelastic behaviour is described in the framework of two different models: the von Mises yield criterion and a modified Drucker-Prager yield criterion. A damage accumulation theory is also implemented in order to describe the failure process.
The effective rock characteristics obtained are then used for modelling the evolution of a rock mass when mining is carried out, either in an open pit or through an underground opening.
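The two yield criteria named above can be sketched in terms of the stress invariants I1 (trace of the stress tensor) and J2 (second invariant of its deviator). The material constants below are placeholders for illustration, not values fitted to the Kuzbass rocks:

```python
import numpy as np

def invariants(stress):
    """First invariant I1 and second deviatoric invariant J2 of a 3x3 tensor."""
    i1 = np.trace(stress)
    dev = stress - (i1 / 3.0) * np.eye(3)
    j2 = 0.5 * np.tensordot(dev, dev)  # 0.5 * dev_ij * dev_ij
    return i1, j2

def von_mises_yields(stress, sigma_y):
    """Pressure-insensitive criterion: yield when sqrt(3 J2) >= sigma_y."""
    _, j2 = invariants(stress)
    return np.sqrt(3.0 * j2) >= sigma_y

def drucker_prager_yields(stress, alpha, k):
    """Pressure-sensitive criterion: yield when alpha*I1 + sqrt(J2) >= k."""
    i1, j2 = invariants(stress)
    return alpha * i1 + np.sqrt(j2) >= k

# Uniaxial stress of 100 MPa: the von Mises equivalent stress is exactly 100.
uniaxial = np.diag([100.0, 0.0, 0.0])
print(von_mises_yields(uniaxial, 90.0), von_mises_yields(uniaxial, 110.0))
```

The Drucker-Prager form captures the higher strength of rocks under confining pressure, which the pressure-insensitive von Mises criterion cannot.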

Keywords: damage accumulation, Drucker-Prager yield criterion, failure, mathematical modelling, three-point bending, uniaxial compression

Procedia PDF Downloads 158
1989 Transparency Obligations under the AI Act Proposal: A Critical Legal Analysis

Authors: Michael Lognoul

Abstract:

In April 2021, the European Commission released its AI Act Proposal, the first policy proposal at the European Union level to target AI systems comprehensively, in a horizontal manner. This Proposal notably aims to achieve an ecosystem of trust in the European Union regarding AI, based on the respect of fundamental rights. Among many other requirements, the AI Act Proposal aims to impose several generic transparency obligations on all AI systems to the benefit of the natural persons facing those systems (e.g. information on the AI nature of systems in case of an interaction with a human). The Proposal also provides for more stringent transparency obligations, specific to AI systems that qualify as high-risk, to the benefit of their users, notably on the characteristics, capabilities, and limitations of the AI systems they use. Against that background, this research firstly presents all such transparency requirements in turn, as well as related obligations, such as the proposed obligations on record keeping. Secondly, it focuses on a legal analysis of their scope of application, of the content of the obligations, and of their practical implications. On the scope of the transparency obligations tailored for high-risk AI systems, the research notes that it seems relatively narrow, given the proposed legal definition of the notion of users of AI systems. Hence, where end-users do not qualify as users, they may only receive very limited information; this element might raise concern regarding the objective of the Proposal. On the content of the transparency obligations, the research highlights that the information that should benefit users of high-risk AI systems is both very broad and, from a technical perspective, very specific. Therefore, the information required under those obligations seems to create, prima facie, an adequate framework to ensure trust for users of high-risk AI systems.
However, on the practical implications of these transparency obligations, the research notes that concern arises due to the potential illiteracy of high-risk AI system users. They might not have sufficient technical expertise to fully understand the information provided to them, despite the wording of the Proposal, which requires that information be comprehensible to its recipients (i.e., users). On this matter, the research points out that there could be, more broadly, an important divergence between the level of detail of the information required by the Proposal and the level of expertise of users of high-risk AI systems. As a conclusion, the research provides policy recommendations to tackle (part of) the issues highlighted. It notably recommends broadening the scope of the transparency requirements for high-risk AI systems to encompass end-users. It also suggests that the principles of explanation, as put forward in the Guidelines for Trustworthy AI of the High-Level Expert Group, should be included in the Proposal in addition to the transparency obligations.

Keywords: AI Act proposal, explainability of AI, high-risk AI systems, transparency requirements

Procedia PDF Downloads 278
1988 Infusion of Skills for Undergraduate Scholarship into Teacher Education: Two Case Studies in New York and Florida

Authors: Tunde Szecsi, Janka Szilagyi

Abstract:

Students majoring in education are underrepresented in undergraduate scholarship. To enable and encourage teacher candidates to engage in scholarly activities, it is essential to infuse skills such as problem-solving, critical thinking, oral and written communication, collaboration, and the utilization of information literacy into courses in teacher preparation programs. In this empirical study, we examined two teacher education programs – one in New York State and one in Florida – in terms of their approaches to the course-based infusion of skills for undergraduate research, and the effectiveness of this infusion. First, course-related documents such as syllabi, assignment descriptions, and course activities were reviewed and analyzed. The goal of the document analysis was to identify and describe the targeted skills and the pedagogical approaches and strategies for promoting research skills in teacher candidates. Next, a selection of teacher candidates' scholarly products from the institution in Florida was used as a data set to examine teacher candidates' skill development in the context of the identified assignments. This dataset was analyzed both quantitatively and qualitatively to describe the changes that occurred in teacher candidates' critical thinking, communication, and information literacy skills, and to uncover patterns in the skill development at the two institutions. Descriptive statistics were calculated to explore the changes in these skills of teacher candidates over a period of three years. The findings based on data from the teacher education program in Florida indicated a steady gain in written communication and critical thinking and a modest increase in information literacy. At the institution in New York, candidates' submission and success rates on the edTPA, a New York State teacher certification exam, were used as a measure of scholarly skills.
Overall, although different approaches were used for infusing the development of scholarly skills in the courses, the results suggest that a holistic and well-orchestrated infusion of the skills into most courses in the teacher education program might result in steadily developing scholarly skills. These results offered essential implications for teacher education programs in terms of further improvements in teacher candidates’ skills for engaging in undergraduate research and scholarship. In this presentation, our purpose is to showcase two approaches developed by two teacher education programs to demonstrate how diverse approaches toward the promotion of undergraduate scholarship activities are responsive to the context of the teacher preparation programs.

Keywords: critical thinking, pedagogical strategies, teacher education, undergraduate student research

Procedia PDF Downloads 140
1987 Economics of Precision Mechanization in Wine and Table Grape Production

Authors: Dean A. McCorkle, Ed W. Hellman, Rebekka M. Dudensing, Dan D. Hanselka

Abstract:

The motivation for this study centers on the labor- and cost-intensive nature of wine and table grape production in the U.S., and the potential opportunities for precision mechanization using robotics to augment those production tasks that are labor-intensive. The objectives of this study are to evaluate the economic viability of grape production in five U.S. states under current operating conditions, identify common production challenges and tasks that could be augmented with new technology, and quantify a maximum price for new technology that growers would be able to pay. Wine and table grape production is primed for precision mechanization technology as it faces a variety of production and labor issues. Methodology: Using a grower panel process, this project includes the development of a representative wine grape vineyard in five states and a representative table grape vineyard in California. The panels provided production, budget, and financial-related information that are typical for vineyards in their area. Labor costs for various production tasks are of particular interest. Using the data from the representative budget, 10-year projected financial statements have been developed for the representative vineyard and evaluated using a stochastic simulation model approach. Labor costs for selected vineyard production tasks were evaluated for the potential of new precision mechanization technology being developed. These tasks were selected based on a variety of factors, including input from the panel members, and the extent to which the development of new technology was deemed to be feasible. The net present value (NPV) of the labor cost over seven years for each production task was derived. This allowed for the calculation of a maximum price for new technology whereby the NPV of labor costs would equal the NPV of purchasing, owning, and operating new technology. 
Expected Results: The results from the stochastic model will show the projected financial health of each representative vineyard over the 2015-2024 timeframe. Investigators have developed a preliminary list of production tasks that have the potential for precision mechanization. For each task, the labor requirements, labor costs, and the maximum price for new technology will be presented and discussed. Together, these results will allow technology developers to focus and prioritize their research and development efforts for wine and table grape vineyards, and suggest opportunities to strengthen vineyard profitability and long-term viability using precision mechanization.
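The break-even logic described above can be sketched as a net present value comparison: the maximum technology price is the price at which the NPV of the displaced labor cost equals the NPV of owning and operating the machine over seven years. All cash-flow figures and the discount rate below are invented for illustration, not results from the grower panels:

```python
def npv(rate, cashflows):
    """Present value of cashflows, where cashflows[t] occurs at end of year t+1."""
    return sum(cf / (1.0 + rate) ** (t + 1) for t, cf in enumerate(cashflows))

rate = 0.06                        # assumed discount rate
labor_cost = [12000.0] * 7         # annual labor cost displaced, 7 years
operating_cost = [1500.0] * 7      # annual cost of running the machine

# Break-even purchase price: NPV of labor saved minus NPV of operating costs.
max_price = npv(rate, labor_cost) - npv(rate, operating_cost)
print(round(max_price, 2))
```

At the break-even price, purchasing the technology and continuing to hire labor have the same discounted cost; any lower price makes the technology the cheaper option under these assumptions.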

Keywords: net present value, robotic technology, stochastic simulation, wine and table grapes

Procedia PDF Downloads 245
1986 Understanding the Relationship between Community and the Preservation of Cultural Landscape - Focusing on Organically Evolved Landscapes

Authors: Adhithy Menon E., Biju C. A.

Abstract:

The concept of preserving heritage monuments was first brought to public attention in the 1960s. In the 1990s, the concept of cultural landscapes gained importance, emphasizing the role of culture and heritage in the context of the landscape. This paper is primarily concerned with the second category of cultural landscapes, organically evolved landscapes, as they represent a complex network of tangible elements, intangible elements, and the environment, and the connections these share with the communities in which the landscapes are situated. The United Nations Educational, Scientific and Cultural Organization (UNESCO) has identified 39 cultural sites as being in danger, including the Iranian city of Bam and the historic city of Zabid in Yemen. To ensure their protection in the future, it is necessary to conduct a detailed analysis of the factors contributing to this degradation. An analysis of selected cultural landscapes from around the world is conducted to determine which parameters cause their degradation. The paper follows three objectives: first, to understand cultural landscapes and their importance for development, and to examine the criteria for identifying cultural landscapes, their classifications, and the agencies that focus on their protection; second, to identify and analyze the parameters contributing to the deterioration of cultural landscapes based on literature and case studies (the cultural landscapes of Sintra, Rio de Janeiro, and Varanasi); and finally, to develop strategies to enhance deteriorating cultural landscapes based on these parameters.
The major findings of the study concern the impact of community on the derived parameters: integrity (natural factors, natural disasters, demolition of structures, deterioration of materials), authenticity (living elements, sense of place, building techniques, religious context, artistic expression), public participation (revenue, dependence on locale), awareness (demolition of structures, resource management), disaster management, environmental impact, and maintenance of the cultural landscape (linkages with other sites, dependence on locale, revenue, resource management). The parameters of authenticity, public participation, awareness, and maintenance are directly related to the community in which the cultural landscape is located. Therefore, by focusing on the community and addressing the parameters identified, the deterioration curve of cultural landscapes can be altered.

Keywords: community, cultural landscapes, heritage, organically evolved, public participation

Procedia PDF Downloads 66
1985 Identifying Confirmed Resemblances in Problem-Solving Engineering, Both in the Past and Present

Authors: Colin Schmidt, Adrien Lecossier, Pascal Crubleau, Simon Richir

Abstract:

Introduction: The widespread availability of artificial intelligence, exemplified by Generative Pre-trained Transformers (GPT) relying on large language models (LLMs), has caused a seismic shift in the realm of knowledge. Everyone now has the capacity to swiftly learn how these models can serve them well or not. Today, conversational AI like ChatGPT is grounded in neural transformer models, a significant advance in natural language processing facilitated by the emergence of renowned LLMs built on the transformer architecture. Inventiveness of an LLM: OpenAI's GPT-3 stands as a premier LLM, capable of handling a broad spectrum of natural language processing tasks without requiring fine-tuning and reliably producing text that reads as if authored by humans. However, even with an understanding of how LLMs respond to questions, there may be lurking behind OpenAI's seemingly endless responses an inventive model yet to be uncovered: some unforeseen reasoning may emerge from the interconnection of neural networks. Just as a Soviet researcher in the 1940s asked whether inventions share common factors, enabling an understanding of how and according to what principles humans create them, it is equally legitimate today to explore whether solutions provided by LLMs to complex problems also share common denominators. Theory of Inventive Problem Solving (TRIZ): We revisit some fundamentals of TRIZ and how Genrich Altshuller was inspired by the idea that inventions and innovations are essential means to solve societal problems. It is crucial to note that traditional problem-solving methods often fall short in discovering innovative solutions: the design team is frequently hampered by psychological barriers stemming from confinement within a highly specialized knowledge domain that is difficult to question. We presume that ChatGPT utilizes the TRIZ 40 inventive principles.
Hence, the objective of this research is to decipher the inventive model of LLMs, particularly that of ChatGPT, through a comparative study. This will enhance the efficiency of sustainable innovation processes and shed light on how the construction of a solution to a complex problem is devised. Description of the Experimental Protocol: To confirm or reject our main hypothesis, i.e., to determine whether ChatGPT uses TRIZ, we follow a stringent protocol, detailed in the paper, drawing on insights from a panel of two TRIZ experts. Conclusion and Future Directions: In this endeavor, we sought to comprehend how an LLM like GPT addresses complex challenges. Our goal was to analyze the inventive model of responses provided by an LLM, specifically ChatGPT, by comparing it to an existing standard model: the TRIZ 40 inventive principles. Problem-solving remains the central focus throughout.

Keywords: artificial intelligence, Triz, ChatGPT, inventiveness, problem-solving

Procedia PDF Downloads 46
1984 The Response of Mammal Populations to Abrupt Changes in Fire Regimes in Montane Landscapes of South-Eastern Australia

Authors: Jeremy Johnson, Craig Nitschke, Luke Kelly

Abstract:

Fire regimes, climate, and topographic gradients interact to influence ecosystem structure and function across fire-prone, montane landscapes worldwide. Biota have developed a range of adaptations to historic fire regime thresholds, which allow them to persist in these environments. In south-eastern Australia, a signal of fire regime change is emerging across these landscapes, and anthropogenic climate change is likely to be one of the main drivers of the increase in burnt area and more frequent wildfire over the last 25 years. This shift has the potential to modify vegetation structure and composition at broad scales, which may lead to landscape patterns to which biota are not adapted, increasing the likelihood of local extirpation of some mammal species. This study aimed to address concerns related to the influence of abrupt changes in fire regimes on mammal populations in montane landscapes. It first examined the impact of climate, topography, and vegetation on fire patterns and then explored the consequences of these changes for mammal populations and their habitats. Field studies were undertaken across diverse vegetation, fire severity, and fire frequency gradients, utilising camera trapping and passive acoustic monitoring methodologies and the collection of fine-scale vegetation data. Results show that drought is a primary contributor to fire regime shifts at the landscape scale, while topographic factors have a variable influence on wildfire occurrence at finer scales. Frequent, high-severity wildfire influenced forest structure and composition at broad spatial scales, and at fine scales, it reduced the occurrence of hollow-bearing trees and promoted coarse woody debris. Mammals responded differently to shifts in forest structure and composition depending on their habitat requirements. This study highlights the complex interplay between fire regimes, environmental gradients, and biotic adaptations across temporal and spatial scales.
It emphasizes the importance of understanding complex interactions to effectively manage fire-prone ecosystems in the face of climate change.

Keywords: fire, ecology, biodiversity, landscape ecology

Procedia PDF Downloads 52
1983 Description of a Structural Health Monitoring and Control System Using Open Building Information Modeling

Authors: Wahhaj Ahmed Farooqi, Bilal Ahmad, Sandra Maritza Zambrano Bernal

Abstract:

From the viewpoint of structural engineering, monitoring structural responses over time is of great importance, particularly with respect to recent developments in construction technologies. Advanced computing tools have enabled researchers to better implement structural health monitoring (SHM) and control systems. In the last decade, building information modeling (BIM) has substantially enhanced the workflow of planning and operating engineering structures. Typically, building information can be stored and exchanged via model files based on the Industry Foundation Classes (IFC) standard. In this study, an approach for semantic modeling of SHM and control systems is integrated into the BIM methodology using the IFC standard. For validation of the modeling approach, a laboratory test structure, a four-story shear frame structure, is modeled using a conventional BIM software tool. An IFC schema extension is applied to describe information related to monitoring and control of a prototype SHM and control system installed on the laboratory test structure. The SHM and control system is described by a semantic model applying the Unified Modeling Language (UML); subsequently, the semantic model is mapped into the IFC schema. The test structure is composed of four aluminum slabs, and the plate-to-column connections are fully fixed. In the center of the top story, a semi-active tuned liquid column damper (TLCD) is installed; the TLCD is used to reduce structural responses, i.e., dynamic vibrations and displacements. The prototype SHM and control system is composed of wireless sensor nodes. For testing the SHM and control system, acceleration responses are automatically recorded by the sensor nodes, which are equipped with accelerometers, and analyzed using embedded computing.
As a result, SHM and control systems can be described within open BIM, and dynamic responses and damage information can be stored, documented, and exchanged on the formal basis of the IFC standard.
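The mapping from a semantic model of sensor nodes to IFC-style entity instances might look roughly like the following sketch. This is a loose illustration only: the `SensorNode` dataclass, the `IFCSENSOR` instance line, and its attribute layout are simplified stand-ins invented for the example, not the paper's actual schema extension or valid IFC4 syntax.

```python
# Illustrative sketch only: a minimal semantic model of an SHM system
# (wireless sensor nodes with accelerometers) serialized as IFC-like
# STEP Physical File instance lines. The entity name and attribute
# order are simplified stand-ins, not the IFC4 definition of IfcSensor.
from dataclasses import dataclass

@dataclass
class SensorNode:
    name: str      # node identifier, e.g. "WSN-1"
    story: int     # story of the shear frame the node is mounted on
    channel: str   # measured quantity, e.g. "ACCELERATION"

def to_ifc_like(nodes):
    """Map the semantic model to IFC-style SPF instance lines."""
    lines = []
    for i, n in enumerate(nodes, start=1):
        lines.append(f"#{i}=IFCSENSOR('{n.name}',$,'{n.channel}','STORY{n.story}');")
    return "\n".join(lines)

# One accelerometer node per story of the four-story test structure
nodes = [SensorNode(f"WSN-{k}", story=k, channel="ACCELERATION")
         for k in range(1, 5)]
print(to_ifc_like(nodes))
```

The point of such a mapping is that monitoring information becomes exchangeable alongside the rest of the building model, rather than living in a separate, proprietary format.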

Keywords: structural health monitoring, open building information modeling, industry foundation classes, unified modeling language, semi-active tuned liquid column damper, nondestructive testing

Procedia PDF Downloads 131
1982 A Study on the Effect of Design Factors of Slim Keyboard’s Tactile Feedback

Authors: Kai-Chieh Lin, Chih-Fu Wu, Hsiang Ling Hsu, Yung-Hsiang Tu, Chia-Chen Wu

Abstract:

With the rapid development of computer technology, the design of computers and keyboards has moved toward slimness. The change in mobile input devices directly influences users' behavior. Although multi-touch applications allow text entry through a virtual keyboard, the performance, feedback, and comfort of the technology are inferior to those of a traditional keyboard, and while manufacturers have launched mobile touch keyboards and projection keyboards, their performance has not been satisfying. Therefore, this study examined the design factors of slim pressure-sensitive keyboards. The factors were evaluated with an objective evaluation (accuracy and speed) and a subjective evaluation (operability, recognition, feedback, and difficulty) depending on the shape (circle, rectangle, and L-shaped), thickness (flat, 3 mm, and 6 mm), and actuation force (35±10 g, 60±10 g, and 85±10 g) of the keyboard. Moreover, MANOVA and Taguchi methods (using signal-to-noise ratios) were conducted to find the optimal level of each design factor. The participants were divided into two groups by typing speed (threshold: 30 words per minute). Considering the multitude of variables and levels, the experiments were implemented using a fractional factorial design, and a representative model of the research samples was established for input task testing. The findings showed that participants with low typing speed primarily relied on vision to recognize the keys, whereas those with high typing speed relied on tactile feedback, which was affected by the thickness and actuation force of the keys. In the objective and subjective evaluations, the combination of design factors yielding the highest performance and satisfaction (L-shaped, 3 mm, and 60±10 g) was identified as the optimal combination. The learning curve was analyzed in comparison with a traditional standard keyboard to investigate the influence of user experience on keyboard operation.
The results indicated that even the optimal combination provided input performance inferior to that of a standard keyboard. The results could serve as a reference for the development of related products in industry and could be applied broadly to touch devices and input interfaces with which people interact.
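The Taguchi signal-to-noise analysis mentioned above can be illustrated with a short sketch. The abstract does not state which S/N formulation was used; the "larger-is-better" ratio below is the conventional choice when the response (e.g., typing accuracy or speed) should be maximized, and the sample scores are invented for the example.

```python
# A hedged sketch of the Taguchi "larger-is-better" signal-to-noise
# ratio used to rank factor levels: the level with the highest mean
# S/N across trials is taken as optimal. The formulation and the
# example scores are assumptions, not the paper's reported data.
import math

def sn_larger_is_better(ys):
    """S/N = -10 * log10( mean(1 / y_i^2) ); higher is better."""
    return -10 * math.log10(sum(1 / y**2 for y in ys) / len(ys))

# Example: accuracy scores from repeated trials of one
# factor-level combination (shape x thickness x force)
print(round(sn_larger_is_better([0.92, 0.95, 0.90]), 2))
```

Computing this ratio for each level of each factor and picking the level with the highest value is how the optimal combination (here, L-shaped, 3 mm, 60±10 g) would be selected.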

Keywords: input performance, mobile device, slim keyboard, tactile feedback

Procedia PDF Downloads 288
1981 Acoustic Emission for Investigation of Processes Occurring at Hydrogenation of Metallic Titanium

Authors: Anatoly A. Kuznetsov, Pavel G. Berezhko, Sergey M. Kunavin, Eugeny V. Zhilkin, Maxim V. Tsarev, Vyacheslav V. Yaroshenko, Valery V. Mokrushin, Olga Y. Yunchina, Sergey A. Mityashin

Abstract:

Acoustic emission is the short-time propagation of elastic waves generated by a quick energy release from sources localized inside a material. In particular, the acoustic emission phenomenon lies in the generation of acoustic waves resulting from the reconstruction of internal material structures. This phenomenon is observed during various physicochemical transformations, in particular those accompanying the hydrogenation of metals or intermetallic compounds, which makes it possible to study the parameters of these transformations by recording and analyzing the acoustic signals. It is known that, during the interaction of metals or intermetallics with hydrogen, the most intensive acoustic signals are generated by cracking or crumbling of an initially compact powder sample as the material's crystal structure changes under hydrogenation. This work is dedicated to the study of the changes occurring in metallic titanium samples during their interaction with hydrogen, as reflected in the accompanying acoustic emission signals. The subjects of investigation were specimens of metallic titanium in two initial forms: titanium sponge and fine titanium powder made from this sponge. The kinetics of the interaction of these materials with hydrogen, the acoustic emission signals accompanying the hydrogenation processes, and the structure of the materials before and after hydrogenation were investigated. It was determined that in both cases the interaction of metallic titanium with hydrogen is accompanied by acoustic emission signals of high amplitude, generated on reaching a certain value of the atomic ratio [H]/[Ti] in the solid phase because of metal cracking at the macrolevel. The typical sizes of the cracks are comparable with the particle sizes of the hydrogenated specimens.
The reason for cracking is internal stresses initiated in the sample by the increasing volume of the solid phase, which results from changes in the material's crystal lattice under hydrogenation. When titanium powder is used, the atomic ratio [H]/[Ti] in the solid phase corresponding to the maximum amplitude of the acoustic emission signal is, as a rule, higher than when titanium sponge is used.

Keywords: acoustic emission signal, cracking, hydrogenation, titanium specimen

Procedia PDF Downloads 368
1980 Cultural Collisions, Ethics and HIV: On Local Values in a Globalized Medical World

Authors: Norbert W. Paul

Abstract:

In 1988, parts of the scientific community still heralded findings supporting the view that AIDS was likely to remain largely a 'gay disease'. The value-laden terminology of some articles suggested that the rectum and fragile urethra are not sufficiently robust to provide a barrier against infectious fluids, especially body fluids contaminated with HIV, while the female vagina would provide natural protection against the injuries and trauma facilitating HIV infection. Anal intercourse was thus constituted not only as a dangerous but also as an unnatural practice, while penile-vaginal intercourse would follow natural design and thus be a relatively safe practice minimizing the risk of HIV. Statements like the latter were not uncommon in the early times of HIV/AIDS and contributed to captious certainties and an underestimation of heterosexual risks. Pseudo-scientific discourses on the origin of HIV were linked to local and global health politics in the 1980s. The pathways of infection were related to normative concepts like deviant, subcultural behavior, cultural otherness, and guilt, used to target, tag, and separate specific groups at risk from the 'normal' population. Controlling populations at risk, rather than controlling modes of transmission and the virus, became the top item on the agenda. Hence, the Thai strategy of coping with HIV/AIDS by acknowledging social and sexual practices as they were, not as they were imagined, has become a role model for successful prevention in the highly scandalized realm of sexually transmitted disease. By accepting the globalized character of local HIV risk and projecting the risk onto populations that are neither particularly vocal nor vested with the means to strive for health and justice, Thailand managed to culturally implement knowledge-based tools of prevention.
This paper argues that pertinent cultural collisions regarding our strategies for coping with HIV/AIDS are deeply rooted in misconceptions, misreadings, and scandalizations brought about in the early history of HIV in the 1980s. The Thai strategy is used to demonstrate how local values can be balanced against globalized health risks and used to effect prevention in which knowledge and norms are translated into local practices. Issues of global health and injustice are addressed in the final part of the paper, which deals with the achievability of health as a human right.

Keywords: bioethics, HIV, global health, justice

Procedia PDF Downloads 247
1979 A Systematic Map of the Research Trends in Wildfire Management in Mediterranean-Climate Regions

Authors: Renata Martins Pacheco, João Claro

Abstract:

Wildfires are becoming an increasing concern worldwide, causing substantial social, economic, and environmental disruptions. This situation is especially relevant in Mediterranean-climate regions, present on five continents, in which fire is not only a natural component of the environment but also perhaps one of the most important evolutionary forces. The rise in wildfire occurrences and their associated impacts suggests the need to identify knowledge gaps and enhance the base of scientific evidence on how managers and policymakers may act effectively to address them. Considering that the main goal of a systematic map is to collate and catalog a body of evidence to describe the state of knowledge for a specific topic, it is a suitable approach for this purpose. In this context, the aim of this study is to systematically map the research trends in wildfire management practices in Mediterranean-climate regions. A total of 201 wildfire management studies were analyzed and systematically mapped in terms of: year of publication; place of study; scientific outlet; research area (Web of Science) or research field (Scopus); wildfire phase; central research topic; main objective; research methods; and main conclusions or contributions. The results indicate an increasing number of studies on the topic (most from the last 10 years), but more than half are conducted in a few Mediterranean countries (60% of the analyzed studies were conducted in Spain, Portugal, Greece, Italy, or France), and more than 50% focus on pre-fire issues, such as prevention and fuel management. In contrast, only 12% of the studies focused on “Economic modeling” or “Human factors and issues,” which suggests that the triple bottom line of sustainability (social, environmental, and economic) is not being fully addressed by fire management research.
More than a quarter of the studies had objectives related to testing new approaches in fire or forest management, suggesting that new knowledge is being produced in the field. Nevertheless, most studies (about 84%) employed quantitative research methods, and only 3% used methods that tackled social issues or drew on expert and practitioner knowledge. Perhaps this lack of multidisciplinary studies is one of the factors hindering further progress in reducing wildfire occurrences and their impacts.

Keywords: wildfire, Mediterranean-climate regions, management, policy

Procedia PDF Downloads 111