Search results for: image encryption algorithms

185 Extra Skin Removal Surgery and Its Effects: A Comprehensive Review

Authors: Rebin Mzhda Mohammed, Hoshmand Ali Hama Agha

Abstract:

Excess skin, often consequential to substantial weight loss or the aging process, introduces physical discomfort, obstructs daily activities, and undermines an individual's self-esteem. As these challenges become increasingly prevalent, the need to explore viable solutions grows in significance. Extra skin removal surgery, commonly known as body contouring surgery, has emerged as a compelling intervention to ameliorate the physical and psychological burdens of excess skin. This study undertakes a comprehensive review to illuminate the intricacies of extra skin removal surgery, encompassing its diverse procedures, associated risks, benefits, and psychological implications for patients. The methodological approach adopted involves a systematic and exhaustive review of pertinent scholarly literature sourced from reputable databases, including PubMed, Google Scholar, and specialized cosmetic surgery journals. Articles are meticulously curated based on their relevance, credibility, and recency. Subsequently, data from these sources are synthesized and categorized, facilitating a comprehensive understanding of the subject matter. Qualitative analysis serves to unravel the nuanced psychological effects, while quantitative data, where available, are harnessed to underpin the study's conclusions. In terms of major findings, the research underscores the manifold advantages of extra skin removal surgery. Patients experience a notable improvement in physical comfort, amplified mobility, enhanced self-confidence, and a newfound ability to wear clothing comfortably. Nonetheless, the benefits are juxtaposed with potential risks, encompassing infection, scarring, hematoma, delayed healing, and the challenge of achieving symmetry. A salient discovery is the profound psychological impact of the surgery, as patients consistently report elevated body image satisfaction, heightened self-esteem, and a substantial enhancement in overall quality of life. In summary, this research accentuates the pivotal role of extra skin removal surgery in ameliorating the intricate interplay of physical and psychological difficulties posed by excess skin. By elucidating the diverse procedures, associated risks, and psychological outcomes, the study contributes to a comprehensive and informed understanding of the surgery's multifaceted effects. Individuals contemplating this transformative surgical option are thus equipped with comprehensive insights, ultimately fostering informed decision-making guided by the expertise of medical professionals.

Keywords: extra skin removal surgery, body contouring, abdominoplasty, brachioplasty, thigh lift, body lift, benefits, risks, psychological effects

Procedia PDF Downloads 66
184 Developing a Leukemia Diagnostic System Based on Hybrid Deep Learning Architectures in Actual Clinical Environments

Authors: Skyler Kim

Abstract:

An early diagnosis of leukemia has always been a challenge to doctors and hematologists. Worldwide, approximately 350,000 new cases were reported in 2012, and diagnosing leukemia remains time-consuming and inefficient because of an endemic shortage of flow cytometry equipment in current clinical practice. As the number of medical diagnosis tools increased and a large volume of high-quality data was produced, there was an urgent need for more advanced data analysis methods. One of these methods is the AI approach, which has become a major trend in recent years; several research groups have been working on developing such diagnostic models. However, designing and implementing a leukemia diagnostic system in real clinical environments based on a deep learning approach with larger datasets remains complex. Leukemia is a major hematological malignancy that results in mortality and morbidity across different ages. We selected acute lymphocytic leukemia to develop our diagnostic system since it is the most common type of leukemia, accounting for 74% of all children diagnosed with leukemia; the results of this development work can be applied to all other types of leukemia. To develop our model, a Kaggle dataset was used, consisting of 15135 total images: 8491 images of abnormal cells and 5398 images of normal cells. In this paper, we design and implement a leukemia diagnostic system in a real clinical environment based on deep learning approaches with larger datasets. The proposed diagnostic system detects and classifies leukemia. Differently from other AI approaches, we explore hybrid architectures to improve on current performance. First, we developed two independent convolutional neural network models: VGG19 and ResNet50. Then, using both VGG19 and ResNet50, we developed a hybrid deep learning architecture employing transfer learning techniques to extract features from each input image. In our approach, features fused from specific abstraction layers can be treated as auxiliary features and lead to further improvement of the classification accuracy. Features extracted from the lower levels are combined into higher-dimension feature maps to help improve the discriminative capability of intermediate features and also mitigate vanishing or exploding network gradients. By comparing VGG19, ResNet50, and the proposed hybrid model, we concluded that the hybrid model has a significant advantage in accuracy. The detailed results of each model's performance and their pros and cons will be presented at the conference.
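
To make the fusion architecture concrete, here is a minimal sketch of a hybrid VGG19/ResNet50 classifier in Keras. It illustrates the transfer-learning and feature-fusion idea rather than the authors' exact model: the input resolution, frozen backbones, pooling, and classification head are all assumptions.

```python
# A sketch of the hybrid VGG19 + ResNet50 feature-fusion idea described above,
# using Keras transfer learning. Layer choices, input size, and the head are
# illustrative assumptions, not the authors' exact architecture.
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG19, ResNet50

INPUT_SHAPE = (224, 224, 3)  # assumed input resolution

inputs = layers.Input(shape=INPUT_SHAPE)

# Two ImageNet-pretrained backbones act as independent feature extractors.
vgg = VGG19(include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
res = ResNet50(include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
vgg.trainable = False  # transfer learning: freeze both backbones
res.trainable = False

# Global-pool each backbone's final feature map and fuse by concatenation.
f_vgg = layers.GlobalAveragePooling2D()(vgg(inputs))
f_res = layers.GlobalAveragePooling2D()(res(inputs))
fused = layers.Concatenate()([f_vgg, f_res])

# Small classification head: normal vs. abnormal (ALL) cell image.
x = layers.Dense(256, activation="relu")(fused)
x = layers.Dropout(0.5)(x)
outputs = layers.Dense(1, activation="sigmoid")(x)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```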

Keywords: acute lymphoblastic leukemia, hybrid model, leukemia diagnostic system, machine learning

Procedia PDF Downloads 187
183 Innovative Food-Related Modification of the Day-Night Task Demonstrates Impaired Inhibitory Control among Patients with Binge-Purge Eating Disorder

Authors: Sigal Gat-Lazer, Ronny Geva, Dan Ramon, Eitan Gur, Daniel Stein

Abstract:

Introduction: Eating disorders (ED) are common psychopathologies that involve distorted body image and eating disturbances. Binge-purge eating disorders (B/P ED) are characterized by repetitive episodes of binge eating followed by purges. Patients with B/P ED may appear impulsive, especially in relation to food stimuli and affective conditions. The current study introduced an innovative modification of the day-night task designed to assess inhibitory control among patients with B/P ED. Methods: This prospective study included 50 patients with B/P ED during the acute phase of illness (T1), upon their admission to a specialized ED department in a tertiary center. Thirty-four patients repeated the study towards discharge to ambulatory care (T2). Treatment effect was evaluated by BMI and by the Beck Depression Inventory and State-Trait Anxiety Inventory questionnaires for depression and anxiety. The control group included 36 healthy controls with matched demographic parameters who performed both T1 and T2 assessments. The current modification is based on the emotional day-night task (EDNT), in which five emotional stimuli are added to the sun and moon pictures presented to participants. In the current study, we designed the food-emotional day-night task (F-EDNT) by adding food stimuli of an egg and a banana, which resemble the sun and moon, respectively, in five emotional states (angry, sad, happy, scrambled, and neutral). During this computerized task, participants were instructed to press a "day" button in response to moon and banana stimuli and a "night" button when a sun or egg was presented. Accuracy (A) and reaction time (RT) were evaluated and compared between the EDNT and F-EDNT as a reflection of participants' inhibitory control. Results: Patients with B/P ED had significantly improved BMI, depression, and anxiety scores at T2 compared to T1 (all p<0.001). Task performance was similar among patients and controls on the EDNT, without significant A or RT differences at both T1 and T2. On the F-EDNT at T1, B/P ED patients had significantly reduced accuracy in four of five emotional stimuli compared to controls: angry (73±25% vs. 84±15%), sad (69±25% vs. 80±18%), happy (73±24% vs. 82±18%), and scrambled (74±24% vs. 84±13%; all p<0.05). Additionally, patients' RT to food stimuli was significantly faster than to neutral ones in both the sad and neutral emotional conditions (356±146 vs. 400±141 and 378±124 vs. 412±116 msec, respectively, p<0.05). These significant differences between groups as a function of stimulus type were diminished at T2. Conclusion: Processing food-related content, particularly in an emotional context, appears to be impaired in patients with B/P ED during the acute phase of their illness and elicits greater impulsivity. This innovative modification seems to be sensitive to patients' illness phase and thus may be implemented during screening and follow-up throughout the clinical management of these patients.
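
As a concrete reading of the statistics above, the sketch below runs the implied paired (food vs. neutral RT) and between-group (patient vs. control accuracy) tests in Python; every data value is invented for illustration and is not the study's data.

```python
# A sketch of the accuracy / reaction-time contrasts reported above, assuming
# per-participant means are already extracted. All values are made up.
import numpy as np
from scipy import stats

# Hypothetical per-participant mean RT (msec) to food vs. neutral stimuli.
rt_food = np.array([356.0, 340.0, 371.0, 362.0, 349.0])
rt_neutral = np.array([400.0, 388.0, 412.0, 405.0, 396.0])

# Paired comparison within participants (food vs. neutral RT).
t_stat, p_value = stats.ttest_rel(rt_food, rt_neutral)
print(f"food vs. neutral RT: t = {t_stat:.2f}, p = {p_value:.4f}")

# Between-group accuracy contrast (patients vs. controls) for one emotion.
acc_patients = np.array([0.73, 0.70, 0.65, 0.81, 0.75])
acc_controls = np.array([0.84, 0.88, 0.80, 0.86, 0.83])
t_stat, p_value = stats.ttest_ind(acc_patients, acc_controls)
print(f"patients vs. controls accuracy: t = {t_stat:.2f}, p = {p_value:.4f}")
```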

Keywords: binge purge eating disorders, day night task modification, eating disorders, food related stimulations

Procedia PDF Downloads 381
182 Characterization of Thin Woven Composites Used in Printed Circuit Boards by Combining Numerical and Experimental Approaches

Authors: Gautier Girard, Marion Martiny, Sebastien Mercier, Mohamad Jrad, Mohamed-Slim Bahi, Laurent Bodin, Francois Lechleiter, David Nevo, Sophie Dareys

Abstract:

Reliability of electronic devices has always been of the highest interest for Aero-MIL and space applications. In any electronic device, the Printed Circuit Board (PCB), providing interconnection between components, is key to reliability. During the last decades, PCB technologies evolved to sustain and/or fulfill increased original equipment manufacturers' (OEM) requirements and specifications: higher densities and better performances, faster time to market and longer lifetime, newer materials and mixed buildups. From the very beginning of the PCB industry up to recently, qualification, experiments, and trial and error were the most popular methods to assess system (PCB) reliability. Nowadays, OEMs, PCB manufacturers, and scientists are working together in a close relationship in order to develop predictive models for PCB reliability and lifetime. To achieve that goal, it is fundamental to characterize the base materials (laminates, electrolytic copper, …) precisely, in order to understand failure mechanisms and to simulate PCB aging under environmental constraints, by means of the finite element method for example. The laminates are woven composites and thus have orthotropic behaviour. The in-plane properties can be measured by combining classical uniaxial testing and digital image correlation. Nevertheless, the out-of-plane properties cannot be evaluated due to the thickness of the laminate (a few hundred microns). It has to be noted that knowledge of the out-of-plane properties is fundamental to investigate the lifetime of high-density printed circuit boards. A homogenization method combining analytical and numerical approaches has been developed in order to obtain the complete elastic orthotropic behaviour of a woven composite from its precise 3D internal structure and its experimentally measured in-plane elastic properties. Since the mechanical properties of the resin surrounding the fibres are unknown, an inverse method is proposed to estimate them. The methodology has been applied to one laminate used in high-frequency space applications in order to obtain its elastic orthotropic behaviour at different temperatures in the range [-55°C; +125°C]. Next, numerical simulations of a plated through hole in a double-sided PCB are performed. Results show the major influence of the out-of-plane properties, and of their temperature dependency, on the lifetime of a printed circuit board. Acknowledgements: The support of the French ANR agency through the Labcom program ANR-14-LAB7-0003-01, and the support of CNES, Thales Alenia Space, and Cimulec, are acknowledged.
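
To make the inverse step concrete, the sketch below estimates an unknown resin modulus by minimizing the mismatch between a homogenized in-plane modulus and its measured value. It is only a sketch under stated assumptions: a Voigt rule of mixtures stands in for the paper's analytical/numerical homogenization, and every material value is invented.

```python
# Inverse identification sketch: find the resin modulus that makes a (crude,
# placeholder) homogenization match the measured in-plane laminate modulus.
from scipy.optimize import minimize_scalar

E_FIBER = 73.0e9     # glass-fiber Young's modulus [Pa] (assumed)
V_FIBER = 0.45       # fiber volume fraction (assumed)
E_MEASURED = 35.0e9  # measured in-plane laminate modulus [Pa] (assumed)

def homogenized_inplane_modulus(e_resin: float) -> float:
    """Placeholder homogenization: Voigt rule of mixtures, in-plane."""
    return V_FIBER * E_FIBER + (1.0 - V_FIBER) * e_resin

def mismatch(e_resin: float) -> float:
    """Squared relative error between predicted and measured modulus."""
    return ((homogenized_inplane_modulus(e_resin) - E_MEASURED) / E_MEASURED) ** 2

result = minimize_scalar(mismatch, bounds=(0.5e9, 10.0e9), method="bounded")
print(f"estimated resin modulus: {result.x / 1e9:.2f} GPa")
```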

Keywords: homogenization, orthotropic behaviour, printed circuit board, woven composites

Procedia PDF Downloads 204
181 CSR Communication Strategies: Stakeholder and Institutional Theories Perspective

Authors: Stephanie Gracelyn Rahaman, Chew Yin Teng, Manjit Singh Sandhu

Abstract:

Corporate scandals have made stakeholders apprehensive of large companies and led them to expect greater transparency in CSR matters. However, companies find it challenging to communicate CSR strategically to the intended stakeholders and, in the process, may fall short of maximizing their CSR efforts. Given that stakeholders have the ability to reward good companies, or to take legal action against or boycott corporate brands that do not act in a socially responsible manner, companies must create a shared understanding of their CSR activities. As a result, communication has become a strategy for many companies to demonstrate CSR engagement and to minimize stakeholder skepticism. The main objective of this research is to examine the types of CSR communication strategies and the predictors that guide them. Employing Morsing & Schultz's guide on CSR communication strategies, the study integrates stakeholder and institutional theory to develop a conceptual framework. The conceptual framework hypothesized that stakeholder (instrumental and normative) and institutional (regulatory environment, nature of business, mimetic intention, CSR focus, and corporate objectives) dimensions would drive CSR communication strategies. Preliminary findings from semi-structured interviews in Malaysia are consistent with the conceptual model in that stakeholder and institutional expectations guide CSR communication strategies. Findings show that most companies use two-way communication strategies. Companies that identified employees, the public, or customers as key stakeholders have started to embrace social media to be in sync with new trends of communication, especially for Gen Y, which is their priority. Some companies creatively use multiple communication channels because they recognize that different stakeholders favor different channels. It therefore appears that companies use two-way communication strategies to complement the perceived limitations of one-way strategies, as some companies prefer a more interactive platform to strategically engage stakeholders in CSR communication. In addition to stakeholders, institutional expectations also play a vital role in influencing CSR communication. Due to industry peer pressure and corporate objectives (attracting international investors and customers), companies may be more driven to excel in social performance. For these reasons, companies tend to go beyond the basic mandatory requirements, excel in CSR activities, and seek to be known as companies that champion CSR. In conclusion, companies use more two-way than one-way communication, and they use a combination of the two to target different stakeholders, as a result of stakeholder and institutional dimensions. Finally, in order to find out whether the conceptual framework actually fits the Malaysian context, companies' responses regarding the organizational outcomes expected from communicating CSR were gathered from the interview transcripts. Findings are then presented to show some of the key organizational outcomes (visibility and brand recognition, a responsible image, attracting prospective employees, positive word-of-mouth, etc.) that companies in Malaysia expect from CSR communication. Based on these findings, the conceptual framework has been refined to include the newly identified organizational outcomes.

Keywords: CSR communication, CSR communication strategies, stakeholder theory, institutional theory, conceptual framework, Malaysia

Procedia PDF Downloads 289
180 Verification of Low-Dose Diagnostic X-Ray as a Tool for Relating Vital Internal Organ Structures to External Body Armour Coverage

Authors: Natalie A. Sterk, Bernard van Vuuren, Petrie Marais, Bongani Mthombeni

Abstract:

Injuries to the internal structures of the thorax and abdomen remain a leading cause of death among soldiers. Body armour is a standard-issue piece of military equipment designed to protect the vital organs against ballistic and stab threats. When configured for maximum protection, the excessive weight and size of the armour may limit soldier mobility and increase physical fatigue and discomfort. Providing soldiers with more armour than necessary may, therefore, hinder their ability to react rapidly in life-threatening situations. The capability to determine the optimal trade-off between the amount of essential anatomical coverage and hindrance to soldier performance may significantly enhance the design of armour systems. The current study aimed to develop and pilot a methodology for relating internal anatomical structures to actual armour plate coverage in real time using low-dose diagnostic X-ray scanning. Several pilot scanning sessions were held at the Lodox Systems (Pty) Ltd head office in South Africa. Testing involved using the Lodox eXero-dr to scan dummy trunk rigs at various degrees and heights of measurement, as well as human participants wearing correctly fitted body armour while positioned in supine, prone shooting, seated, and kneeling shooting postures. The sizing and metrics obtained from the Lodox eXero-dr were then confirmed against a verification board with known dimensions. Results indicated that the low-dose diagnostic X-ray can clearly identify the vital internal structures of the aortic arch, heart, and lungs in relation to the position of the external armour plates. Further testing is still required to fully and accurately identify the inferior liver boundary, inferior vena cava, and spleen. The scans produced in the supine, prone, and seated postures provided superior image quality over the kneeling posture. The distances of the X-ray source and detector from the object must be standardised to control for possible magnification changes and for comparison purposes; to account for this, specific scanning heights and angles were identified to allow for parallel scanning of relevant areas. The low-dose diagnostic X-ray provides a non-invasive, safe, and rapid technique for relating vital internal structures to external structures. This capability can be used for the re-evaluation of the anatomical coverage required for essential protection while optimising armour design and fit for soldier performance.

Keywords: body armour, low-dose diagnostic X-ray, scanning, vital organ coverage

Procedia PDF Downloads 123
179 Understanding Project Failures in Construction: The Critical Impact of Financial Capacity

Authors: Nnadi Ezekiel Oluwaseun Ejiofor

Abstract:

This research investigates the effects of poor cost estimation, material cost variations, and payment punctuality on the financial health and execution of construction projects in Nigeria. To achieve the objectives of the study, a quantitative research approach was employed, and data were gathered through an online survey of 74 construction industry professionals comprising quantity surveyors, contractors, and other professionals. The survey gathered input on cost estimation errors, price fluctuations, and payment delays, among other factors. Responses were analyzed using a five-point Likert scale and the Relative Importance Index (RII). The findings demonstrated that errors in cost estimation in the Bill of Quantities (BOQ) have a strongly negative impact on the reputation and image of project participants. The greatest effect was on contractors' likelihood of obtaining future work (mean value = 3.42), followed by quantity surveyors' likelihood of obtaining new commissions (mean value = 3.40). Cost underestimation also exposes parties to serious risks, most notably in terms of ease of construction and the effects of fund shortages that can end in bankruptcy (mean value = 3.78), and causes considerable financial damage, with contractors suffering the worst loss of profit (mean value = 3.88). Every expense carries its own risk and uncertainty, and pressure on the cost of materials and every other expense attributed to the building and completion of a structure adds risk to a project's performance figures. The greatest weight (mean importance score = 4.92) was attributed to market inflation in building materials, while the second greatest (mean importance score = 4.76) was due to increased transportation charges. In addition, payment delays arising from client-side issues such as poor availability of funds (RII = 0.71) and from contractual issues such as disagreements on the valuation of work done (RII = 0.72) were found to lead to project delays and additional costs. The results affirm the importance of proper cost estimation for the health of organizational finances, project risk, and completion within set time limits. As recommendations, it is proposed to improve costing methods, foster better communication with stakeholders, and manage delays through contractual and financial controls. This study enhances the existing literature on construction project management by suggesting ways to deal with cost inaccuracies, material price fluctuations, and payment delays which, if addressed, would greatly improve the economic performance of the construction business.
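
Because the RII drives the rankings above, a minimal sketch of the standard computation follows: RII = ΣW / (A × N), where W are the individual Likert ratings, A is the highest possible rating, and N is the number of respondents. The response values are invented for illustration.

```python
# Relative Importance Index (RII) under the common definition
# RII = sum(W) / (A * N); ratings below are hypothetical.
def relative_importance_index(ratings: list[int], max_rating: int = 5) -> float:
    """RII in [0, 1]; higher values indicate a more important factor."""
    return sum(ratings) / (max_rating * len(ratings))

# Hypothetical five-point Likert responses for one delay factor.
poor_fund_availability = [4, 3, 4, 5, 3, 4, 2, 4]
print(f"RII = {relative_importance_index(poor_fund_availability):.2f}")
```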

Keywords: cost estimation, construction project management, material price fluctuations, payment delays, financial impact

Procedia PDF Downloads 9
178 Articulating the Colonial Relation: A Conversation between Afropessimism and Anti-Colonialism

Authors: Thomas Compton

Abstract:

As decolonialism becomes an important topic in political theory, the rupture between the colonized and the colonist relation has lost attention. Focusing on the anti-colonial activist Mahdi Amel, we consider his attention to the permanence of the colonial relation and how it preempts Frank Wilderson's formulation of (white) culturally necessary anti-Black violence. Both projects draw attention away from empirical accounts of oppression, focusing instead on the structural relation which precipitates them. As Amel says that we should stop thinking of the 'underdeveloped' as beyond the colonial relation, Wilderson says we should stop thinking of Black rights as having surpassed the role of the slave. However, Amel moves beyond his idol Althusser's structuralism toward a formulation of the colonial relation as a source of domination. Our analysis takes a Lacanian turn in considering how this non-relation was formulated as a relation, and how this space of negativity became an ideological opportunity for colonial domination. Wilderson's work problematises this as we conclude with his criticism of structural accounts for their failure to consider how Black social death exists as more than necessity: as a site of white desire. Amel, a Lebanese activist and scholar (re)discovered by Hicham Safieddine, argues that colonialism is more than the theft of land; it is instead a privatization of collective property and a form of investment which (re)produces the status of the capitalist in spaces 'outside' the market. Although Amel was a true Marxist-Leninist, who expounded the economic determinacy of the Colonial Mode of Production, we read this account through Alenka Zupančič's reformulation of the 'invisible hand job of the market'. Amel points to the signifier 'underdeveloped' as buttressed on a pre-colonial epistemic break, as the Western investor (debt collector) sees in the (post?)colony its narcissistic image. However, the colony can never become a site of class conflict, as the workers are not unified but exist between two countries. In industry, they are paid in colonial subjectivisation, the promise of market (self-)pleasure; at home, they are refugees. They are not, as Wilderson states, in the permanent social death of the slave, but they are less than the white worker. This is formulated as citizen (white), non-citizen (colonized), anti-citizen (Black/slave). Here we may also think of how indentured Indians were used as instruments of colonial violence. Wilderson's aphorism "there is no analogy to anti-Black violence" lays bare his fundamental opposition between colonial and specifically anti-Black violence. It is not only that the debt collector, landowner, or other owner of production pleasures himself as if his hand were invisible. The absolute negativity between colony and colonized provides a new frontier for desire: the development of a colonial mode of production, an invention inside the colonial structure that is generative of class substitution. We shall explore how Amel ignores the role of the slave and how Wilderson forecloses the history of African anti-colonial struggle.

Keywords: afropessimism, fanon, marxism, postcolonialism

Procedia PDF Downloads 154
177 Epidemiological Patterns of Pediatric Fever of Unknown Origin

Authors: Arup Dutta, Badrul Alam, Sayed M. Wazed, Taslima Newaz, Srobonti Dutta

Abstract:

Background: In today's world, with modern science and contemporary technology, many diseases can be quickly identified and ruled out, but children's fever of unknown origin (FUO) still presents diagnostic difficulties in clinical settings. Any fever that reaches 38 °C and lasts for more than seven days without a known cause is now classified as a fever of unknown origin (FUO). Despite tremendous progress in the medical sector, FUO persists as a major health issue and a major contributor to morbidity and mortality, particularly in children, and its spectrum is sometimes unpredictable. The etiology is influenced by geographic location, age, socioeconomic level, frequency of antibiotic resistance, and genetic vulnerability. Since there are currently no established diagnostic algorithms, doctors must evaluate each patient individually with extreme caution. A persistent fever poses difficulties for both the patient and the doctor. This prospective observational study was carried out in a Bangladeshi tertiary care hospital from June 2018 to May 2019 with the goal of identifying the epidemiological patterns of fever of unknown origin in pediatric patients. Methods: It was a hospital-based prospective observational study carried out on 106 children (between 2 months and 12 years) with a prolonged fever of >38.0 °C lasting for more than 7 days without a clear source. Children with additional chronic diseases or known immunodeficiency problems were excluded. Clinical practices that helped determine the definitive etiology were assessed. Initial testing included a complete blood count, a routine urine examination, PBF, a chest X-ray, CRP measurement, blood cultures, serology, and additional pertinent investigations. The analysis focused mostly on the etiological results. All study data were analyzed with the standard software package SPSS 21. Findings: A total of 106 patients identified as having FUO were assessed; over half (57.5%) were female, and the largest group (40.6%) fell within the 1 to 3-year age range. The study categorized the etiological outcomes into five groups: infections, malignancies, connective tissue conditions, miscellaneous, and undiagnosed. Infections were found to be the main cause, at 44.3% of cases, followed by undiagnosed cases at 31.1%, malignancies at 10.4%, miscellaneous causes at 8.5%, and connective tissue disorders at 4.7%. Hepato-splenomegaly was seen in patients with enteric fever, malaria, acute lymphoid leukemia, lymphoma, and hepatic abscesses, either by itself or in combination with other conditions; about 53% of undiagnosed patients also had hepato-splenomegaly. Conclusion: Infections are the primary cause of pyrexia of unknown origin in children, with undiagnosed cases being the second most common cause. An incremental approach is beneficial in the diagnostic process: non-invasive examinations are used to diagnose infections and connective tissue disorders, while invasive investigations are used to diagnose cancer and other ailments. According to this study, the prevalence of undiagnosed disease remains remarkable, so thorough history-taking and physical examination are necessary in order to reach a precise diagnosis.

Keywords: children, diagnostic challenges, fever of unknown origin, pediatric fever, undiagnosed diseases

Procedia PDF Downloads 28
176 Evaluation of Differential Interaction between Flavanols and Saliva Proteins by Diffusion and Precipitation Assays on Cellulose Membranes

Authors: E. Obreque-Slier, V. Contreras-Cortez, R. López-Solís

Abstract:

Astringency is a drying, roughing, and sometimes puckering sensation that is experienced on the various oral surfaces during or immediately after tasting foods. This sensation has been closely related to the interaction and precipitation between salivary proteins and polyphenols, specifically flavanols or proanthocyanidins. In addition, the type and concentration of proanthocyanidin significantly influence the intensity of the astringency and consequently the protein/proanthocyanidin interaction. However, most studies are based on the interaction between saliva and highly complex polyphenols, without considering the effect of the monomeric proanthocyanidins present in different foods. The aim of this study was to evaluate the effect of different monomeric proanthocyanidins on the diffusion and precipitation of salivary proteins. Thus, solutions of catechin, epicatechin, epigallocatechin, and gallocatechin (0, 2.0, 4.0, 6.0, 8.0, and 10 mg/mL) were mixed with human saliva (1:1 v/v). After incubation for 5 min at room temperature, 15 µL aliquots of each mix were dotted on a cellulose membrane and allowed to dry spontaneously at room temperature. The membrane was fixed, rinsed, and stained for proteins with Coomassie blue. After exhaustive washing in 7% acetic acid, the membrane was rinsed once in distilled water and dried under a heat lamp. Both the diffusion area and the stain intensity of the protein spots served as semi-qualitative estimates of protein-tannin interaction (diffusion test). The remainder of each whole saliva-phenol solution mixture from the diffusion assay was centrifuged, and 15-μL aliquots from each of the supernatants were dotted on a cellulose membrane. The membrane was processed for protein staining as indicated above. The blue-stained area of protein distribution corresponding to each of the extract dilution-saliva mixtures was quantified with ImageJ 1.45 software. Each of the assays was performed at least three times. Salivary proteins initially display a biphasic distribution on cellulose membranes: when aliquots of saliva are placed on absorbing cellulose membranes and free diffusion of saliva is allowed to occur, a non-diffusible protein fraction becomes surrounded by highly diffusible salivary proteins. In effect, once diffusion has ended, a protein-binding dye shows an intensely blue-stained, roughly circular area close to the spotting site (the non-diffusible fraction, NDF), surrounded by a weaker blue-stained outer band (the diffusible fraction, DF). The diffusion test showed that epicatechin caused the complete disappearance of the DF from saliva at 2 mg/mL; epigallocatechin and gallocatechin caused a similar effect at 4 mg/mL, while catechin generated the same effect at 8 mg/mL. In the precipitation test, epicatechin and gallocatechin generated evident precipitates at the bottom of the Eppendorf tubes. In summary, the flavanol type differentially affects the diffusion and precipitation of salivary proteins, which would affect the sensation of astringency perceived by consumers.
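
As an illustration of how a stained diffusion area can be quantified, the sketch below thresholds a scanned membrane image with scikit-image as a stand-in for the ImageJ workflow described above; the file name and the pixel calibration are hypothetical.

```python
# Sketch: measure the Coomassie-stained area on a scanned membrane image by
# Otsu thresholding. File name and calibration are illustrative assumptions.
from skimage import io, color, filters

image = io.imread("membrane_spot.png")  # hypothetical scanned membrane image
gray = color.rgb2gray(image)            # Coomassie-stained protein is dark

# Otsu's threshold separates stained pixels from the membrane background.
threshold = filters.threshold_otsu(gray)
stained = gray < threshold              # darker pixels = stained protein

PIXELS_PER_MM2 = 400.0                  # assumed scanner calibration
area_mm2 = stained.sum() / PIXELS_PER_MM2
print(f"stained area: {area_mm2:.1f} mm^2 ({stained.mean():.1%} of image)")
```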

Keywords: astringency, polyphenols, tannins, tannin-protein interaction

Procedia PDF Downloads 201
175 The Effect of Regulation and Investment in Sustainable Practices on Environmental Performance and Consumer Trust: A Time Series Analysis of the Dominant Companies within the Energy Sector

Authors: Sempiga Olivier, Dominika Latusek-Jurczak

Abstract:

Climate change has been attributed largely to high consumption of fossil fuels, leading to severe environmental problems. The energy sector has been among the most polluting sectors for many decades. Consequently, there is a lack of trust in several energy firms, especially those in fossil fuels and nuclear energy. A robust regulatory framework is needed, and more investment in renewable energy sources is paramount for a better environmental outcome. Given the significant environmental impact of energy production and consumption, sustainable marketing practices have become increasingly important in the energy sector. Although the sector has had the lion's share in polluting the environment, much effort has been made recently to move away from fossil fuels and privilege renewable energy sources. How this shift would help rebuild trust in the energy industry is unclear. For the shift to have lasting effects, it may be essential that regulatory agencies examine how energy firms engage in sustainable investment. There is little empirical evidence on whether regulating marketing practices and investment initiatives can help organizations reduce their environmental impact and promote sustainable development, and little is known about how and whether the environmental values of firms go beyond rhetoric, greenwashing, and publicity to translate into economic gains and environmental performance. The study investigates how regulatory agencies can help energy firms invest sustainably and take sustainable initiatives even amid the energy crisis caused by the Russia-Ukraine conflict, and how these sustainable practices relate to renewed consumer trust. Using data from Corporate Knights, the study applies time series analysis to the relationship between sustainable regulation and the sustainable practices of energy firms from around the world, and to their relation to consumer trust and environmental performance over the past 20 years. It examines how firms' sustainable investment and energy and carbon productivity relate to environmental sustainability and consumer trust. This longitudinal study provides empirical evidence of the interplay between regulation, trust, and environmental performance. The research is grounded in institutional trust theory, which emphasizes the role of regulatory frameworks and organizational practices in shaping public perceptions of fairness, transparency, and legitimacy. Results show that organizations in the energy sector, supported by robust regulatory tools, can overcome the negative image of polluters and compete with other companies in the fight against climate change and global warming. To do so, however, energy firms should consider investing more in renewable energy sources and implementing sustainable strategies and practices that go beyond greenwashing to improve their environmental performance, thereby rebuilding consumer trust in the energy sector. The results allow regulatory regimes and organizations to learn why it is crucial for energy firms to invest in renewable energy sources and engage in various sustainable initiatives and practices to contribute to better environmental outcomes and higher levels of trust.

Keywords: consumer trust, energy, environmental performance, regulation, renewable energy sources, sustainable practices

Procedia PDF Downloads 9
174 Efficacy of Deep Learning for Below-Canopy Reconstruction of Satellite and Aerial Sensing Point Clouds through Fractal Tree Symmetry

Authors: Dhanuj M. Gandikota

Abstract:

Sensor-derived three-dimensional (3D) point clouds of trees are invaluable in remote sensing analysis for the accurate measurement of key structural metrics, bio-inventory values, spatial planning and visualization, and ecological modeling. Machine learning (ML) holds potential for addressing the restrictive tradeoffs in cost, spatial coverage, resolution, and information gain that exist among current point cloud sensing methods. Terrestrial laser scanning (TLS) remains the highest-fidelity source of both canopy and below-canopy structural features, but its usage is limited in both coverage and cost, requiring manual deployment to map out large forested areas. While aerial laser scanning (ALS) remains a reliable avenue of active LIDAR remote sensing, ALS is also cost-restrictive in its deployment methods. Space-borne photogrammetry from high-resolution satellite constellations is an avenue of passive remote sensing with promising viability for the accurate construction of vegetation 3-D point clouds. It provides both the lowest comparative cost and the largest spatial coverage across remote sensing methods. However, both space-borne photogrammetry and ALS demonstrate technical limitations in the capture of valuable below-canopy point cloud data. Looking to minimize these tradeoffs, we explored a class of powerful ML algorithms called Deep Learning (DL) that shows promise in recent research on 3-D point cloud reconstruction and interpolation. Our research details the efficacy of applying these DL techniques to reconstruct accurate below-canopy point clouds from space-borne and aerial remote sensing through learned patterns of tree species' fractal symmetry properties and the supplementation of locally sourced bio-inventory metrics. From our dataset, consisting of tree point clouds obtained from TLS, we deconstructed the point cloud of each tree into those that would be obtained through ALS and satellite photogrammetry of varying resolutions. We fed this ALS/satellite point cloud dataset, along with the simulated local bio-inventory metrics, into the DL point cloud reconstruction architectures to generate the full 3-D tree point clouds (ground truth is given by the full TLS tree point clouds containing the below-canopy information). Point cloud reconstruction accuracy was validated both through the error relative to the original TLS point clouds and through the error in extracting key structural metrics, such as crown base height, diameter above root crown, and leaf/wood volume. The results additionally demonstrate the supplemental performance gain of using minimal locally sourced bio-inventory metric information as an input to ML systems in reaching specified accuracy thresholds of tree point cloud reconstruction. This research provides insight into methods for the rapid, cost-effective, and accurate construction of below-canopy tree 3-D point clouds, as well as the supported potential of ML and DL to learn complex, unmodeled patterns of fractal tree growth symmetry.
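
One common way to score such reconstructions against the TLS ground truth is the symmetric Chamfer distance between point clouds; the sketch below illustrates that metric as an assumed example, not necessarily the exact validation procedure the authors used.

```python
# Symmetric Chamfer distance between two point clouds (N x 3 arrays), a
# standard reconstruction-error metric; the clouds here are synthetic.
import numpy as np
from scipy.spatial import cKDTree

def chamfer_distance(cloud_a: np.ndarray, cloud_b: np.ndarray) -> float:
    """Mean nearest-neighbor distance, averaged over both directions."""
    a_to_b = cKDTree(cloud_b).query(cloud_a)[0]  # nearest b for each a point
    b_to_a = cKDTree(cloud_a).query(cloud_b)[0]  # nearest a for each b point
    return float(a_to_b.mean() + b_to_a.mean()) / 2.0

# Hypothetical clouds: TLS ground truth vs. a DL-reconstructed tree.
rng = np.random.default_rng(0)
tls_cloud = rng.uniform(0.0, 10.0, size=(5000, 3))
reconstructed = tls_cloud + rng.normal(0.0, 0.05, size=tls_cloud.shape)
print(f"Chamfer distance: {chamfer_distance(tls_cloud, reconstructed):.4f} m")
```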

Keywords: deep learning, machine learning, satellite, photogrammetry, aerial laser scanning, terrestrial laser scanning, point cloud, fractal symmetry

Procedia PDF Downloads 103
173 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers

Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala

Abstract:

The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into users' preferences. Instead of presenting plain information, classifying different aspects of browsing such as Bookmarks, History, and the Download Manager into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources, imposes security constraints, and may miss contextual data during classification. On-device classification solves many of these problems, but the challenge is to achieve classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and improving privacy and security. This approach provides more relevant results than current standalone solutions because it uses content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine. This DOM data is dynamic, contextual, and secure, and cannot be replicated. The proposal extracts different features of the webpage, which are run through an algorithm to classify the page into multiple categories. A Naive Bayes based engine was chosen for its inherent advantages on limited resources compared to other classification algorithms such as Support Vector Machines and Neural Networks. Naive Bayes classification requires a small memory footprint and little computation, which suits the smartphone environment. The solution can also partition the model into multiple chunks, which facilitates lower memory usage than loading the complete model. Classification of webpages through the integrated engine is faster, more relevant, and more energy-efficient than standalone on-device solutions. The classification engine has been tested on Samsung Z3 Tizen hardware, integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. The cleaned dataset has 227.5K webpages, divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution has resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with a standalone solution. The solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. The engine can be further extended to suggest dynamic tags and to apply the classification to other use cases that enhance the browsing experience.
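
The sketch below shows the core multinomial Naive Bayes text-classification pattern in scikit-learn; the miniature training set, the subset of the eight categories, and the use of raw text instead of DOM-derived features are simplifying assumptions for illustration.

```python
# Multinomial Naive Bayes webpage classification, sketched with scikit-learn.
# The tiny corpus and four categories are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_texts = [
    "course lecture exam university homework",
    "score team goal league match player",
    "flight hotel itinerary beach destination",
    "cart checkout discount price order",
]
train_labels = ["education", "sports", "travel", "shopping"]

# Bag-of-words counts feed a multinomial NB model: a small memory footprint
# and cheap computation, which is the rationale given for on-device use.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_texts, train_labels)

print(model.predict(["book a cheap flight and hotel for the holidays"])[0])
```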

Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification

Procedia PDF Downloads 163
172 Time-Domain Nuclear Magnetic Resonance as a Potential Analytical Tool to Assess Thermisation in Ewe's Milk

Authors: Alessandra Pardu, Elena Curti, Marco Caredda, Alessio Dedola, Margherita Addis, Massimo Pes, Antonio Pirisi, Tonina Roggio, Sergio Uzzau, Roberto Anedda

Abstract:

Some of the artisanal cheese products of European countries certified as PDO (Protected Designation of Origin) are made from raw milk. To recognise potential frauds (e.g. pasteurisation or thermisation of milk intended for raw-milk cheese production), the alkaline phosphatase (ALP) assay is currently applied only for pasteurisation, although it is known to have notable limitations for the validation of the ALP enzymatic state in non-bovine milk. Such frauds considerably impact customers and certifying institutions, sometimes resulting in damage to the product image and potential economic losses for cheesemaking producers. Robust, validated, and univocal analytical methods are therefore needed to allow food control and security organisations to recognise a potential fraud. In an attempt to develop a new reliable method to overcome this issue, Time-Domain Nuclear Magnetic Resonance (TD-NMR) spectroscopy has been applied in the work described here. Daily fresh milk was analysed raw (680.00 µL in each 10-mm NMR glass tube) at least in triplicate. Thermally treated samples were also produced by putting each NMR tube of fresh raw milk in water pre-heated at temperatures from 68°C up to 72°C for up to 3 min, with continuous agitation, and then quench-cooling it to 25°C in a water and ice bath. Raw and thermally treated samples were analysed in terms of 1H T2 transverse relaxation times with a CPMG sequence (recycle delay: 6 s, interpulse spacing: 0.05 ms, 8000 data points), and quasi-continuous distributions of T2 relaxation times were obtained by CONTIN analysis. In line with previous data collected by high-field NMR techniques, a decrease in the spin-spin relaxation constant T2 of the predominant 1H population was detected in heat-treated milk as compared to raw milk. The decrease of the T2 parameter is consistent with changes in chemical exchange and diffusive phenomena, likely associated with changes in the arrangement of milk proteins (i.e. whey proteins and casein) promoted by heat treatment. Furthermore, the experimental data suggest that the molecular alterations are strictly dependent on the specific heat treatment conditions (temperature/time). Such molecular variations in milk, which are likely transferred to cheese during cheesemaking, highlight the possibility of extending the TD-NMR technique directly to cheese, to develop a method for assessing fraud related to the thermal treatment of milk in PDO raw-milk cheese. The results suggest that TD-NMR assays might pave a new way to the detailed characterisation of heat treatments of milk.
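
To make the T2 analysis concrete, the sketch below fits a mono-exponential decay to a synthetic CPMG echo train built with the acquisition parameters quoted above (0.05 ms interpulse spacing, 8000 points); this simple fit is a stand-in for the quasi-continuous CONTIN inversion the authors actually used.

```python
# Mono-exponential T2 fit to a synthetic CPMG echo decay; a simplified
# stand-in for CONTIN-style quasi-continuous T2 distributions.
import numpy as np
from scipy.optimize import curve_fit

def cpmg_decay(t, amplitude, t2):
    """Mono-exponential transverse relaxation: S(t) = A * exp(-t / T2)."""
    return amplitude * np.exp(-t / t2)

# Synthetic echo train: 8000 points at 0.05 ms spacing, true T2 = 40 ms.
echo_times = np.arange(1, 8001) * 0.05e-3  # seconds
signal = cpmg_decay(echo_times, 100.0, 40e-3)
signal += np.random.default_rng(1).normal(0.0, 0.5, signal.size)

params, _ = curve_fit(cpmg_decay, echo_times, signal, p0=(90.0, 30e-3))
print(f"fitted T2 = {params[1] * 1e3:.1f} ms")
```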

Keywords: cheese fraud, milk, pasteurisation, TD-NMR

Procedia PDF Downloads 243
171 Stable Diffusion: A Context-to-Motion Model for Augmenting Dexterity of Prosthetic Limbs

Authors: André Augusto Ceballos Melo

Abstract:

This work concerns design that facilitates the recognition of congruent prosthetic movements: context-to-motion translations guided by images, verbal prompts, and the user's nonverbal communication, such as facial expressions, gestures, paralinguistics, scene context, and object recognition. Although developed for dexterity, the approach can also be applied to other tasks, such as walking, treating prosthetic limbs as assistive technology driven by gestures, sound codes, signs, facial and body expressions, and scene context. The context-to-motion model is a machine learning approach designed to improve the control and dexterity of prosthetic limbs. It works by using sensory input from the prosthetic limb to learn about the dynamics of the environment and then using this information to generate smooth, stable movements. This can improve the performance of the prosthetic limb and make it easier for the user to perform a wide range of tasks. There are several key benefits to using the context-to-motion model for prosthetic limb control. First, it can improve the naturalness and smoothness of prosthetic limb movements, which can make them more comfortable and easier to use. Second, it can improve the accuracy and precision of prosthetic limb movements, which is particularly useful for tasks that require fine motor control. Finally, the context-to-motion model can be trained using a variety of different sensory inputs, which makes it adaptable to a wide range of prosthetic limb designs and environments. Stable diffusion is a machine learning method that can be used to improve the control and stability of movements in robotic and prosthetic systems. It works by using sensory feedback to learn about the dynamics of the environment and then using this information to generate smooth, stable movements. One key aspect of stable diffusion is that it is designed to be robust to noise and uncertainty in the sensory feedback. This means that it can continue to produce stable, smooth movements even when the sensory data is noisy or unreliable. To implement stable diffusion in a robotic or prosthetic system, it is typically necessary to first collect a dataset of examples of the desired movements. This dataset can then be used to train a machine learning model to predict the appropriate control inputs for a given set of sensory observations. Once the model has been trained, it can be used to control the robotic or prosthetic system in real time: the model receives sensory input from the system and uses it to generate control signals that drive the motors or actuators responsible for moving the system. Overall, the context-to-motion model has the potential to significantly improve the dexterity and performance of prosthetic limbs, making them more useful and effective for a wide range of users. Hand gestures and body language influence communication and social interaction, offering users the possibility of maximizing their quality of life, social interaction, and gesture communication.
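
The train-then-deploy loop described above maps onto a small supervised sketch: learn a policy from recorded (observation, control) pairs, then run it on live sensory input. The PyTorch regressor, the dimensions, and the random data below are illustrative assumptions, not the abstract's (unspecified) architecture.

```python
# Sketch of "collect desired movements, train a model to predict control
# inputs from sensory observations, run it in real time". Data are random.
import torch
from torch import nn

OBS_DIM, CTRL_DIM = 32, 8  # assumed sensor and actuator dimensions

# Hypothetical recorded dataset: sensory observations -> desired controls.
observations = torch.randn(1024, OBS_DIM)
controls = torch.randn(1024, CTRL_DIM)

policy = nn.Sequential(
    nn.Linear(OBS_DIM, 64), nn.ReLU(),
    nn.Linear(64, CTRL_DIM),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)

for epoch in range(100):  # supervised "context-to-motion" training
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(policy(observations), controls)
    loss.backward()
    optimizer.step()

# At run time, sensory input streams in and the policy emits control signals.
live_obs = torch.randn(1, OBS_DIM)
with torch.no_grad():
    control_signal = policy(live_obs)  # drives the prosthetic's actuators
```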

Keywords: stable diffusion, neural interface, smart prosthetic, augmenting

Procedia PDF Downloads 101
170 The Digital Microscopy in Organ Transplantation: Ergonomics of the Tele-Pathological Evaluation of Renal, Liver, and Pancreatic Grafts

Authors: Constantinos S. Mammas, Andreas Lazaris, Adamantia S. Mamma-Graham, Georgia Kostopanagiotou, Chryssa Lemonidou, John Mantas, Eustratios Patsouris

Abstract:

The process of building a better safety culture, methods of error analysis, and preventive measures starts with an understanding of the effects of human factors engineering on remote microscopic diagnosis in surgery, and especially in organ transplantation for the evaluation of the grafts. A high percentage of solid organs arrive at recipient hospitals in the UK and are considered injured or improper for transplantation. Digital microscopy adds information on a microscopic level about the grafts (G) in Organ Transplant (OT) and may lead to a change in their management. Such a method would reduce the possibility that a diseased G arrives at the recipient hospital for implantation. Aim: The aim of this study is to analyze the ergonomics of digital microscopy (DM) based on virtual slides, on telemedicine systems (TS), for tele-pathological evaluation (TPE) of the grafts (G) in organ transplantation (OT). Material and Methods: By experimental simulation, the ergonomics of DM for microscopic TPE of renal graft (RG), liver graft (LG), and pancreatic graft (PG) tissues was analyzed. This corresponded to the ergonomics of digital microscopy for TPE in OT, applying a virtual slide (VS) system for graft tissue image capture, for remote diagnoses of possible microscopic inflammatory and/or neoplastic lesions. Experimentation included the development of an experimental telemedicine system (Exp.-TS), similar to an OTE-TS, for simulating the integrated VS-based microscopic TPE of RG, LG, and PG. Simulation of DM on TS-based TPE was performed by two specialists on a total of 238 human renal graft (RG), 172 liver graft (LG), and 108 pancreatic graft (PG) digital microscopic tissue images, assessed for inflammatory and neoplastic lesions on the electronic spaces of the four TS used. Results: Statistical analysis of the specialists' answers about the ability to accurately diagnose the diseased RG, LG, and PG tissues on the electronic space (ES) of the four TS (A, B, C, D) showed that DM on TS for TPE in OT performs best on the ES of a desktop, followed by the ES of the applied Exp.-TS. Tablet and mobile-phone ES seem significantly risky for the application of DM in OT (p<.001). Conclusion: To make the largest reduction in errors and adverse events related to the quality of the grafts, it will take the application of human factors engineering to procurement, design, audit, and awareness-raising activities. Consequently, it will take an investment in new training, people, and other changes to management activities for DM in OT. Simulated VS-based TPE with DM of RG, LG, and PG tissues after retrieval seems feasible and reliable, depending on the size of the electronic space of the applied TS, for remotely preventing diseased grafts from being retrieved and/or sent to the recipient hospital, and for post-grafting and pre-transplant planning.

Keywords: digital microscopy, organ transplantation, tele-pathology, virtual slides

Procedia PDF Downloads 281
169 Exploring Artistic Creation and Autoethnography in the Spatial Context of Geography

Authors: Sinem Tas

Abstract:

This research paper studies the perspective of personal experience in relation to spatial dynamics and artistic outcomes within the realm of cultural identity. The article serves as a partial analysis within a broader PhD investigation that focuses on the cultural dynamics and political structures behind cultural identity through an autoethnography of narrative, while presenting its correlation with artistic creation in the context of space and people. Focusing on the artistic/creative practice project AUTRUI, the primary goal is to analyse and understand the influence of personal experiences and culturally constructed identity as an artist on the compositional modality of the final image, in light of self-reflective experience. The works of Joyce Davidson and Christine Milligan, scholars who emphasise the importance of emotion and spatial experience in geographical studies, contribute to this research: in Embodying Emotion, Sensing Space: Introducing Emotional Geographies (2004), they highlight the significance of emotion across various spatial scales. Their perspective suggests that understanding emotions within different spatial contexts is crucial for comprehending human experiences and interactions with space. Incorporating the insights of scholars like Yi-Fu Tuan, particularly his seminal work Space and Place: The Perspective of Experience (1979), is important for creating an in-depth frame of geographical experience; Tuan's humanistic perspective on space and place provides a valuable theoretical framework for understanding the interplay between personal experiences and spatial contexts. A substantial contextualisation of the geopolitics of Turkey, and of its implications for national identity and cohesion, is addressed by outlining the political and geographical frame as a methodological strategy to understand the dynamics behind this research. Besides the bibliographical reading, the methods used to study this relation are participatory observation, memory work along with memoir analysis, personal interviews, and the discussion of photographs and news. The utilisation of the self as data requires the analysis of written sources with personal engagement: delving into written sources such as written communications or diaries, as well as memoirs, gives the research a firsthand perspective, enriching the analytical depth of the study. Furthermore, the examination of photography and news articles serves as a valuable means of contextualising experiences from a journalist's background within specific geographical settings. The inclusion of interviews with close family members provides firsthand perspectives and intimate insights rooted in shared experiences within similar geographical contexts, offering complementary insights and diversified viewpoints that enhance the comprehensiveness of the investigation.

Keywords: art, autoethnography, place and space, Turkey

Procedia PDF Downloads 50
168 Dynamic EEG Desynchronization in Response to Vicarious Pain

Authors: Justin Durham, Chanda Rooney, Robert Mather, Mickie Vanhoy

Abstract:

The psychological construct of empathy involves understanding another person's cognitive perspective and experiencing that person's emotional state. Deciphering emotional states is conducive to interpreting vicarious pain. Observing others' physical pain activates neural networks related to the actual experience of pain itself. The study addresses empathy as a nonlinear dynamic process of simulation through which individuals understand the mental states of others and experience vicarious pain, exhibiting self-organized criticality. Such criticality follows from a combination of neural networks with an excitatory feedback loop generating bistability to resonate permutated empathy. Cortical networks exhibit diverse patterns of activity, including oscillations, synchrony, and waves; however, the temporal dynamics of the neurophysiological activities underlying empathic processes remain poorly understood. Mu rhythms are EEG oscillations with dominant frequencies of 8-13 Hz that become synchronized when the body is relaxed with eyes open and the sensorimotor system is idle; thus, mu rhythm synchrony is expected to be highest in baseline conditions. When the sensorimotor system is activated, either by performing or simulating action, mu rhythms become suppressed or desynchronize; thus, they should be suppressed while observing video clips of painful injuries if previous research on mirror-system activation holds. Twelve undergraduates contributed EEG data and survey responses to empathy and psychopathy scales, in addition to watching consecutive video clips of sports injuries. Participants watched a blank, black image on a computer monitor before and after observing a video of consecutive sports injury incidents. Each video condition lasted five minutes. A BIOPAC MP150 recorded EEG signals from sensorimotor and thalamocortical regions related to a complex neural network called the 'pain matrix'. Both physical and social pain activate this network, producing the vicarious pain responses involved in processing empathy. Five single-electrode EEG locations were applied over regions measuring sensorimotor electrical activity in microvolts (μV) to monitor mu rhythms. EEG signals were sampled at a rate of 200 Hz. Mu rhythm desynchronization was measured in the 8-13 Hz band at electrode sites F3 and F4. Data for each participant's mu rhythms were analyzed via Fast Fourier Transformation (FFT) and multifractal time series analysis.
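
As one concrete way to index the desynchronization described above, the sketch below computes 8-13 Hz mu-band power with Welch's method for a baseline and an observation condition; the 200 Hz sampling rate follows the abstract, while the synthetic signals and the ratio measure are illustrative assumptions.

```python
# Mu-band (8-13 Hz) power from synthetic EEG via Welch's method; a lower
# observation/baseline ratio indicates desynchronization (suppression).
import numpy as np
from scipy.signal import welch

FS = 200  # sampling rate [Hz], per the abstract
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / FS)  # one minute of synthetic EEG

# Baseline: strong 10 Hz mu rhythm; observation: attenuated mu rhythm.
baseline = 10 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 5, t.size)
observation = 3 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 5, t.size)

def mu_band_power(signal: np.ndarray) -> float:
    freqs, psd = welch(signal, fs=FS, nperseg=FS * 2)
    band = (freqs >= 8) & (freqs <= 13)
    return float(np.trapz(psd[band], freqs[band]))

suppression = mu_band_power(observation) / mu_band_power(baseline)
print(f"mu power ratio (observation / baseline): {suppression:.2f}")
```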

Keywords: desynchronization, dynamical systems theory, electroencephalography (EEG), empathy, multifractal time series analysis, mu waveform, neurophysiology, pain simulation, social cognition

Procedia PDF Downloads 283
167 Exploring the Relationship Between Past and Present Reviews: The Influence of User Generated Content on Future Hotel Guest Experience Perceptions

Authors: Sacha Joseph-Mathews, Leili Javadpour

Abstract:

In the tourism industry, hoteliers spend millions annually on marketing and positioning efforts for their respective hotels, all in an effort to create a specific image in the minds of the consumer. Yet despite extensive efforts to seduce potential hotel guests with sophisticated advertising messages generated by hotel entities, consumers continue to mistrust corporate branding, preferring instead to place their trust in the reviews of their consumer peers. In today's complex and cluttered marketplace, online reviews can serve as a mediator for consumers who do not have actual knowledge of and experience with the brand but are deciding whether or not to engage in a consumption exercise. Traditionally, consumers have used online reviews as a source of comfort and confirmation of a product or service's positioning. But today, very few customers make purchase decisions without first researching existing user reviews, making reviews a necessity rather than a luxury in the purchase decision process. The influence of user generated content (UGC) is amplified in the tourism industry, as more than a third of potential hotel guests will not book a room without first reading a review. As corporate branding becomes less relevant and online reviews become more important, how much of a consumer's stay expectations are dictated by existing UGC? Moreover, as hotel guests experience a hotel through the lens of existing reviews, how much of their stay, and in turn their own review, is influenced by the reviews they have read? Ultimately, there is the potential for UGC to dictate what potential guests will be most critical about and/or most focused on during their stay. If UGC is a stronger influence in the purchase decision process than corporate branding, does it not have the potential to dictate the entire stay experience by shaping guests' expectations before they arrive on the property? For example, if an eco-destination hotel focuses its website branding on sustainability and the retreat nature of the property, yet guest reviews consistently discuss how unsatisfactory the service and food were, with no mention of nature or sustainability, will future reviews then focus primarily on the food? Using text analysis software to examine over 25,000 online reviews, we explore the extent to which new reviews are influenced by wording used in previous reviews for a hotel property versus content generated by corporate positioning. Additionally, we investigate how distinct hotel-related UGC is across different types of tourism destinations. Our findings suggest that UGC can have a greater impact on future reviews than corporate branding, and that there is more cohesiveness across the UGC of different types of hotel properties than anticipated. A model of User Generated Content Influence is presented, and the managerial impact of the power of online reviews to trump corporate branding and shape future user experiences is discussed.
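
A sketch of the kind of comparison described above: TF-IDF cosine similarity between a new review, the property's prior reviews, and its corporate copy. The mini-corpus is hypothetical, and the abstract does not name the actual text analysis software or similarity measure used, so treat this only as an illustration of the approach.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical mini-corpus standing in for the study's 25,000 reviews.
prior_reviews = [
    "service was slow and the food disappointing",
    "staff were rude and the food arrived cold",
]
corporate_copy = ["an eco retreat surrounded by nature and sustainability"]
new_review = ["the food was mediocre and the service slow again"]

vec = TfidfVectorizer(stop_words="english")
tfidf = vec.fit_transform(prior_reviews + corporate_copy + new_review)

sim_to_ugc = cosine_similarity(tfidf[3], tfidf[:2]).mean()   # vs. prior reviews
sim_to_brand = cosine_similarity(tfidf[3], tfidf[2]).item()  # vs. branding
print(f"similarity to prior UGC: {sim_to_ugc:.2f}, to branding: {sim_to_brand:.2f}")
```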

Keywords: user generated content, UGC, corporate branding, online reviews, hotels and tourism

Procedia PDF Downloads 94
166 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest, and any higher resolution is lost in this resampling. When the topographic features are instead computed through regression performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point; the number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance: any doubling of window size in each direction takes only a single pass over the data, corresponding to logarithmic scaling of the resulting algorithm as a function of window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic of the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result; as such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the slope results. The relevant length scale is taken to be half of the size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm: the resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within each region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally; the result is of higher resolution and less affected by noise than with existing techniques.
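
A minimal sketch of the additive-moments idea behind the algorithm follows: per-pixel regression statistics are summed over non-overlapping 2x2 blocks at each iteration, a plane is fitted per window from the aggregated moments, and the slope at the scale of least residual variance is kept. All names are ours, and the sketch assumes a power-of-two DEM for simplicity; it is an interpretation of the abstract, not the authors' code.

```python
import numpy as np

def plane_fit_stats(dem, cell=1.0):
    """Per-pixel sufficient statistics for the plane fit z = a + b*x + c*y.
    All ten moments are additive, so coarser windows are built by summing."""
    ny, nx = dem.shape
    x, y = np.meshgrid(np.arange(nx) * cell, np.arange(ny) * cell)
    one = np.ones_like(dem)
    return np.stack([one, x, y, x * x, x * y, y * y,
                     dem, x * dem, y * dem, dem * dem])

def aggregate2x2(s):
    """Merge 2x2 non-overlapping child windows in a single pass."""
    return s[:, ::2, ::2] + s[:, 1::2, ::2] + s[:, ::2, 1::2] + s[:, 1::2, 1::2]

def slope_and_variance(s):
    """Solve the 3x3 normal equations per window; return slope magnitude
    and the residual variance of the fitted plane."""
    n, sx, sy, sxx, sxy, syy, sz, sxz, syz, szz = s
    A = np.stack([np.stack([n, sx, sy]),
                  np.stack([sx, sxx, sxy]),
                  np.stack([sy, sxy, syy])]).transpose(2, 3, 0, 1)
    rhs = np.stack([sz, sxz, syz]).transpose(1, 2, 0)[..., None]
    a, b, c = np.moveaxis(np.linalg.solve(A, rhs)[..., 0], -1, 0)
    rss = szz - (a * sz + b * sxz + c * syz)   # residual sum of squares
    return np.hypot(b, c), rss / np.maximum(n - 3, 1)

# Toy run on a random 64x64 surface; keep, per 2x2 start window, the slope
# at whichever coarser scale minimizes the residual variance.
dem = np.random.default_rng(1).normal(size=(64, 64)).cumsum(0).cumsum(1)
s = aggregate2x2(plane_fit_stats(dem))         # 2x2 windows on a 32x32 grid
best_slope, best_var = slope_and_variance(s)
factor = 2
while min(s.shape[1:]) >= 2:
    s = aggregate2x2(s)                        # one pass per doubling
    slope, var = slope_and_variance(s)
    slope = slope.repeat(factor, 0).repeat(factor, 1)   # back to 32x32
    var = var.repeat(factor, 0).repeat(factor, 1)
    best_slope = np.where(var < best_var, slope, best_slope)
    best_var = np.minimum(var, best_var)
    factor *= 2
print(best_slope.shape, float(best_slope.mean()))
```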

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 129
165 Iron Doping Enhanced Photocatalytic Nitrogen Fixation Performance of WO₃ with Three-Dimensionally Ordered Macroporous Structure

Authors: Xiaoling Ren, Guidong Yang

Abstract:

Ammonia, as one of the largest-volume industrial chemicals, is mostly produced by the century-old Haber-Bosch process under extreme conditions and at high cost. Under these circumstances, researchers are dedicated to finding new ways to replace the Haber-Bosch process. Photocatalytic nitrogen fixation is a promising sustainable, clean, and green strategy for ammonia synthesis, but it remains a big challenge due to the high activation energy of nitrogen, and it is essential to develop an efficient photocatalyst to bring this approach to industrial application. Constructing chemisorption active sites through defect engineering is an effective and reliable means to improve nitrogen activation by forming an extraordinary coordination environment and electronic structure. Besides, the construction of a three-dimensionally ordered macroporous (3DOM) structured photocatalyst is considered one of the effective strategies to improve activity, since it can increase the diffusion rate of reactants in the interior, which is beneficial to the mass transfer of nitrogen molecules during photocatalytic nitrogen reduction. Herein, Fe-doped 3DOM WO₃ (Fe-3DOM WO₃) without noble metal cocatalysts is synthesized by a polystyrene-template strategy and used for photocatalytic nitrogen fixation for the first time. To elucidate the chemical nature of the dopant, X-ray diffraction (XRD) analysis was conducted. The pure 3DOM WO₃ has a monoclinic crystal structure, and no additional peak is observed in Fe-doped 3DOM WO₃, indicating that the incorporation of Fe atoms did not result in secondary phase formation. To confirm the morphologies of Fe-3DOM WO₃ and 3DOM WO₃, scanning electron microscopy (SEM) was employed. The synthesized Fe-3DOM WO₃ and 3DOM WO₃ both exhibit a highly ordered three-dimensional inverse opal structure with interconnected pores. In a high-resolution TEM image of Fe-3DOM WO₃, the ordered lattice fringes with a spacing of 3.84 Å can be assigned to the (001) plane of WO₃, consistent with the XRD results. Finally, the photocatalytic nitrogen reduction performance of 3DOM WO₃ and of Fe-doped 3DOM WO₃ with various Fe contents was examined. As a result, both Fe-3DOM WO₃ samples achieve higher ammonia production rates than pure 3DOM WO₃, indicating that the doped Fe plays a critical role in the photocatalytic nitrogen fixation performance. To verify the reaction process of N₂ reduction on Fe-3DOM WO₃, in-situ diffuse reflectance infrared Fourier-transform spectroscopy (DRIFTS) was employed to monitor the intermediates. The in-situ DRIFTS spectra of Fe-3DOM WO₃ exhibit signals that increase with irradiation time from 0 to 60 min under the N₂ atmosphere. These results prove that nitrogen is gradually hydrogenated to produce ammonia over Fe-3DOM WO₃. This work should enrich our knowledge in designing efficient photocatalysts for photocatalytic nitrogen reduction.
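
As background (our addition of the standard overall reaction, not a result reported in the paper), the photocatalytic nitrogen reduction that the DRIFTS experiments track corresponds to the stepwise hydrogenation

$$\mathrm{N_2 + 6H^+ + 6e^- \longrightarrow 2NH_3},$$

where the electrons are photogenerated in the semiconductor and the high activation energy referred to above stems from cleaving the strong N≡N triple bond.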

Keywords: ammonia, photocatalytic, nitrogen fixation, Fe doped 3DOM WO₃

Procedia PDF Downloads 171
164 Combustion Variability and Uniqueness in Cylinders of a Radial Aircraft Piston Engine

Authors: Michal Geca, Grzegorz Baranski, Ksenia Siadkowska

Abstract:

The work is part of a project that aims at developing innovative power and control systems for the high-power aircraft piston engine ASz62IR. The developed electronically controlled ignition system will reduce emissions of toxic compounds as a result of lowered fuel consumption, optimized combustion, and the engine's capability to burn ecological fuels efficiently. The tested unit is an air-cooled four-stroke gasoline engine with 9 cylinders in a radial setup, mechanically charged by a radial compressor powered by the engine crankshaft. The total engine cubic capacity is 29.87 dm³, and the compression ratio is 6.4:1. The maximum take-off power is 1000 HP at 2200 rpm, and the maximum fuel consumption is 280 kg/h. The engine powers the following aircraft: An-2, M-18 "Dromader", DHC-3 "OTTER", DC-3 "Dakota", GAF-125 "HAWK", and Y5. The main problems of the engine include the imbalanced work of the cylinders: non-uniform fuelling of the individual cylinders results in non-uniformity of their work. In a radial engine, the cylinder arrangement causes the mixture movement to take place either with (lower cylinders) or against (upper cylinders) the direction of gravity. Preliminary tests confirmed the presence of uneven work of the individual cylinders; the phenomenon is most intense at low speed, and the non-uniformity is visible in the cylinder pressure waveform. Therefore, two studies were conducted to determine the impact of this phenomenon on engine performance: simulation and real tests. A simplified simulation was conducted on an element of the intake system coated with a fuel film. The study shows that gravity affects the movement of the fuel film inside the radial engine intake channels: in both the lower and the upper inlet channels, the film flows downwards, so gravity assists the movement of the film in the lower cylinder channels and opposes it in the upper cylinder channels. Real tests on the ASz62IR aircraft engine were conducted in transient conditions (rapid changes of the excess air ratio in each cylinder were performed). Calculations were conducted for the mass of fuel reaching the cylinders theoretically and actually, and on this basis the fuel evaporation factors "x" were determined. A simplified model of the fuel supply to the cylinder was therefore adopted. The model includes the time constant of the fuel film τ, the number of engine transport cycles of non-evaporating fuel along the intake pipe γ, and the time between successive cycles Δt. The calculation results of the identification of the model parameters are presented in the form of radar graphs, showing the average declines and increases of the injection time and the average values for both types of transient. These studies showed that the position of the cylinder causes changes in the formation of the fuel-air mixture and thus changes in the combustion process. Based on the results of the simulation work and experiments, it was possible to develop individual algorithms for ignition control. This work has been financed by the Polish National Centre for Research and Development, INNOLOT, under Grant Agreement No. INNOLOT/I/1/NCBR/2013.
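
The abstract does not give the model equations, but a simplified fuel-film model with the stated parameters τ, γ, and Δt might look like the sketch below, in which a fraction x of the injected fuel wets the intake walls, the film evaporates with time constant τ, and liquid fuel reaches the cylinder after γ transport cycles. This is an assumed Aquino-style formulation with illustrative parameter values, not the authors' actual model.

```python
import math
from collections import deque

def simulate_fuel_delivery(m_inj_series, x, tau, dt, gamma):
    """Fuel mass reaching the cylinder per intake event under a tau-x style
    film model: fraction x wets the wall film, the film evaporates with time
    constant tau, and liquid fuel travels gamma cycles along the intake pipe."""
    m_film = 0.0
    pipe = deque([0.0] * gamma)            # liquid fuel in transit
    delivered = []
    for m_inj in m_inj_series:
        m_film += x * m_inj                # wall wetting
        m_evap = m_film * (1.0 - math.exp(-dt / tau))
        m_film -= m_evap                   # film evaporation over one cycle
        transported = pipe.popleft()       # liquid arriving after gamma cycles
        pipe.append((1.0 - x) * m_inj)
        delivered.append(m_evap + transported)
    return delivered

# Step change in injected fuel: the cylinder charge responds with a lag set
# by x, tau, gamma; upper and lower cylinders would differ mainly in x.
# dt ~ 0.055 s is one intake event of a four-stroke engine at 2200 rpm.
step = [10.0] * 20 + [14.0] * 20           # mg of fuel per injection
for m in simulate_fuel_delivery(step, x=0.3, tau=0.5, dt=0.055, gamma=2)[18:25]:
    print(f"{m:.2f}")
```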

Keywords: radial engine, ignition system, non-uniformity, combustion process

Procedia PDF Downloads 366
163 Design of Nano-Reinforced Carbon Fiber Reinforced Plastic Wheel for Lightweight Vehicles with Integrated Electrical Hub Motor

Authors: Davide Cocchi, Andrea Zucchelli, Luca Raimondi, Tommaso Maria Brugo

Abstract:

The increasing attention given to the issues of environmental pollution and climate change is rapidly stimulating the development of electrically propelled vehicles powered by renewable energy, in particular solar energy. Given the small amount of solar energy that can be stored and subsequently transformed into propulsive energy, it is necessary to develop vehicles with high mechanical, electrical, and aerodynamic efficiencies along with reduced masses. The reduction of mass is of fundamental relevance especially for the unsprung masses, that is, the assembly of those elements whose distance from the ground does not vary (wheel, suspension system, hub, upright, braking system). Reducing the unsprung masses is therefore fundamental to decreasing rolling inertia and improving the drivability, comfort, and performance of the vehicle. This principle applies even more to solar-propelled vehicles, which are equipped with an electric motor connected directly to the wheel hub; in this solution, the electric motor is integrated inside the wheel. Since the electric motor is part of the unsprung masses, the development of compact and lightweight solutions is of fundamental importance. The purpose of this research is the design, development, and optimization of a CFRP 16-inch wheel hub motor for solar-propulsion vehicles that can carry up to four people. In addition to optimizing aspects of primary importance such as mass, strength, and stiffness, other innovative constructive aspects were explored. One of the main objectives was to achieve high geometric packing in order to ensure a reduced lateral dimension without reducing the power exerted by the electric motor. In the final solution, it was possible to realize a wheel hub motor assembly completely contained within the rim width, for a total lateral dimension of less than 100 mm. This result was achieved by developing an innovative connection system between the wheel and the rotor with a double purpose: centering and transmission of the driving torque. This solution, with appropriate interlocking noses, allows the transfer of high torques and at the same time guarantees both the centering and the necessary stiffness of the transmission system. Moreover, to avoid delamination in critical areas, evaluated by means of FEM analysis using the 3D Hashin damage criteria, electrospun nanofibrous mats were interleaved between critical CFRP layers. In order to reduce rolling resistance, the rim was designed to withstand high inflation pressure. Laboratory tests were performed on the rim using the Digital Image Correlation (DIC) technique, and the wheel was tested for bending fatigue according to E/ECE/324 R124e.
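
For reference, the fiber-tension mode of the 3D Hashin criteria mentioned above takes the following textbook form (the abstract does not state which failure modes governed the design, so this is the standard formulation rather than the authors' specific implementation):

$$F_f^t=\left(\frac{\sigma_{11}}{X_T}\right)^2+\frac{\sigma_{12}^2+\sigma_{13}^2}{S_{12}^2}\ \ge\ 1 \qquad (\sigma_{11}>0),$$

where $X_T$ is the longitudinal tensile strength of the ply and $S_{12}$ its axial shear strength; the matrix and interlaminar modes relevant to delamination have analogous quadratic forms in $\sigma_{22}$, $\sigma_{33}$, $\sigma_{13}$, and $\sigma_{23}$.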

Keywords: composite laminate, delamination, DIC, lightweight vehicle, motor hub wheel, nanofiber

Procedia PDF Downloads 214
162 Enhanced Multi-Scale Feature Extraction Using a DCNN by Proposing Dynamic Soft Margin SoftMax for Face Emotion Detection

Authors: Armin Nabaei, M. Omair Ahmad, M. N. S. Swamy

Abstract:

Many facial expression and emotion recognition methods based on traditional approaches such as LDA, PCA, and EBGM have been proposed. In recent years, deep learning models have provided a unique platform for detecting facial expressions and emotions by automatically extracting the features. However, deep networks require large training datasets to extract such features effectively. In this work, we propose an efficient emotion detection algorithm using face images when only small datasets are available for training. We design a deep network whose feature extraction capability is enhanced by utilizing several parallel modules between the input and output of the network, each focusing on the extraction of different types of coarse features with fine-grained details to break the symmetry of the produced information; in effect, we leverage long-range dependencies, the lack of which is one of the main drawbacks of CNNs. We extend this work by introducing a Dynamic Soft-Margin SoftMax. The conventional SoftMax reaches the gold labels too soon, which drives the model toward over-fitting, because it is not able to determine adequately discriminative feature vectors for some variant class labels. We reduce the risk of over-fitting by using a dynamic, rather than static, shape of the input tensor in the SoftMax layer and by specifying a desired soft margin; the margin acts as a controller of how hard the model should work to push dissimilar embedding vectors apart. The proposed categorical loss has the objective of compacting same-class labels and separating different-class labels in the normalized log domain: we penalize predictions with high divergence from the ground-truth labels, shortening correct feature vectors and enlarging false prediction tensors, which means assigning more weight to classes lying close to one another (namely, "hard labels to learn"). In this way, we constrain the model to generate more discriminative feature vectors for variant class labels. Finally, for the proposed optimizer, our focus is on addressing the weak convergence of the Adam optimizer on non-convex problems. Our optimizer works with an alternative gradient-updating procedure using an exponentially weighted moving average function for faster convergence, and it exploits a weight-decay method to drastically reduce the learning rate near optima so as to reach the dominant local minimum. We demonstrate the superiority of the proposed work by surpassing the first rank on three widely used facial expression recognition datasets: 93.30% on FER-2013, 90.73% on RAF-DB (a 16% improvement over the first rank after 10 years), and 100% k-fold average accuracy on the CK+ dataset, providing top performance relative to networks that require much larger training datasets.
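
The exact formulation of the Dynamic Soft-Margin SoftMax is not given in the abstract; the sketch below illustrates the general additive-margin idea it builds on, with one plausible way of making the margin dynamic. The function names and the margin schedule are our assumptions, not the authors' definitions.

```python
import torch
import torch.nn.functional as F

def soft_margin_cross_entropy(logits, targets, margin):
    """Cross-entropy with an additive margin removed from the true-class
    logit, so the model must keep pushing class embeddings apart even after
    the prediction is already correct. `margin` may be a float or a (B, 1)
    per-sample tensor, allowing a dynamic schedule."""
    m = torch.zeros_like(logits).scatter_(1, targets[:, None], margin)
    return F.cross_entropy(logits - m, targets)

def dynamic_margin(logits, targets, base=0.35):
    """One plausible dynamic schedule (our assumption): grow the margin with
    the model's confidence in the true class, so confidently classified
    samples are pushed harder than ones the model still gets wrong."""
    with torch.no_grad():
        p_true = F.softmax(logits, dim=1).gather(1, targets[:, None])
    return base * p_true

# Toy batch: 4 samples, 7 emotion classes.
logits = torch.randn(4, 7, requires_grad=True)
targets = torch.tensor([0, 3, 6, 2])
loss = soft_margin_cross_entropy(logits, targets, dynamic_margin(logits, targets))
loss.backward()
print(loss.item())
```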

Keywords: computer vision, facial expression recognition, machine learning, algorithms, deep learning, neural networks

Procedia PDF Downloads 74
161 Triple Case Phantom Tumor of Lungs

Authors: Angelis P. Barlampas

Abstract:

Introduction: The term phantom lung mass describes an ovoid collection of fluid within an interlobular fissure, which initially creates the impression of a mass. Correct differential diagnosis is a significant problem, especially in plain radiography. A case is presented with three nodular pulmonary foci whose shape, location, and density, together with the presence of chronic loculated pleural effusions, suggest the presence of multiple phantom tumors of the lung. Purpose: The aim of this paper is to draw the attention of non-experienced and non-specialized physicians to the existence of benign findings that mimic pathological conditions and vice versa. The careful study of a radiological examination and comparison with previous exams or further workup protect against hasty wrong conclusions. Methods: A hospitalized patient underwent a non-contrast CT scan of the chest as part of the general assessment of her condition. Results: Computed tomography revealed pleural effusions, some of them loculated, an increased cardiothoracic index, and the presence of three nodular foci, one in the left lung and two in the right, with a maximum density of up to 18 Hounsfield units and a mean diameter of approximately five centimeters. Two of them are located in the characteristic anatomical position of the major interlobular fissure. The third is located in the posterior basal part of the right lower lobe; it presents the same characteristics as the previous ones and is likely a loculated fluid collection within an accessory interlobular fissure or a cyst, in the context of the patient's more general pleural entrapments and loculations. The differential diagnosis of nodular foci based on their imaging characteristics includes the following: a) rare metastatic foci with low density (liposarcoma, mucinous tumors of the digestive or genital system, necrotic metastatic foci, metastatic renal cancer, etc.), b) necrotic multiple primary lung tumor locations (squamous cell cancer, etc.), c) hamartomas of the lung, d) fibrous tumors of the interlobular fissures, e) lipoid pneumonia, f) fluid collections within the interlobular fissures, g) lipoma of the lung, h) myelolipomas of the lung. Conclusions: A collection of fluid within an interlobular fissure of the lung can give the false impression of a lung mass, particularly on plain chest radiography. With computed tomography, the ability to measure the density of a lesion, combined with the high anatomical detail it provides on the location and characteristics of the lesion, can lead relatively easily to the correct diagnosis. In cases of doubt or image artifacts, comparison with previous or subsequent examinations can resolve any disagreement, while in rare cases intravenous contrast may be necessary.

Keywords: phantom mass, chest CT, pleural effusion, cancer

Procedia PDF Downloads 55
160 Story of Per-: The Radial Network of One Lithuanian Prefix

Authors: Samanta Kietytė

Abstract:

The object of this study is verbal derivatives with the Lithuanian prefix per-. The prefix under examination can be classified as prepositional, having descended from the preposition per and thereby sharing the same prototypical meaning, denoting movement OVER; the two frequently co-occur within sentences (1). The aim of this paper is to conduct a semantic analysis of the prefix per- and to propose a possible radial network of its meanings, in essence identifying the interrelationships existing between those meanings. 1) Jis peršoko per tvorą/ 3SG.NOM.M jump.PST.3 over fence.ACC.SG. /ʻHe jumped over the fenceʼ. The foundation of this work lies in the methodological and theoretical framework of cognitive linguistics. The prototypical meanings of prefixes consistently embody spatial dimensions that can be described through image schemas. This entails identifying the trajector, the landmark, and the relation between them in the situation described by the prefixed verb. The meanings of linguistic units are not perceived as arbitrary; rather, they are interconnected through semantic motivation. According to this perspective, a single meaning within a linguistic unit is considered prototypical, while additional meanings are descended (not necessarily directly) from it. For example, one of the per- meanings, TRANSFER (2), is derived from the prototypical meaning OVER. 2) Prašau persiųsti vadovo laišką man./ Ask.PRS.1 forward.INF manager.GEN.SG email.ACC.SG 1.SG.DAT/ ʻPlease forward the manager's email to meʼ. Certain semantic relations are explained by conceptual metaphor and metonymy theory. For instance, when a prefixed verb has the meaning WIN (3), it is related to the prototypical meaning: the prefixed verb describes situations of winning in various ways, and since in the prototypical meaning the trajector moves higher than the landmark, winning is metaphorically perceived as being higher. 3) Sūnus peraugo tėvą./ Son.NOM.SG outgrow.PST.3 father.ACC.SG/ ʻThe son has outgrown the fatherʼ. The data utilized for this study were collected from the 2014 grammatically annotated corpus "Lithuanian Web (LithuanianWaC v2)", consisting of 63,645,700 words. Given that the corpus is lemmatized, a list of 793 items was obtained using the wordlist function, searching for verbs starting with per. The list included not only prefixed verbs but also other verbs whose roots begin with the same letter sequence as the prefix. Words with misspellings or missing diacritical marks, as well as words listed due to lemmatization errors, were rejected, leaving a total of 475 derivatives for further analysis. The semantic analysis revealed 12 distinct meanings of the prefix per-. The spatial meanings were extracted by determining what the trajector is, what the landmark is, and what the relation between them is. The connection between non-spatial and spatial meanings arises through semantic motivation, established by identifying the elements that correspond to the trajector and landmark. The analysis reveals that there are no strict boundaries among these meanings; instead, they form a continuum encompassing a central core and a periphery within their internal structure, i.e., some derivatives are more prototypical of a particular meaning than others.

Keywords: word-formation, cognitive semantics, metaphor, radial networks, prototype theory, prefix

Procedia PDF Downloads 77
159 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland

Authors: Raptis Sotirios

Abstract:

Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure and also suffer from unplanned spending, negatively impacted by the global financial crisis. Data-driven methods can help improve policies and plan and design service provision schedules, using algorithms to assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The chi-squared and Student's t significance tests are applied to data over a 39-year span for which HSc data exist on services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as the null hypothesis, that the target service's demands are statistically dependent on other demands; this linkage can be confirmed or rejected by the data. Complementarily, ML methods are used to linearly predict the target demands from the statistically found associations and to extend the linear dependence of the target's demand to independent demands, thus forming groups of services. Statistical tests confirm the ML couplings, making the prediction also statistically meaningful and proving that a target service can be matched reliably to other services, while ML shows that these indicated relationships can also be linear. Zero padding was used for missing years' records and better illustrated such relationships both for limited years and over the entire span, offering long-term data visualizations, while limited-years groups explained how well patient numbers can be related over short periods or can change over time, as opposed to behaviors across more years. The prediction performance of the associations is measured using Receiver Operating Characteristic (ROC) AUC and ACC metrics as well as the chi-squared and Student's t statistical tests. Co-plots and comparison tables for RF, CART, and LGR, along with p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, as well as the behavior under different learning ratios. The impact of k-NN and of cross-correlation and C-Means initial groupings is also studied over limited years and over the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases LGR reached an AUC=0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding, data irregularities, or outliers. On average, three linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and the treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be well packed as plans of limited years, across various services sectors and learning configurations, as confirmed using statistical hypotheses.
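
The pairing of a significance test with an ML predictor can be illustrated as follows: a chi-squared test checks whether two binned demand series are statistically associated, and a logistic regression then checks whether the same link supports prediction, scored by ROC AUC and ACC. The synthetic series below stand in for the Scottish HSc data, which are not reproduced here, and the model is fitted and scored in-sample purely for brevity.

```python
import numpy as np
from scipy.stats import chi2_contingency
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score

# Synthetic yearly demand series standing in for the 39-year Scottish data.
rng = np.random.default_rng(42)
years = 39
target = rng.poisson(100, years) + np.arange(years)   # target service demand
other = 0.8 * target + rng.normal(0, 10, years)       # candidate coupled service

# 1) Statistical linkage: chi-squared test on binned (low/high) demand classes.
t_cls = (target > np.median(target)).astype(int)
o_cls = (other > np.median(other)).astype(int)
table = [[np.sum((t_cls == i) & (o_cls == j)) for j in (0, 1)] for i in (0, 1)]
chi2, p, _, _ = chi2_contingency(table)

# 2) ML confirmation: predict the target demand class from the other service.
X = other.reshape(-1, 1)
lgr = LogisticRegression().fit(X, t_cls)
auc = roc_auc_score(t_cls, lgr.predict_proba(X)[:, 1])
acc = accuracy_score(t_cls, lgr.predict(X))
print(f"chi2 p-value={p:.4f}, AUC={auc:.3f}, ACC={acc:.3f}")
```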

Keywords: class, cohorts, data frames, grouping, prediction, probability, services

Procedia PDF Downloads 234
156 The Active Social Life of #LoveWins: Understanding the Discourse of Homosexual Love and Rights in Thailand

Authors: Tinnaphop Sinsomboonthong

Abstract:

The hashtag #LoveWins has been widely used to celebrate the victory of the LGBTQ movement since June 2015, when the US Supreme Court recognized the right to same-sex marriage. Nowadays, the hashtag is generally used among active social media users in many countries, including Thailand. Amidst the political conflict between advocates of the junta-backed legislation related to same-sex marriage, known as 'Thailand's Civil Partnership Draft Bills,' and its detractors, the hashtag became central to Thailand's 2019 national election season and shortly afterward, as it was a key part of political campaigns to rebrand many political parties' images, create an LGBT-friendly atmosphere, and neutralize the bi-polarized politics of the law. The use of the hashtag is, therefore, not just online entertainment but a politico-discursive tool used by many actors for many purposes. Behind the confrontation between supporters and opposers of the law, the hashtag is used by both sides to highlight the Western-centric normativity of homosexual love, closely associated with Eurocentric modernity and heteronormativity. As an online ethnographic study, this paper aims to analyze how #LoveWins was used among Thai social media users from late 2018 to mid-2019 and how it was signified by them during the Draft Bills period and the 2019 national election. Preliminary surveys of data on Twitter were conducted in December 2018 and, more intensively, in January 2019. The data survey was then officially conducted twice, during February and April 2019, while data collection took place during May-June 2019. Only public posts on Twitter that include the hashtag #LoveWins, or any hashtags quoting 'love' and 'wins', were targeted by this research. The use of the hashtag can be categorized into three levels: banal decoration, celebration of homosexual love, and colonial discourse on homosexual love. Particularly for the third type, discourse analysis is applied to reveal that this hashtag is closely associated with the discourse of development and modernity, as most of the descriptive posts demonstrate aspirations to become more 'developed and modernized' like many Western countries and Taiwan, the LGBT capital of Asia. Thus, calls for the 'right to homosexual love' and the 'right to same-sex marriage' in Thailand are shaped and formulated within the discursive linkage between modernity, development, and love. The use of #LoveWins can also be considered a de-queering process of love, as only particular types of gender identity, sexual orientation, and relationship that reflect Eurocentric modernity and heteronormativity are accepted and advocated. For this reason, more inclusive queer loves should be supported rather than a merely essentialist-traditionalist homosexual love. Homonormativity must be deconstructed, and love must no longer be reserved for only one particular type of relationship standardized from/by the West; it must become more inclusive.

Keywords: #LoveWins, homosexual love, LGBT rights, same-sex marriage

Procedia PDF Downloads 139
157 A Comprehensive Survey of Artificial Intelligence and Machine Learning Approaches across Distinct Phases of Wildland Fire Management

Authors: Ursula Das, Manavjit Singh Dhindsa, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran

Abstract:

Wildland fires, also known as forest fires or wildfires, have exhibited an alarming surge in frequency in recent times, further compounding a perennial global concern. Forest fires often lead to devastating consequences, ranging from the loss of healthy forest foliage and wildlife to substantial economic losses and the tragic loss of human lives. Despite the existence of substantial literature on the detection of active forest fires, numerous potential research avenues in forest fire management, such as preventative measures and the ancillary effects of forest fires, remain largely underexplored. This paper undertakes a systematic review of these underexplored areas in forest fire research, meticulously categorizing them into distinct phases, namely the pre-fire, during-fire, and post-fire stages. The pre-fire phase encompasses the assessment of fire risk, the analysis of fuel properties, and other activities aimed at preventing or reducing the risk of forest fires. The during-fire phase includes activities aimed at reducing the impact of active forest fires, such as the detection and localization of active fires, the optimization of wildfire suppression methods, and the prediction of the behavior of active fires. The post-fire phase involves analyzing the impact of forest fires on various aspects, such as the extent of damage in forest areas, post-fire regeneration of forests, impacts on wildlife, economic losses, and health impacts from byproducts produced during burning. A comprehensive understanding of the three stages is imperative for effective forest fire management and for mitigating the impact of forest fires on both ecological systems and human well-being. Artificial intelligence and machine learning (AI/ML) methods have garnered much attention in the cyber-physical systems domain in recent times, leading to their adoption for decision-making in diverse applications, including disaster management. This paper explores the current state of AI/ML applications for managing the activities in the aforementioned phases of forest fire management. While conventional machine learning and deep learning methods have been extensively explored for the prevention, detection, and management of forest fires, a systematic classification of these methods into distinct AI research domains has been conspicuously absent. This paper gives a comprehensive overview of the state of forest fire research across the more recent and prominent AI/ML disciplines, including big data, classical machine learning, computer vision, explainable AI, generative AI, natural language processing, optimization algorithms, and time series forecasting. By providing a detailed overview of the potential areas of research and identifying the diverse ways AI/ML can be employed in forest fire research, this paper aims to serve as a roadmap for future investigations in this domain.

Keywords: artificial intelligence, computer vision, deep learning, during-fire activities, forest fire management, machine learning, pre-fire activities, post-fire activities

Procedia PDF Downloads 72
156 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions

Authors: Pirta Palola, Richard Bailey, Lisa Wedding

Abstract:

Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.
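
As a concrete illustration of the logistic discriminator and the Monte Carlo treatment of parameter uncertainty described above, the sketch below evaluates the change in a binary-like resilience value between two hypothetical ecosystem states. The parameter distributions and quantity levels are invented for illustration; the paper's own conceptual examples are not reproduced here.

```python
import numpy as np

def logistic_value(q, q_crit, steepness):
    """A logistic value function: value is ~0 below a critical quantity
    (resilience lost) and ~1 above it (resilience retained)."""
    return 1.0 / (1.0 + np.exp(-steepness * (q - q_crit)))

# Monte Carlo over an uncertain parameterization: sample the critical
# threshold and the steepness of the transition.
rng = np.random.default_rng(7)
q_now, q_future = 0.9, 0.6                   # hypothetical quantity levels
q_crit = rng.normal(0.7, 0.05, 10_000)
steep = rng.lognormal(np.log(25), 0.3, 10_000)

# Value is expressed as change relative to the current level, as in EVF.
dV = logistic_value(q_future, q_crit, steep) - logistic_value(q_now, q_crit, steep)
print(f"expected value change: {dV.mean():.3f} "
      f"(90% interval: {np.percentile(dV, 5):.3f} to {np.percentile(dV, 95):.3f})")
```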

Keywords: economics of biodiversity, environmental valuation, natural capital, value function

Procedia PDF Downloads 194