Search results for: automated external defibrillator
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2962

2512 Effects of Different Climate Zones, Building Types, and Primary Fuel Sources for Energy Production on Environmental Damage from Four External Wall Technologies for Residential Buildings in Israel

Authors: Svetlana Pushkar, Oleg Verbitsky

Abstract:

The goal of the present study is to evaluate the environmental damage from four external wall technologies under the following conditions: four climate zones in Israel, two building types (conventional vs. low-energy), and two fuel sources [natural gas vs. photovoltaic (PV)]. The hierarchical ReCiPe method with a two-stage nested (hierarchical) ANOVA test is applied. It was revealed that in a hot climate zone in Israel, in a conventional building fueled by natural gas, the operational energy (OE) stage is dominant (90%) over the production and construction (P&C) stage (10%); in a mild climate zone in Israel, in a low-energy building with PV, the P&C stage is dominant (85%) over the OE stage (15%). It is concluded that if PV is used in the building sector in Israel, (i) the P&C stage becomes a significant factor influencing the environment, (ii) autoclaved aerated block is the best external wall technology, and (iii) a two-stage nested mixed ANOVA can be used to evaluate environmental damage via ReCiPe when wall technologies are compared.
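
A minimal sketch of the statistical step may help readers unfamiliar with nested designs: a two-stage nested ANOVA on hypothetical ReCiPe damage scores grouped by wall technology, with replicate batches nested within each technology. The file name, column names, and design below are assumptions for illustration, not the study's data or code.

```python
# Illustrative sketch only: two-stage nested ANOVA on hypothetical ReCiPe scores.
# Assumed CSV columns: tech (wall technology), batch (replicate nested in tech), damage (ReCiPe score).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("recipe_damage.csv")
model = ols("damage ~ C(tech) + C(tech):C(batch)", data=df).fit()
table = sm.stats.anova_lm(model)          # sequential (Type I) ANOVA table with the nested term
print(table)

# In a fully nested design, the wall-technology effect is tested against the nested batch mean square:
f_tech = (table.loc["C(tech)", "sum_sq"] / table.loc["C(tech)", "df"]) / \
         (table.loc["C(tech):C(batch)", "sum_sq"] / table.loc["C(tech):C(batch)", "df"])
print("nested F for wall technology:", round(f_tech, 2))
```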

Keywords: life cycle assessment (LCA), photovoltaic, ReCiPe method, residential buildings

Procedia PDF Downloads 276
2511 Magneto-Transport of Single Molecular Transistor Using Anderson-Holstein-Caldeira-Leggett Model

Authors: Manasa Kalla, Narasimha Raju Chebrolu, Ashok Chatterjee

Abstract:

We have studied the quantum transport properties of a single molecular transistor in the presence of an external magnetic field using the Keldysh Green function technique. The Anderson-Holstein-Caldeira-Leggett model is used to describe the single molecular transistor, which consists of a molecular quantum dot (QD) coupled to two metallic leads and placed on a substrate that acts as a heat bath. The phonons are eliminated by the Lang-Firsov transformation, and the effective Hamiltonian is used to study the effect of an external magnetic field on the spectral density function, tunneling current, differential conductance, and spin polarization. A peak in the spectral function corresponds to a possible excitation. In the absence of a magnetic field, the spin-up and spin-down states are degenerate; this degeneracy is lifted by the magnetic field, leading to the splitting of the central peak of the spectral function. The tunneling current decreases with increasing magnetic field. We have observed that even the differential conductance peak of the zero-magnetic-field curve is split in the presence of electron-phonon interaction. As the magnetic field is increased, each peak splits into two, and each peak indicates the existence of an energy level. Thus the number of energy levels available for transport in the bias window increases with the magnetic field. In the presence of the electron-phonon interaction, the differential conductance is in general reduced and decreases faster with the magnetic field. As the magnetic field strength increases, the spin polarization of the current increases. Our results show that a strongly interacting QD coupled to metallic leads in the presence of an external magnetic field parallel to the plane of the QD acts as a spin filter at zero temperature.

Keywords: Anderson-Holstein model, Caldeira-Leggett model, spin-polarization, quantum dots

Procedia PDF Downloads 162
2510 The Use of Fractional Brownian Motion in the Generation of Bed Topography for Bodies of Water Coupled with the Lattice Boltzmann Method

Authors: Elysia Barker, Jian Guo Zhou, Ling Qian, Steve Decent

Abstract:

A method of modelling the topography used in simulations of riverbeds is proposed in this paper, removing the need for data points and measurements of the physical terrain. While complex scans of the contours of a surface can be achieved with other methods, these require specialised tools; the proposed method overcomes this by using fractional Brownian motion (FBM) as a basis to estimate the real surface within a 15% margin of error while attempting to optimise algorithmic efficiency. This removes the need for complex, expensive equipment and reduces the resources spent modelling bed topography. The method also accounts for changes in topography over time due to erosion, sediment transport, and other external factors that could affect the topography of the ground, by updating its parameters and generating a new bed. The lattice Boltzmann method (LBM) is used to simulate both stationary and steady flow cases in a side-by-side comparison over bed topography generated by the proposed method and over a test case taken from an external source. The method, if successful, will be incorporated into the LBM program used in the testing phase, allowing automatic generation of topography for a given situation in future research and removing the need for bed data to be specified.
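
As a rough illustration of the FBM idea, the sketch below generates a 2-D fractional-Brownian-motion-like surface by spectral synthesis, which could serve as a synthetic bed elevation field. The grid size, Hurst exponent, and normalisation are assumptions, and this is not the authors' implementation.

```python
# Minimal sketch: power-law spectrum with random phases gives an FBM-like synthetic bed surface.
import numpy as np

def fbm_surface(n=256, hurst=0.7, seed=0):
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                                   # avoid division by zero at the DC component
    amplitude = k ** (-(hurst + 1.0))               # power-law spectral amplitude controlled by the Hurst exponent
    phase = rng.uniform(0, 2 * np.pi, (n, n))
    spectrum = amplitude * np.exp(1j * phase)
    surface = np.fft.ifft2(spectrum).real
    return (surface - surface.mean()) / surface.std()   # normalised bed elevation field

bed = fbm_surface()
print(bed.shape, bed.min(), bed.max())
```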

Keywords: bed topography, FBM, LBM, shallow water, simulations

Procedia PDF Downloads 77
2509 Framework for Detecting External Plagiarism from Monolingual Documents: Use of Shallow NLP and N-Gram Frequency Comparison

Authors: Saugata Bose, Ritambhra Korpal

Abstract:

The internet has increased copy-paste behaviour amongst students as well as researchers, leading to different levels of plagiarized documents. For this reason, much research has focused on detecting plagiarism automatically. In this paper, an initiative is discussed in which Natural Language Processing (NLP) techniques and supervised machine learning algorithms have been combined to detect plagiarized texts. The major emphasis is on constructing a framework that successfully detects external plagiarism in monolingual texts. To detect the plagiarism, an n-gram frequency comparison approach has been implemented to construct the model framework. The framework is based on 120 characteristics extracted during pre-processing of the documents using an NLP approach. Afterwards, filter metrics have been applied to select the most relevant characteristics, and a supervised classification learning algorithm has then been used to classify the documents into four levels of plagiarism. A confusion matrix was built to estimate the false positives and false negatives. Our plagiarism framework achieved a very high accuracy score.
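
To make the n-gram frequency comparison step concrete, the sketch below computes the containment of one document's word trigrams in another. The tokenizer, the choice of n = 3, and the single containment score are illustrative simplifications of the 120-feature pipeline described above, not the authors' code.

```python
# Illustrative word n-gram containment score between a suspicious and a source document.
from collections import Counter

def ngrams(text, n=3):
    tokens = text.lower().split()
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def containment(suspicious, source, n=3):
    s, src = ngrams(suspicious, n), ngrams(source, n)
    shared = sum(min(count, src[gram]) for gram, count in s.items())
    return shared / max(sum(s.values()), 1)   # fraction of the suspicious document's n-grams found in the source

print(containment("the quick brown fox jumps over the lazy dog",
                  "a quick brown fox jumps over one lazy dog"))
```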

Keywords: lexical matching, shallow NLP, supervised machine learning algorithm, word n-gram

Procedia PDF Downloads 341
2508 Workplace Development Programmes for Small and Medium-Sized Enterprises in Europe and Singapore: A Conceptual Study

Authors: Zhan Jie How

Abstract:

With the heightened awareness of workplace learning and its impact on improving organizational performance and developing employee competence, governments and corporations around the world are forced to intensify their cooperation to establish national workplace development programmes to guide these corporations in fostering engaging and collaborative workplace learning cultures. This conceptual paper aims to conduct a comparative study of existing workplace development programmes for small and medium-sized enterprises (SMEs) in Europe and Singapore, focusing primarily on the Swedish Production Leap, Finnish TEKES Liideri Programme, and Singapore SkillsFuture SME Mentors Programme. The study carries out a systematic review of the three workplace development programmes to examine the roles of external mentors or coaches in influencing the design and implementation of workplace learning strategies and practices in SMEs. Organizational, personal and external factors that promote or inhibit effective workplace mentorship are also scrutinized, culminating in a critical comparison and evaluation of the strengths and weaknesses of the aforementioned programmes. Based on the findings from the review and analyses, a heuristic conceptual framework is developed to illustrate the complex interrelationships among external workplace development programmes, internal learning and development initiatives instituted by the organization’s higher management, and employees' continuous learning activities at the workplace. The framework also includes a set of guiding principles that can be used as the basis for internal mediation between the competing perspectives of mentors and mentees (employers and employees of the organization) regarding workplace learning conditions, practices and their intended impact on the organization. The conceptual study provides a theoretical blueprint for future empirical research on organizational workplace learning and the impact of government-initiated workplace development programmes.

Keywords: employee competence, mentorship, organizational performance, workplace development programme, workplace learning culture

Procedia PDF Downloads 127
2507 The Effects of External Daminozide (ALAR) Application on Nutrient Contents in Memecik Olive Trees

Authors: Sahriye Sonmez, Salih Ulger, Mustafa Kaplan, Mustafa Karhan

Abstract:

The objective of this study was to investigate the effects of external ALAR application on nutrient contents in leaves and nodes in the 'on (bearing)' and 'off (non-bearing)' years of Memecik olive trees. For this purpose, 2000 mg L⁻¹ ALAR was externally applied to Memecik olive trees, and leaf and node samples were taken during the induction, initiation and differentiation periods in the 'on' and 'off' years. The nutrient contents (N, P, K, Ca, Mg, Fe, Mn, Zn and Cu) of the leaf and node samples were determined. The K, Ca, Mg, Fe, Mn, Zn and Cu contents were determined by atomic absorption spectrophotometry, N by the Kjeldahl procedure, and P by a spectrophotometric method. The results showed that the N, Ca, Mg, Fe, Mn, Zn and Cu contents were higher in the 'on' year than in the 'off' year, while the K contents were lower in the 'on' year than in the 'off' year; the P content did not differ. The N, Ca, Mg, Fe and Mn contents were higher in the leaf samples than in the node samples, except for K, while the P, Zn and Cu contents did not differ. The N, K, Ca, Fe, Mn, Zn and Cu contents were lowest during the initiation period, while the P content was highest in this period. The Mg content did not differ among the periods.

Keywords: bearing, differentiation period, induction period, initiation period, non bearing, olive

Procedia PDF Downloads 434
2506 Cyber Security Enhancement via Software Defined Pseudo-Random Private IP Address Hopping

Authors: Andre Slonopas, Zona Kostic, Warren Thompson

Abstract:

Obfuscation is one of the most useful tools for preventing network compromise. Previous research focused on obfuscating the network communications between external-facing edge devices. This work proposes the use of two edge devices, one external facing and one internal facing, which communicate via private IPv4 addresses in a software-defined pseudo-random IP hopping scheme. The methodology does not require additional IP addresses and/or resources to implement. Statistical analyses demonstrate that the hopping surface must contain at least 10³ IP addresses, with a broad standard deviation, to minimize the possibility of coincidence between monitored and communication IPs. Breaking the hopping algorithm requires a collection of at least 10⁶ samples, which for large hopping surfaces would take years to collect. The probability of dropped packets is controlled via memory buffers and the frequency of hops, and it can be reduced to levels acceptable for video streaming. This methodology provides a strong additional layer of security, well suited to information systems and supervisory control and data acquisition systems.
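
A toy sketch of the hopping idea follows: both endpoints derive the same pseudo-random private address for each time epoch from a shared seed, so the hop sequence never has to be exchanged and no extra addresses are coordinated at run time. The subnet, seed, and epoch scheme are assumptions for illustration, not the authors' implementation.

```python
# Illustrative pseudo-random private IPv4 hopping derived from a shared seed.
import hashlib
import ipaddress

SUBNET = ipaddress.ip_network("10.0.0.0/16")       # assumed private hopping surface (~65k addresses)

def hop_address(shared_seed: bytes, epoch: int) -> ipaddress.IPv4Address:
    digest = hashlib.sha256(shared_seed + epoch.to_bytes(8, "big")).digest()
    offset = int.from_bytes(digest[:4], "big") % SUBNET.num_addresses
    return SUBNET[offset]

for epoch in range(3):                              # both endpoints compute identical hops per epoch
    print(epoch, hop_address(b"pre-shared-key", epoch))
```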

Keywords: moving target defense, cybersecurity, network security, hopping randomization, software defined network, network security theory

Procedia PDF Downloads 168
2505 Immature Palm Tree Detection Using Morphological Filter for Palm Counting with High Resolution Satellite Image

Authors: Nur Nadhirah Rusyda Rosnan, Nursuhaili Najwa Masrol, Nurul Fatiha MD Nor, Mohammad Zafrullah Mohammad Salim, Sim Choon Cheak

Abstract:

Accurate inventories of oil palm planted areas are crucial for plantation management, as they impact the overall economy and production of oil. One of the technological advancements in the oil palm industry is semi-automated palm counting, which is replacing conventional manual palm counting via digitizing of aerial imagery. Most of the semi-automated palm counting methods that have been developed were limited to mature palms because of their ideal canopy size as represented in satellite imagery. Immature palms were therefore often left out, since their canopies are barely visible in satellite images. In this paper, an approach using a morphological filter and high-resolution satellite imagery is proposed to detect immature palm trees, making it possible to count the number of immature oil palms. The method begins by applying an erosion filter with an appropriate window size of 3 m to the high-resolution satellite image. The eroded image was further segmented using watershed segmentation to delineate immature palm tree regions. Local minimum detection was then used, because it is hypothesized that immature oil palm trees are located at local minima within an oil palm field in a grayscale image. The detection points generated from the local minima were displaced to the center of each immature oil palm region and thinned so that only one detection point remained to represent a tree. The performance of the proposed method was evaluated on three subsets with slopes ranging from 0 to 20° and different planting designs, i.e., straight and terrace. The proposed method achieved more than 90% accuracy when compared with the ground truth, with an overall F-measure score of up to 0.91.
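
The processing chain described above (erosion, watershed segmentation, local-minimum detection) can be sketched with standard image-processing libraries. The sketch below is illustrative only: the input file name, the pixel radius standing in for the 3 m window, and the thresholding choice are all assumptions rather than the authors' pipeline.

```python
# Illustrative erosion -> watershed -> local-minima chain for immature palm detection.
import numpy as np
from scipy import ndimage as ndi
from skimage import io, morphology, segmentation, filters

gray = io.imread("palm_block.tif", as_gray=True)            # assumed high-resolution grayscale subset
eroded = morphology.erosion(gray, morphology.disk(3))       # disk radius standing in for the 3 m window
markers, _ = ndi.label(eroded < filters.threshold_otsu(eroded))
labels = segmentation.watershed(eroded, markers)            # delineate candidate palm regions
minima = morphology.local_minima(eroded)                    # immature palms assumed to sit at local minima
count = np.count_nonzero(minima & (labels > 0))             # one point kept per detected region in practice
print("immature palm detections:", count)
```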

Keywords: immature palm count, oil palm, precision agriculture, remote sensing

Procedia PDF Downloads 57
2504 Marketing Strategy and Marketing Mix for Rural Tour Package in Bali: Case Study of Munduk

Authors: Made Darmiati, Ni Putu Evi Wijayanti, Ni Ketut Wiwiek Agustina, Putu Gde Arie Yudhistira, Marcel Hardono

Abstract:

The establishment of tourist villages has been a main concern for pro-poor tourism in Indonesia, especially in Bali, in order to create alternative tourist destinations. The case study of this research is Munduk, a tourist village located in Buleleng Regency, Bali Province. The number of tourist visits to Munduk was unstable from 2012 to 2016. Marketing strategy and the marketing mix are concepts suitable for application in Munduk, as the prime owner of trekking and other rural tour packages, to increase the number of visitors, particularly during the low season. The research aims to determine the internal factors (strengths and weaknesses) and external factors (opportunities and threats) impacting the number of tourist visits, so that an appropriate marketing strategy can be formulated for Munduk Tourist Village. Data were obtained through observation, interviews with stakeholders, a questionnaire administered to 100 participants, and documentation. The research uses descriptive qualitative methods and SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis of the internal and external factors impacting the level of tourist visits to Munduk Tourist Village in Buleleng Regency, Bali. Sampling was done using an accidental sampling technique to obtain the participants for analysing the results of the SWOT analysis. Assessment of the internal and external weights yielded scores of 1.84 and 1.84, respectively, placing Munduk in the first quadrant of the diagram, which corresponds to an S-O (Strengths-Opportunities) strategy. As the prime owner of trekking and other rural tour packages in the village, Munduk should maximise its strengths and seize available opportunities to design and package trekking and other rural tours, and then offer these packages to travel agents in Bali.

Keywords: marketing mix, marketing strategy, rural tourism, SWOT matrix

Procedia PDF Downloads 264
2503 An Investigation of Influential Factors in Adopting the Cloud Computing in Saudi Arabia: An Application of Technology Acceptance Model

Authors: Shayem Saleh ALresheedi, Lu Song Feng, Abdulaziz Abdulwahab M. Fatani

Abstract:

Cloud computing is an emerging concept in the technological sphere. Its development enables many applications to make information available online and on demand. It is becoming an essential element for businesses because of its ability to diminish the costs of IT infrastructure, and it is being adopted in Saudi Arabia. However, many factors affect its adoption, and several researchers in the field have overlooked the TAM model as a means of identifying the relevant factors and their impact on the adoption of cloud computing. This study focuses on evaluating the acceptability of cloud computing and analyzing the factors impacting it using the Technology Acceptance Model (TAM) in Saudi Arabia. It suggests a model to examine the influential factors of the TAM model along with the external factor of technical support in adopting cloud computing. The proposed model was tested through multiple hypotheses based on calculation tools and data collected from customers through questionnaires. The findings of the study show that the TAM model, along with external factors, can be applied in measuring the expected adoption of cloud computing. The study presents an investigation of the influential factors and further recommendations for adopting cloud computing in Saudi Arabia.
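
One hedged way to picture the hypothesis-testing step is an ordinary regression of adoption intention on the TAM constructs plus the technical-support factor. The file, column names, and use of OLS below are assumptions for illustration; the abstract does not specify the exact estimation technique used.

```python
# Illustrative only: TAM-style hypothesis test on an assumed survey dataset.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("tam_survey.csv")        # assumed columns: intention, usefulness, ease_of_use, tech_support
fit = smf.ols("intention ~ usefulness + ease_of_use + tech_support", data=survey).fit()
print(fit.summary())                          # significant positive coefficients would support the hypotheses
```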

Keywords: cloud computing, acceptability, adoption, determinants

Procedia PDF Downloads 174
2502 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution

Authors: Sanelisiwe Ndlovu

Abstract:

Cities have become testbeds for automation and for experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences, from health and education to personal relations. For instance, in healthcare, social robots are being implemented as tools to assist patients. Similarly, in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There has been a wealth of literature on the topic of personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. However, in today's technologically advanced world, a risk is that machines become more capable of accomplishing tasks that humans would otherwise do. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community. However, in the Smart City, human closeness is gradually diminishing. For instance, humans already engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as the Smart City. Accordingly, the contribution of this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and importance of relationality in handling Smart Cities and AI ethically. The proposed approach can thus be seen as a sub-Saharan African contribution to personhood and the growing AI debates, one that takes the reality of the interconnectedness of society seriously. It will also open up new opportunities to tackle old problems and use existing resources to confront new problems in the Fourth Industrial Revolution.

Keywords: smart city, artificial intelligence, personhood, community

Procedia PDF Downloads 185
2501 A Case for Ethics Practice under the Revised ISO 14001:2015

Authors: Reuben Govender, M. L. Woermann

Abstract:

The ISO 14001 management system standard was first published in 1996. It is a voluntary standard adopted by both private and public sector organizations globally. Adoption of the ISO 14001 standard at the corporate level is done to help manage business impacts on the environment e.g. pollution control. The International Organization for Standardization (ISO) revised the standard in 2004 and recently in 2015. The current revision of the standard appears to adopt a communitarian-type philosophy. The inclusion of requirements to consider external 'interested party' needs and expectations implies this philosophy. Therefore, at operational level businesses implementing ISO 14001 will have to consider needs and expectations beyond local laws. Should these external needs and expectations be included in the scope of the environmental management system, they become requirements to be complied with in much the same way as compliance to laws. The authors assert that the recent changes to ISO 14001 introduce an ethical dimension to the standard. The authors assert that business ethics as a discipline now finds relevance in ISO 14001 via contemporary stakeholder theory and discourse ethics. Finally, the authors postulate implications of (not) addressing these requirements before July 2018 when transition to the revised standard must be complete globally.

Keywords: business ethics, environmental ethics, ethics practice, ISO 14001:2015

Procedia PDF Downloads 241
2500 21st Century Business Dynamics: Acting Local and Thinking Global through eXtensible Business Reporting Language (XBRL)

Authors: Samuel Faboyede, Obiamaka Nwobu, Samuel Fakile, Dickson Mukoro

Abstract:

In the present dynamic business environment of corporate governance and regulation, financial reporting is an inevitable and extremely significant process for every business enterprise. Financial documents such as annual reports, quarterly reports, ad-hoc filings, and other statutory/regulatory reports provide vital information to investors and regulators, and establish trust and rapport between the internal and external stakeholders of an organization. Investors today are very demanding and place great emphasis on the authenticity, accuracy, and reliability of financial data. For many companies, the Internet plays a key role in communicating business information, internally to management and externally to stakeholders. Despite the high prominence attached to external reporting, it is disconnected in most companies, which generate their external financial documents manually, resulting in a high degree of errors and prolonged cycle times. Chief Executive Officers and Chief Financial Officers are increasingly susceptible to endorsing error-laden reports, late filing of reports, and non-compliance with regulatory acts. There is a lack of a common platform for managing the sensitive information in financial reports, both internally and externally. The Internet financial reporting language known as eXtensible Business Reporting Language (XBRL) continues to develop in the face of challenges and has now reached the point where many of its promised benefits are available. This paper looks at the emergence of this revolutionary twenty-first-century language of digital reporting. It posits that the world is on the brink of an Internet revolution that will redefine the 'business reporting' paradigm. The new Internet technology, XBRL, is already being deployed and used across the world. XBRL is an eXtensible Markup Language (XML) based information format that places self-describing tags around discrete pieces of business information. Once tags are assigned, it is possible to extract only the desired information, rather than having to download or print an entire document. XBRL is platform-independent: it will work on any current or recent operating system or computer and will interface with virtually any software. The paper concludes that corporate stakeholders and the government cannot afford to ignore XBRL. It therefore recommends that all act locally and think globally now by adopting XBRL, which is changing the face of worldwide business reporting.
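
A toy illustration of the tagging idea follows: once a fact carries a self-describing tag, a consumer can extract just that element instead of reading the whole report. The miniature instance document, namespace, and tag names below are invented for illustration and do not come from any actual taxonomy or filing.

```python
# Illustrative, invented XBRL-style instance: self-describing tags allow extraction of a single fact.
import xml.etree.ElementTree as ET

instance = """
<report xmlns:fin="http://example.com/fin">
  <fin:Revenue contextRef="FY2023" unitRef="USD" decimals="0">5000000</fin:Revenue>
  <fin:Assets contextRef="FY2023" unitRef="USD" decimals="0">12000000</fin:Assets>
</report>
"""
root = ET.fromstring(instance)
ns = {"fin": "http://example.com/fin"}
revenue = root.find("fin:Revenue", ns)          # pull out only the desired fact, not the whole document
print(revenue.text, revenue.attrib["contextRef"], revenue.attrib["unitRef"])
```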

Keywords: XBRL, financial reporting, internet, internal and external reports

Procedia PDF Downloads 263
2499 Experimental Study Analysis of Flow over Pickup Truck’s Cargo Area Using Bed Covers

Authors: Jonathan Rodriguez, Dominga Guerrero, Surupa Shaw

Abstract:

Automobiles are modeled in various forms, and they interact with the air when in motion. Aerodynamics is the study of such interactions, in which solid bodies affect the way air moves around them. The shape of a solid body affects the ease with which it moves through the air, so any additional freightage, or load, impacts its aerodynamics. It is important to transport people and cargo safely, yet despite various safety measures, there are a large number of vehicle-related accidents. This study explores the effects an automobile experiences with added cargo and covers. The addition of these items changes the original vehicle shape and the approved design for safe driving. This paper examines the effects of the changed vehicle shape and design via experimental testing conducted on a physical 1:27 scale model and a CAD model of an F-150 pickup truck, the most common pickup truck in the United States, with differently shaped loads and weights traveling at a constant speed. The additional freightage produces unwanted drag or lift, resulting in lower fuel efficiency and unsafe driving conditions. This study employs an adjustable external shell on the F-150 pickup truck to create a controlled aerodynamic geometry that combats the detrimental effects of additional freightage. The experiments use colored powder, which acts as a visual medium for the interaction of air with the vehicle, to highlight the impact of the additional freight on the automobile's external shell. This is complemented by simulation models in Altair CFD software of twelve cases regarding the effects of an added load on an F-150 pickup truck. This paper is an attempt toward standardizing the geometric design of the external shell, given the uniqueness of every load and its placement on the vehicle, while providing real-time data to be compared with simulation results from the existing literature.

Keywords: aerodynamics, CFD, freightage, pickup cover

Procedia PDF Downloads 142
2498 Effect of Damping on Performance of Magnetostrictive Vibration Energy Harvester

Authors: Mojtaba Ghodsi, Hamidreza Ziaifar, Morteza Mohammadzaheri, Payam Soltani

Abstract:

This article presents an analytical model to estimate the harvested power from a magnetostrictive cantilevered beam with tip excitation. Furthermore, the effects of internal and external damping on the harvested power are investigated. The magnetostrictive material in this harvester is Galfenol. In comparison to other popular smart materials like Terfenol-D, Galfenol has higher strength and machinability. First, a mechanical model of the Euler-Bernoulli beam is employed to calculate the deflection of the harvester. Then, the magneto-mechanical equation of Galfenol is combined with Faraday's law to calculate the generated voltage of the magnetostrictive cantilevered beam harvester. Finally, the beam model is incorporated into this combination. The results show that a 30×8.5×1 mm Galfenol cantilever beam harvester with an 80-turn pickup coil can generate up to 3.7 mV and 9 mW. Furthermore, a sensitivity analysis using the response surface method (RSM) shows that the harvested power is sensitive only to the internal damping coefficient.
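
For orientation, the Faraday's-law step can be sketched as follows: a sinusoidal flux change through an N-turn pickup coil induces a voltage, from which the power delivered to a resistive load follows. The flux amplitude, frequency, and load resistance below are placeholder numbers, not the Galfenol harvester's parameters.

```python
# Back-of-the-envelope Faraday's-law estimate with illustrative (not measured) parameters.
import math

N = 80                      # pickup coil turns (as in the abstract)
phi0 = 2e-7                 # assumed peak flux change per turn, Wb
freq = 60.0                 # assumed excitation frequency, Hz
R_load = 50.0               # assumed load resistance, ohm

omega = 2 * math.pi * freq
v_peak = N * omega * phi0                         # peak induced voltage for sinusoidal flux
p_avg = (v_peak / math.sqrt(2)) ** 2 / R_load     # average power into the load
print(f"V_peak = {v_peak * 1e3:.2f} mV, P_avg = {p_avg * 1e6:.2f} uW")
```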

Keywords: internal damping coefficient, external damping coefficient, euler-bernoulli, energy harvester, galfenol, magnetostrictive, response surface method

Procedia PDF Downloads 100
2497 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores

Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan

Abstract:

Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a specific feature vector of differing dimensions from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, such algorithms generally reduce image detail by pooling, an operation that overlooks details of great concern to forensic experts. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of naturally collected face images with known frontal ID photos of the same persons. Downscaling and manual handling were performed on the testing images. The results supported that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were examined to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled at distinguishing category features, whereas forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At the specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of objective methods of facial comparison and provides a novel method for human-machine collaboration in this field.
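
To make the score-level fusion idea concrete, the sketch below converts an automated-system score and an examiner score into log-likelihood ratios under assumed genuine and impostor score distributions and sums them. The Gaussian parameters and input scores are invented for illustration, not the paper's calibrated distributions.

```python
# Illustrative score-level fusion via summed log-likelihood ratios under assumed Gaussian score models.
from scipy.stats import norm

def llr(score, genuine=(0.8, 0.1), impostor=(0.3, 0.1)):
    # log P(score | same person) - log P(score | different persons), with assumed (mean, std) pairs
    return norm.logpdf(score, *genuine) - norm.logpdf(score, *impostor)

algo_score, expert_score = 0.72, 0.65
fused = llr(algo_score) + llr(expert_score)        # fusion at the score level
print("fused log-likelihood ratio:", round(fused, 2))
```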

Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics

Procedia PDF Downloads 113
2496 Monitoring and Evaluation in Community-Based Tourism: An Analysis and Model

Authors: Ivan Gunass Govender, Andrea Giampiccoli

Abstract:

A developmental state should use community engagement to facilitate socio-economic development for disadvantaged groups and individual members of society through empowerment, social justice, sustainability, and self-reliance. In this regard, community-based tourism (CBT), as a growing market, should be an indigenous effort aided by external facilitation. Since this form of tourism presents its own preconditions, characteristics, and challenges, it could be guided by the engagement of higher education institutions. In particular, the facilitation should not only serve to assist community members to reach their own goals, but should also focus on learning through knowledge creation and sharing with the engagement of higher education institutions. While the increased relevance of CBT has produced various CBT manuals (or handbooks/guidelines) aimed at 'teaching' and assisting various entities in CBT development, this research aims to analyse the current monitoring and evaluation (M&E) manuals and thereafter propose an M&E model for CBT. All too often, effective monitoring is not carried out, risking the long-term sustainability and improvement of CBT ventures. The proposed model will therefore also consider some inputs external to the tourism field, related to local economic development (LED) matters, from a previously proposed development monitoring and evaluation system framework. M&E should be seen as a fundamental component of any CBT initiative, and the whole CBT intervention should be evaluated. In this context, M&E in CBT should go beyond strictly 'numerical' economic matters and should be understood within a holistic view of development. In addition, M&E in CBT should not consider issues in separate 'compartments' such as tourists, tourism attractions, CBT owners/participants, and stakeholder engagement, but as interdependent components of a macro-ecosystem. Finally, the external facilitation process should be structured in a way that promotes community self-reliance in both the intervention and the M&E process. The research attempts to propose an M&E model for CBT so as to enhance the possibilities of long-term growth and success of CBT through effective collaborations with key stakeholders.

Keywords: community-based tourism, community-engagement, monitoring and evaluation, stakeholders

Procedia PDF Downloads 281
2495 Advantages of Neural Network Based Air Data Estimation for Unmanned Aerial Vehicles

Authors: Angelo Lerro, Manuela Battipede, Piero Gili, Alberto Brandl

Abstract:

Redundancy requirements for UAVs (Unmanned Aerial Vehicles) are hard to satisfy due to the generally restricted amount of available space and allowable weight for the aircraft systems, limiting their exploitation. Essential equipment such as the Air Data, Attitude and Heading Reference System (ADAHRS) requires several external probes to measure significant data such as the angle of attack or the sideslip angle. Previous research focused on the analysis of a patented technology named Smart-ADAHRS (Smart Air Data, Attitude and Heading Reference System) as an alternative method for obtaining reliable and accurate estimates of the aerodynamic angles. This solution is based on an innovative sensor fusion algorithm implementing soft computing techniques, and it yields a simplified inertial and air data system with fewer external devices. In fact, only one external source of dynamic and static pressure is needed. This paper focuses on the benefits that would be gained by implementing this system in UAV applications. Simplifying the entire ADAHRS architecture reduces the overall cost while improving safety performance. Smart-ADAHRS has currently reached Technology Readiness Level (TRL) 6. Real flight tests took place on an ultralight aircraft equipped with a suitable Flight Test Instrumentation (FTI) system. The output of the algorithm using the flight test measurements demonstrates the capability of this fusion algorithm to embed multiple physical and virtual sensors in a single device. Any source of dynamic and static pressure can be integrated with this system, giving a significant improvement in terms of versatility.
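
The virtual-sensor idea can be sketched as a small neural network that maps inertial measurements plus one dynamic and one static pressure to an estimated angle of attack. The data file, feature layout, and network size below are assumptions for illustration, not the patented Smart-ADAHRS algorithm.

```python
# Illustrative neural-network virtual sensor for angle-of-attack estimation on assumed flight-test data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

data = np.load("flight_test_log.npz")              # assumed arrays: features (inertial + pressures), alpha (AoA)
X_train, X_test, y_train, y_test = train_test_split(data["features"], data["alpha"], test_size=0.2)

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000)
net.fit(X_train, y_train)
print("AoA estimation R^2 on held-out data:", net.score(X_test, y_test))
```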

Keywords: aerodynamic angles, air data system, flight test, neural network, unmanned aerial vehicle, virtual sensor

Procedia PDF Downloads 205
2494 The Mental Workload of ICU Nurses in Performing Human-Machine Tasks: A Cross-sectional Survey

Authors: Yan Yan, Erhong Sun, Lin Peng, Xuchun Ye

Abstract:

Aims: The present study aimed to explore Intensive Care Unit (ICU) nurses' mental workload (MWL) and the factors associated with it when performing human-machine tasks. Background: A wide range of emerging technologies has penetrated the field of health care, and ICU nurses are facing a dramatic increase in nursing human-machine tasks. However, there is still a paucity of literature reporting on the general MWL of ICU nurses performing human-machine tasks and the associated influencing factors. Methods: A cross-sectional survey was employed. The data were collected from January to February 2021 from 9 tertiary hospitals in 6 provinces (Shanghai, Gansu, Guangdong, Liaoning, Shandong, and Hubei). Two-stage sampling was used to recruit eligible ICU nurses (n=427). The data were collected with an electronic questionnaire comprising sociodemographic characteristics and measures of MWL, self-efficacy, system usability, and task difficulty. Univariate analysis, two-way analysis of variance (ANOVA), and a linear mixed model were used for data analysis. Results: Overall, the mental workload of ICU nurses in performing human-machine tasks was medium (score 52.04 on a 0-100 scale). Among the typical nursing human-machine tasks selected, the MWL of ICU nurses in completing first aid and life support tasks ('Using a defibrillator to defibrillate' and 'Use of ventilator') was significantly higher than for the others (p < .001). ICU nurses' MWL in performing human-machine tasks was also associated with age (p = .001), professional title (p = .002), years of working in the ICU (p < .001), willingness to actively study emerging technology (p = .006), task difficulty (p < .001), and system usability (p < .001). Conclusion: The MWL of ICU nurses is at a moderate level in the context of a rapid increase in nursing human-machine tasks. However, there are significant differences in MWL when performing different types of human-machine tasks, and MWL can be influenced by a combination of factors. Nursing managers need to develop intervention strategies in multiple ways. Implications for practice: Multidimensional approaches are required to help nurses perform human-machine tasks better, including enhancing nurses' willingness to learn emerging technologies actively, developing training strategies that vary with the task, and identifying obstacles in the process of human-machine system interaction.
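
For readers unfamiliar with the linear mixed model step, the sketch below fits workload scores with a random intercept per nurse and with task type, task difficulty, and system usability as fixed effects. The data file and column names are assumptions for illustration, not the study's dataset.

```python
# Illustrative linear mixed model: repeated workload ratings per nurse across human-machine tasks.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("icu_mwl.csv")    # assumed columns: mwl, task, difficulty, usability, nurse_id
mixed = smf.mixedlm("mwl ~ C(task) + difficulty + usability", df, groups=df["nurse_id"]).fit()
print(mixed.summary())             # random intercept per nurse; fixed effects test the associations
```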

Keywords: mental workload (MWL), nurse, ICU, human-machine tasks, cross-sectional study, linear mixed model, China

Procedia PDF Downloads 82
2493 Earnings Management from Taiwan Gisa Firms

Authors: An-an Chiu, Shaio Yan Huang, Ling-Na Chen, Wei-Hua Lin

Abstract:

Research has primarily focused on listed companies; less has been done regarding small and medium-sized enterprises. With the authorities' support, the Taipei Exchange (TPEx) started the Go Incubation Board for Startup and Acceleration Firms (GISA) in January 2014. This platform is designed to help small innovative companies grow and enter the capital market in the future. This research yields insight into earnings management activities around seasoned equity offerings (SEOs) by Taiwan's GISA firms and into the effectiveness of external corporate governance. Data for the study come from the GISA Market Observation Post System from January 2014 to December 2016. The results show that GISA firms are prone to upward accrual-based earnings management during SEOs to avoid long-term negative consequences. In particular, firms with paid-in capital of more than NT$30 million, firms with higher fundraising amounts, and smaller firms tend to increase discretionary accruals. Finally, consistent with prior literature, CPA firms effectively serve as external corporate governance mechanisms in mitigating earnings management.
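
As background, accrual-based earnings management is commonly proxied by discretionary accruals from a Jones-type regression, and the sketch below shows that step on an assumed panel of firm-year data. The file, column names, and use of a simple pooled OLS are illustrative and not necessarily the authors' specification.

```python
# Illustrative Jones-type estimation of discretionary accruals on assumed firm-year data
# (all regressors already scaled by lagged total assets).
import pandas as pd
import statsmodels.formula.api as smf

fin = pd.read_csv("gisa_financials.csv")           # assumed columns: firm, total_accruals, inv_assets, d_revenue, ppe
model = smf.ols("total_accruals ~ inv_assets + d_revenue + ppe", data=fin).fit()
fin["discretionary_accruals"] = model.resid        # residual used as the earnings-management proxy
print(fin[["firm", "discretionary_accruals"]].head())
```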

Keywords: GISA, earnings management, CPA, seasoned equity offerings

Procedia PDF Downloads 119
2492 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at the unit operation level, thus limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The different levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all the intensification options can be described by their enhancement. The objective of the current work is therefore the generation of numerous process alternatives based on phenomena, and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is decomposed into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties/drawbacks of the current process or can enhance the effectiveness of the process, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard the combinations that are meaningless. For example, phase change phenomena need the co-presence of energy transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, which are formed by a list of phenomena for each function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the highest product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical or biochemical process because of its generic nature.
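
A minimal sketch of the combinatorial generation and screening step might look as follows; the phenomena names, the single screening rule, and the binary encoding are illustrative stand-ins for the much richer knowledge base described above.

```python
# Illustrative generation and screening of phenomena combinations, encoded as binaries for model generation.
from itertools import combinations

phenomena = ["mixing", "reaction", "vapour-liquid equilibrium", "phase change", "energy transfer"]

def feasible(combo):
    # Example screening rule from the text: a phase change requires energy transfer to be present.
    if "phase change" in combo and "energy transfer" not in combo:
        return False
    return True

options = [c for r in range(2, 4) for c in combinations(phenomena, r) if feasible(c)]
binaries = [[int(p in c) for p in phenomena] for c in options]   # 1/0 vectors passed to model generation
print(len(options), "feasible phenomena combinations")
```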

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 219
2491 Analyzing the Commercialization of New Technology

Authors: Wen-Hsiang Lai, Mei-Wen Chen

Abstract:

In the face of developing new technologies, identifying potential new technological products and suitable markets is important. Since laser technology is widely applied in many industries, this study explores the commercialization of laser technology. Based on a literature review and industry analysis, this study discusses the factors influencing consumers' purchase intention and tries to find a new market direction in which to develop laser technology. This study adopts a new product adoption model as the research framework and uses the three variables 'Consumer characteristics', 'Perception of product attributes', and 'External environment' to discuss the purchase intention of consumers, who are physicians and owners in the medical cosmetics business. This study finds that within the major variable 'Consumer characteristics', the sub-variables 'Personality', 'Knowledge of product', 'Perceived risk', and 'Motivation' are significantly related to consumers' purchase intention. Within the major variable 'Perception of product attributes', the sub-variables 'Brand' and 'Measure of manufacture country' are the key factors that affect consumers' purchase intention. Finally, within the major variable 'External environment', the sub-variables 'Time' and 'Price' have a significant impact on consumers' purchase intention.

Keywords: technology commercialization, new product adoption, consumer’s purchase intention, laser technology

Procedia PDF Downloads 176
2490 Optimized Electron Diffraction Detection and Data Acquisition in Diffraction Tomography: A Complete Solution by Gatan

Authors: Saleh Gorji, Sahil Gulati, Ana Pakzad

Abstract:

Continuous electron diffraction tomography, also known as microcrystal electron diffraction (MicroED) or three-dimensional electron diffraction (3DED), is a powerful technique which, in combination with cryo-electron microscopy (cryo-EM), can provide atomic-scale 3D information about the crystal structure and composition of different classes of crystalline materials such as proteins, peptides, and small molecules. Unlike the well-established X-ray crystallography method, 3DED does not require large single crystals and can collect accurate electron diffraction data from crystals as small as 50–100 nm. This is a critical advantage, as growing larger crystals, as required by X-ray crystallography methods, is often very difficult, time-consuming, and expensive. In most cases, specimens studied via the 3DED method are electron-beam sensitive, which means there is a limit on the maximum electron dose one can use to collect the data required for a high-resolution structure determination. Therefore, collecting data using a conventional scintillator-based, fiber-coupled camera brings additional challenges. This is because of the inherent noise introduced during the electron-to-photon conversion in the scintillator and the transfer of light via the fibers to the sensor, which results in a poor signal-to-noise ratio and requires relatively high, and commonly specimen-damaging, electron dose rates, especially for protein crystals. As in other cryo-EM techniques, damage to the specimen can be mitigated if a direct detection camera is used, which provides a high signal-to-noise ratio at low electron doses. In this work, we have used two classes of such detectors from Gatan, namely the K3® camera (a monolithic active pixel sensor) and Stela™ (which utilizes DECTRIS hybrid-pixel technology), to address this problem. The K3 is an electron counting detector optimized for low-dose applications (like structural biology cryo-EM), and Stela is also a counting electron detector but optimized for diffraction applications with high speed and high dynamic range. Lastly, data collection workflows, including crystal screening, microscope optics setup (for imaging and diffraction), stage height adjustment at each crystal position, and tomogram acquisition, are another challenge of the 3DED technique. Traditionally, this has all been done manually or in a partly automated fashion using open-source software and scripting, requiring long hours on the microscope (extra cost) and extensive user interaction with the system. We have recently introduced Latitude® D in DigitalMicrograph® software, which is compatible with all pre- and post-energy-filter Gatan cameras and enables 3DED data acquisition in an automated and optimized fashion. Higher-quality 3DED data enable structure determination with higher confidence, while automated workflows allow these experiments to be completed considerably faster than before. Using multiple examples, this work will demonstrate how direct detection electron counting cameras enhance 3DED results (from 3 Å to better than 1 Å) for protein and small molecule structure determination. We will also show how Latitude D software facilitates collecting such data in an integrated and fully automated user interface.

Keywords: continuous electron diffraction tomography, direct detection, diffraction, Latitude D, Digitalmicrograph, proteins, small molecules

Procedia PDF Downloads 84
2489 Second Order Cone Optimization Approach to Two-stage Network DEA

Authors: K. Asanimoghadam, M. Salahi, A. Jamalian

Abstract:

Data envelopment analysis is an approach to measure the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most data envelopment analysis models. Moreover, the inputs and outputs of decision-making units are usually considered desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this work, we study the evaluation of the efficiency of two-stage decision-making units, where some outputs are undesirable, using two non-radial models, the SBM and ASBM models. We formulate the nonlinear ASBM model as a second-order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches for two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to that of the SBM model, and in the internal evaluation, the ASBM model is more flexible than the SBM model.
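
The reformulation target can be illustrated generically: a second-order cone program minimises a linear objective subject to norm (cone) constraints and is handled by standard conic solvers. The toy problem below only shows that problem class in code; it is not the ASBM network-DEA model, and all data are random placeholders.

```python
# Generic second-order cone program (SOCP) solved with CVXPY; illustrative only, not the ASBM model.
import cvxpy as cp
import numpy as np

np.random.seed(0)
A, b, c = np.random.randn(5, 3), np.random.randn(5), np.random.randn(3)

x = cp.Variable(3)
t = cp.Variable()
constraints = [cp.norm(A @ x - b, 2) <= t, x >= 0]      # second-order cone constraint
problem = cp.Problem(cp.Minimize(c @ x + t), constraints)
problem.solve()
print("status:", problem.status, "objective:", round(problem.value, 3))
```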

Keywords: network DEA, conic optimization, undesirable output, SBM

Procedia PDF Downloads 179
2488 Eosinopenia: Marker for Early Diagnosis of Enteric Fever

Authors: Swati Kapoor, Rajeev Upreti, Monica Mahajan, Abhaya Indrayan, Dinesh Srivastava

Abstract:

Enteric fever is caused by the gram-negative bacilli Salmonella typhi and paratyphi. It is associated with high morbidity and mortality worldwide. Timely initiation of treatment is a crucial step in the prevention of complications. Cultures of body fluids are diagnostic but not always conclusive or practically feasible in most centers; moreover, waiting for culture results delays the initiation of treatment. Serological tests lack diagnostic value. Blood counts can offer a promising option for diagnosis. A retrospective study to determine the relevance of leucopenia and eosinopenia was conducted on 203 culture-proven enteric fever patients and 159 culture-proven non-enteric fever patients in a tertiary care hospital in New Delhi. Patient details were retrieved from the electronic medical records section of the hospital. Absolute eosinopenia was defined as an absolute eosinophil count (AEC) of less than 40/mm³ (normal range: 40-400/mm³) measured on an LH-750 Beckman Coulter automated analyser. Leucopenia was defined as a total leucocyte count (TLC) of less than 4 x 10⁹/L. Blood cultures were done using the BacT/ALERT FA Plus automated blood culture system before the first antibiotic dose was given. Case and control groups were compared using the Pearson chi-square test. It was observed that an AEC of 0-19/mm³ was a significant finding (p < 0.001) in enteric fever patients, whereas leucopenia was not (p = 0.096). Using Receiver Operating Characteristic (ROC) curves, it was observed that patients with both AEC < 14/mm³ and TLC < 8 x 10⁹/L had a 95.6% chance of being diagnosed with enteric fever and only a 4.4% chance of being diagnosed with non-enteric fever. This result was highly significant (p < 0.001). This association of AEC and TLC in the enteric fever patients of this study is very useful and can support early initiation of treatment in clinically suspected enteric fever patients.
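
The ROC step can be pictured with a few lines of code: fit a simple classifier on AEC and TLC and compute the area under the ROC curve. The data file and column names are assumptions, and the cut-points reported above (AEC < 14/mm³, TLC < 8 x 10⁹/L) are the study's own results, not something this sketch reproduces.

```python
# Illustrative ROC evaluation of AEC and TLC as predictors of culture-proven enteric fever.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_csv("fever_counts.csv")               # assumed columns: aec, tlc, enteric (0/1 culture result)
clf = LogisticRegression().fit(df[["aec", "tlc"]], df["enteric"])
probs = clf.predict_proba(df[["aec", "tlc"]])[:, 1]
print("AUC:", round(roc_auc_score(df["enteric"], probs), 3))
```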

Keywords: absolute eosinopenia, absolute eosinophil count, enteric fever, leucopenia, total leucocyte count

Procedia PDF Downloads 160
2487 Resale Housing Development Board Price Prediction Considering Covid-19 through Sentiment Analysis

Authors: Srinaath Anbu Durai, Wang Zhaoxia

Abstract:

Twitter sentiment has been used as a predictor of price values or trends in both the stock market and the housing market. The pioneering works in this stream of research drew upon behavioural economics to show that sentiment or emotions impact economic decisions. The latest works in this stream focus on the algorithm used rather than on the data used. A literature review of works in this stream through the lens of the data used shows that there is a paucity of work considering the impact of sentiment caused by an external factor on either the stock or the housing market. This is despite an abundance of work in behavioural economics showing that sentiment or emotions caused by an external factor impact economic decisions. To address this gap, this research studies the impact of Twitter sentiment pertaining to the Covid-19 pandemic on resale Housing Development Board (HDB) apartment prices in Singapore. It leverages SNSCRAPE to collect tweets pertaining to Covid-19 for sentiment analysis, the lexicon-based tools VADER and TextBlob for sentiment analysis, Granger causality to examine the relationship between Covid-19 cases and the sentiment score, and neural networks as prediction models. Twitter sentiment pertaining to Covid-19 as a predictor of HDB prices in Singapore is studied in comparison with the traditional predictors of housing prices, i.e., structural and neighbourhood characteristics. The results indicate that using Twitter sentiment pertaining to Covid-19 leads to better prediction than using only the traditional predictors, and that it performs better as a predictor than two of the traditional predictors. Hence, Twitter sentiment pertaining to an external factor should be considered as important as the traditional predictors. This paper demonstrates real-world economic applications of sentiment analysis of Twitter data.
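
The sentiment and causality steps described above can be sketched as follows: score tweets with VADER, aggregate the compound score by day, and run a Granger-causality test against daily Covid-19 case counts. The file names, column names, and lag order of 7 are assumptions for illustration.

```python
# Illustrative VADER scoring plus Granger-causality test on assumed tweet and case-count files.
import pandas as pd
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from statsmodels.tsa.stattools import grangercausalitytests

tweets = pd.read_csv("covid_tweets.csv", parse_dates=["date"])      # assumed columns: date, text
analyzer = SentimentIntensityAnalyzer()
tweets["compound"] = tweets["text"].apply(lambda t: analyzer.polarity_scores(t)["compound"])
daily = tweets.assign(day=tweets["date"].dt.normalize()).groupby("day")["compound"].mean()

cases = pd.read_csv("covid_cases.csv", parse_dates=["date"]).set_index("date")["cases"]  # assumed daily counts
aligned = pd.concat([daily, cases], axis=1).dropna()
grangercausalitytests(aligned, maxlag=7)    # tests whether the second column (cases) Granger-causes sentiment
```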

Keywords: sentiment analysis, Covid-19, housing price prediction, tweets, social media, Singapore HDB, behavioral economics, neural networks

Procedia PDF Downloads 91
2486 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with the global annual prevalence of cocaine use ranging from 0.3% to 0.4% of the adult population aged 15–64 years. The growing trend of cocaine abuse and drug crime is a great concern; urine testing has therefore become an important noninvasive sampling approach, as cocaine and its metabolites (COCs) are usually present in urine at high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination. Especially in gathered drug-taking cases, the pretreatment step becomes more tedious and time-consuming. Developing a sensitive, rapid, and high-throughput method for the detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists, and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for the quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which gives them the ability to specifically adsorb COCs. This kind of magnetic particle facilitates the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, handling 32 samples in one batch within 40 min. Optimization of the preparation procedure for the magnetic nanoparticles was explored, and the performance of the magnetic nanoparticles was characterized by scanning electron microscopy, vibrating sample magnetometry, and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was validated. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective way to analyze trace cocaine and cocaine metabolites in human urine.

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 185
2485 Enhancing Strategic Counter-Terrorism: Understanding How Familial Leadership Influences the Resilience of Terrorist and Insurgent Organizations in Asia

Authors: Andrew D. Henshaw

Abstract:

The research examines the influence of familial and kinship-based leadership on the resilience of politically violent organizations. Organizations of this type frequently fight in the same conflicts, though they are called 'terrorist' or 'insurgent' depending on the political foci of the time, and thus different approaches are used to combat them. The research considers them correlated phenomena with significant overlap and identifies strengths and vulnerabilities in resilience processes. The research employs paired case studies to examine resilience in organizations under significant external pressure, and does so by measuring three variables. 1. Organizational robustness in terms of leadership and governance. 2. Bounce-back response efficiency to external pressures and adaptation to endogenous and exogenous shock. 3. Perpetuity of operational and attack capability, and political legitimacy. The research makes three hypotheses. First, familial/kinship leadership groups have a significant effect on organizational resilience in terms of informal operations. Second, non-familial/kinship organizations suffer from heightened security transaction costs and social economics surrounding recruitment, retention, and replacement. Third, resilience in non-familial organizations likely stems from critical external supports such as state sponsorship or powerful patrons, rather than from organic resilience dynamics. The case studies pair familial organizations with non-familial organizations. Set 1: The Haqqani Network (HQN) - Pair: Lashkar-e-Toiba (LeT). Set 2: Jemaah Islamiyah (JI) - Pair: The Abu Sayyaf Group (ASG). Case studies were selected based on three requirements: contrasting governance types, exposure to significant external pressures, and geographical similarity. The case study sets were examined over 24 months following periods of significantly heightened operational activity. This enabled empirical measurement of the variables as substantial external pressures came into force. The rationale for the research is clear: nearly all organizations have some nexus of familial interconnectedness. Examining familial leadership networks does not provide further understanding of how terrorism and insurgency originate; however, the central focus of the research does address how they persist. The sparse attention to this in the existing literature presents an unexplored yet important area of security studies. Furthermore, social capital in familial systems is largely automatic and organic, given at birth or through kinship. It reduces security vetting costs for recruits, fighters, and supporters, which lowers liabilities and entry costs while raising organizational efficiency and exit costs. A better understanding of these processes is needed to turn strengths into weaknesses. The outcomes and implications of the research have critical relevance to future operational policy development. Increased clarity about internal trust dynamics, social capital, and power flows is essential to fracturing and manipulating kinship nexuses. This is highly valuable to external pressure mechanisms such as counter-terrorism, counterinsurgency, and strategic intelligence methods that seek to penetrate, manipulate, degrade, or destroy the resilience of politically violent organizations.

Keywords: Counterinsurgency (COIN), counter-terrorism, familial influence, insurgency, intelligence, kinship, resilience, terrorism

Procedia PDF Downloads 297
2484 Improving the Frequency Response of a Circular Dual-Mode Resonator with a Reconfigurable Bandwidth

Authors: Muhammad Haitham Albahnassi, Adnan Malki, Shokri Almekdad

Abstract:

In this paper, a method for reconfiguring the bandwidth of a circular dual-mode resonator is presented. The method concerns the optimized geometry of a structure that may be used to host the tuning elements, which are typically RF (Radio Frequency) switches; the tuning elements themselves, and their performance during tuning, are not the focus of this paper. The designed resonator reconfigures its fractional bandwidth by adjusting the inter-coupling level between the degenerate modes while simultaneously improving its response by adjusting the external-coupling level, keeping the center frequency fixed. The inter-coupling level is adjusted by changing the dimensions of the perturbation element, while the external-coupling level is adjusted by changing one of the feeder dimensions. The design was arrived at via optimization. Simulation and measurement results of the designed and implemented filters are in good agreement and show clear improvements in return loss and in the stability of the center frequency.
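
The interplay described above, where the perturbation element sets the inter-coupling and a feeder dimension sets the external coupling, follows the standard coupled-resonator filter relations: at a fixed center frequency, the required inter-resonator coupling coefficient scales with the fractional bandwidth, while the required external quality factor scales inversely with it, so both must be retuned together to keep the return loss flat. The short Python sketch below illustrates these textbook relations for a two-pole filter; the Butterworth prototype values, the 2.4 GHz center frequency, and the function name are illustrative assumptions and are not taken from this paper.

```python
# Illustrative sketch (not from the paper): classical coupled-resonator filter
# relations linking a target fractional bandwidth (FBW) to the inter-resonator
# coupling coefficient k12 and the external quality factor Qe of a two-pole
# (dual-mode) filter. Prototype values are the standard 2nd-order Butterworth ones.
import math

g0, g1, g2, g3 = 1.0, 1.4142, 1.4142, 1.0   # lowpass prototype elements (assumed)

def coupling_for_bandwidth(fbw: float) -> tuple:
    """Return (k12, Qe) needed to realise a given fractional bandwidth."""
    k12 = fbw / math.sqrt(g1 * g2)           # inter-coupling between the degenerate modes
    qe = g0 * g1 / fbw                       # external coupling to the feed line
    return k12, qe

f0 = 2.4e9  # assumed center frequency in Hz, held fixed during reconfiguration
for fbw in (0.02, 0.04, 0.06):
    k12, qe = coupling_for_bandwidth(fbw)
    print(f"FBW = {fbw:.0%}:  k12 = {k12:.4f},  Qe = {qe:.1f},  "
          f"absolute bandwidth = {fbw * f0 / 1e6:.0f} MHz")
```

Doubling the fractional bandwidth doubles the required inter-coupling and halves the required external quality factor, which is consistent with adjusting the perturbation element and a feeder dimension jointly rather than the perturbation element alone.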

Keywords: dual-mode resonators, perturbation theory, reconfigurable filters, software defined radio, cognitive radio

Procedia PDF Downloads 148
2483 Analysis of Long-Term Response of Seawater to Change in CO₂, Heavy Metals and Nutrients Concentrations

Authors: Igor Povar, Catherine Goyet

Abstract:

Seawater is subject to multiple external stressors (ES), including rising atmospheric CO₂ and ocean acidification, global warming, atmospheric deposition of pollutants, and eutrophication, which deeply alter its chemistry, often on a global scale and, in some cases, to a degree significantly exceeding that of the historical and recent geological record. In ocean systems, micro- and macronutrients, heavy metals, and phosphorus- and nitrogen-containing components exist in different forms depending on the concentrations of various other species, organic matter, the types of minerals, pH, etc. The major limitation to a stricter assessment of the ES to oceans, such as pollutants (atmospheric greenhouse gases, heavy metals, and nutrients such as nitrates and phosphates), is the lack of a theoretical approach that could predict the ocean's resistance to multiple external stressors. To assess the abovementioned ES, the research has applied and developed the buffer theory approach and the theoretical expressions of formal chemical thermodynamics for ocean systems treated as heterogeneous aqueous systems. Thermodynamic expressions of complex chemical equilibria, involving acid-base, complex-formation, and mineral equilibria, have been deduced. This approach combines thermodynamic relationships with original mass balance constraints in which the solid phases are explicitly expressed. The ocean's sensitivity to different external stressors and changes in driving factors is considered in terms of derived buffering capacities, or buffer factors, for heterogeneous systems. Our investigations show that heterogeneous aqueous systems, such as oceans and seas, manifest buffer properties towards all of their components, not only towards pH as has been known so far, for example with respect to carbon dioxide, carbonates, phosphates, Ca2+, Mg2+, heavy metal ions, etc. The derived expressions make it possible to attribute changes in ocean chemical composition to different pollutants. These expressions are also useful for improving current atmosphere-ocean-marine biogeochemistry models. The major research questions to which the research responds are: (i) What kind of contamination is most harmful for the future ocean? (ii) What are the heterogeneous chemical processes of heavy metal release from sediments and minerals, and what is their impact on the ocean's buffer action? (iii) What will be the long-term response of the coastal ocean to the oceanic uptake of anthropogenic pollutants? (iv) How will the ocean's resistance change in terms of future complex chemical processes and buffer capacities, and how will it respond to external (anthropogenic) perturbations? The ocean's buffer capacities towards its main components are recommended as parameters to be included when determining the most important ocean factors that define the response of the ocean environment to increasing technogenic loads. The deduced thermodynamic expressions are valid for any combination of chemical composition, with any of the species contributing to the total concentration serving as an independent state variable.
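
To make the idea of buffer capacities towards components other than pH more concrete, the following minimal Python sketch (an illustration under stated assumptions, not the authors' thermodynamic formalism) solves a stripped-down seawater carbonate system for pH and then estimates, by finite differences, how strongly pH resists an increase in dissolved inorganic carbon at constant alkalinity. The equilibrium constants are rough seawater values, and the borate, phosphate, and mineral equilibria that the full heterogeneous treatment would include are omitted.

```python
# Minimal numerical sketch under stated assumptions: a stripped-down seawater
# carbonate system used to show a "buffer factor" as the resistance of pH to a
# change in a total concentration (here dissolved inorganic carbon, DIC) at
# constant total alkalinity. Borate, phosphate, silicate, minerals, and ion
# pairing are ignored; K1, K2, KW are rough seawater values near 25 degC, S = 35.
from math import log10

K1 = 1.4e-6    # first dissociation constant of carbonic acid (approximate)
K2 = 1.1e-9    # second dissociation constant (approximate)
KW = 6.0e-14   # ion product of water in seawater (approximate)

def alkalinity(h: float, dic: float) -> float:
    """Carbonate alkalinity plus water terms for a given [H+] and DIC (mol/kg)."""
    denom = h * h + K1 * h + K1 * K2
    hco3 = dic * K1 * h / denom
    co3 = dic * K1 * K2 / denom
    return hco3 + 2.0 * co3 + KW / h - h

def solve_h(dic: float, ta: float) -> float:
    """Bisect for [H+] such that the computed alkalinity matches the target TA."""
    lo, hi = 1e-10, 1e-4               # brackets pH 10 .. 4, wide enough for seawater
    for _ in range(200):
        mid = (lo * hi) ** 0.5         # geometric midpoint: [H+] spans orders of magnitude
        if alkalinity(mid, dic) > ta:  # alkalinity decreases monotonically with [H+]
            lo = mid                   # still too alkaline -> need more H+
        else:
            hi = mid
    return (lo * hi) ** 0.5

dic, ta = 2.0e-3, 2.3e-3               # typical open-ocean DIC and alkalinity, mol/kg
ph = -log10(solve_h(dic, ta))

# Finite-difference buffer factor: how much DIC must be added (e.g. by CO2 uptake)
# per unit drop in pH at constant alkalinity. A larger value means better buffered.
d_dic = 1e-6
ph_perturbed = -log10(solve_h(dic + d_dic, ta))
buffer_factor = d_dic / (ph - ph_perturbed)

print(f"pH = {ph:.3f}; buffer factor ~ {buffer_factor:.2e} mol/kg per pH unit")
```

An analogous finite-difference ratio can be formed with respect to any other total concentration (phosphate, Ca2+, a heavy metal ion), which is the sense in which the abstract speaks of buffer properties towards all components, not only pH.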

Keywords: atmospheric greenhouse gas, chemical thermodynamics, external stressors, pollutants, seawater

Procedia PDF Downloads 124