Search results for: data interpolating empirical orthogonal function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29462

25832 Exploring the Contribution of Dynamic Capabilities to a Firm's Value Creation: The Role of Competitive Strategy

Authors: Mona Rashidirad, Hamid Salimian

Abstract:

Dynamic capabilities, among the most important capabilities of firms in today's fast-moving economy, may not be sufficient for performance improvement, but their contribution to performance is undeniable. While much of the extant literature investigates the impact of dynamic capabilities on organisational performance, little attention has been devoted to understanding whether and how dynamic capabilities create value. As the mirror of competitive strategy, dynamic capabilities should enable firms to search for and seize new ideas and to integrate and coordinate the firm's resources and capabilities in order to create value. A careful review of the existing knowledge base leaves us puzzled about the relationship among competitive strategies, dynamic capabilities and value creation. This study thus attempts to fill this gap by empirically investigating the impact of dynamic capabilities on value creation and the mediating impact of competitive strategy on this relationship. We aim to contribute to the dynamic capability view (DCV), in both theoretical and empirical terms, by exploring the impact of dynamic capabilities on firms' value creation and whether competitive strategy can play any role in strengthening or weakening this relationship. Using a sample of 491 firms in the UK telecommunications market, the results demonstrate that dynamic sensing, learning, integrating and coordinating capabilities play a significant role in firms' value creation, and that competitive strategy mediates the impact of dynamic capabilities on value creation. Adopting the DCV, this study investigates whether the value generated from dynamic capabilities depends on a firm's competitive strategy: it argues that a firm's competitive strategy can mediate its ability to derive value from its dynamic capabilities, and it explains the extent to which a firm's competitive strategy may influence its value generation. The results of the dynamic capabilities-value relationships support our expectations and justify the non-financial value added by the four dynamic capability processes in a highly turbulent market such as UK telecommunications. Our analytical findings on the relationship among dynamic capabilities, competitive strategy and value creation provide further evidence of the undeniable role of competitive strategy in deriving value from dynamic capabilities. The results reinforce the argument for the need to consider the mediating impact of organisational contextual factors, such as a firm's competitive strategy, to examine how they interact with dynamic capabilities to deliver value. The findings provide significant contributions to theory: unlike some previous studies, which conceptualise dynamic capabilities as a unidimensional construct, this study demonstrates the benefits of understanding the details of the links among the four types of dynamic capabilities, competitive strategy and value creation. In terms of contributions to managerial practice, this research draws attention to the importance of competitive strategy in conjunction with the development and deployment of dynamic capabilities to create value. Managers are now equipped with solid empirical evidence explaining why the DCV has become essential to firms in today's business world.

Keywords: dynamic capabilities, resource-based theory, value creation, competitive strategy

Procedia PDF Downloads 232
25831 Attribute Selection for Preference Functions in Engineering Design

Authors: Ali E. Abbas

Abstract:

Industrial engineering is a broad multidisciplinary field with intersections and applications in numerous areas. When designing a product, it is important to determine the appropriate attributes of value and the preference function for which the product is optimized. This paper provides guidelines on the appropriate selection of attributes for preference and value functions in engineering design.

Keywords: decision analysis, industrial engineering, direct vs. indirect values, engineering management

Procedia PDF Downloads 293
25830 Identifying Psychosocial, Autonomic, and Pain Sensitivity Risk Factors of Chronic Temporomandibular Disorder by Using Ridge Logistic Regression and Bootstrapping

Authors: Haolin Li, Eric Bair, Jane Monaco, Quefeng Li

Abstract:

Temporomandibular disorder (TMD) covers a group of musculoskeletal disorders ranging from jaw pain to chronic debilitating pain, and the risk factors for the onset and maintenance of TMD are still unclear. Prior research has shown that the potential risk factors for chronic TMD are related to psychosocial factors, autonomic functions, and pain sensitivity. Using data from the Orofacial Pain: Prospective Evaluation and Risk Assessment (OPPERA) study's baseline case-control study, we examine whether the risk factors identified by prior studies remain statistically significant once all of the risk measures are taken into account in a single model, and we also compare the relative influences of the risk factors from three different perspectives (psychosocial factors, autonomic functions, and pain sensitivity) on chronic TMD. The statistical analysis is conducted using ridge logistic regression and bootstrapping, and the performance of the algorithms has been assessed in extensive simulation studies. The results support most of the findings of prior studies: many psychosocial and pain sensitivity measures have significant associations with chronic TMD. Surprisingly, however, most of the autonomic-function risk factors show no significant association with chronic TMD, in contrast to what a prior study described.
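
As a rough illustration of the modeling approach described above, here is a minimal Python sketch of ridge (L2-penalized) logistic regression with bootstrap intervals, using scikit-learn; the synthetic data, the penalty constant C, and the significance rule are illustrative assumptions, not the OPPERA study's actual settings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.utils import resample

# Illustrative data: X holds psychosocial, autonomic, and pain-sensitivity
# measures; y marks chronic TMD cases (1) vs. controls (0).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

X = StandardScaler().fit_transform(X)  # put all risk measures on one scale

# Ridge (L2-penalized) logistic regression over all measures in one model.
model = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(X, y)

# Bootstrap the coefficients to gauge which associations are significant.
boot_coefs = []
for _ in range(1000):
    Xb, yb = resample(X, y)
    boot_coefs.append(
        LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(Xb, yb).coef_[0]
    )
boot_coefs = np.array(boot_coefs)

# 95% percentile intervals: an interval excluding 0 suggests an association.
lo, hi = np.percentile(boot_coefs, [2.5, 97.5], axis=0)
significant = (lo > 0) | (hi < 0)
print("significant predictors:", np.flatnonzero(significant))
```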

Keywords: autonomic function, OPPERA study, pain sensitivity, psychosocial measures, temporomandibular disorder

Procedia PDF Downloads 169
25829 The Importance of Working Memory, Executive and Attention Functions in Attention Deficit Hyperactivity Disorder and Learning Disabilities Diagnostics

Authors: Dorottya Horváth, Tímea Harmath-Tánczos

Abstract:

Attention deficit hyperactivity disorder (ADHD) and learning disabilities are common neurocognitive disorders that can have a significant impact on a child's academic performance. ADHD is characterized by inattention, hyperactivity, and impulsivity, while learning disabilities are characterized by difficulty with specific academic skills, such as reading, writing, or math. The aim of this study was to investigate the working memory, executive, and attention functions of neurotypical children and children with ADHD and learning disabilities, in order to fill the gaps in Hungarian mean test scores for these cognitive functions in children with neurocognitive disorders. Another aim was to refine the neuropsychological differential-diagnostic toolkit in terms of the relationships and peculiarities among these cognitive functions. The research question addressed in this study was: how do the working memory, executive, and attention functions of neurotypical children compare to those of children with ADHD and learning disabilities? A self-administered test battery was used as the research tool. Working memory was measured with the Non-Word Repetition Test, the Listening Span Test, the Digit Span Test, and the Reverse Digit Span Test; executive function with the Letter Fluency, Semantic Fluency, and Verb Fluency Tests; and attentional concentration with the d2-R Test. The data for this study were collected from 115 children aged 9-14 years, divided into three groups: neurotypical children (n = 44), children with ADHD without learning disabilities (n = 23), and children with ADHD with learning disabilities (n = 48). The data were analyzed using a variety of statistical methods, including t-tests, ANOVAs, and correlational analyses. The results showed that the performance of children with neurocognitive involvement in working memory, executive functions, and attention was significantly lower than that of neurotypical children. However, the results of children with ADHD and of children with ADHD with learning disabilities did not differ significantly. These findings are important because they provide new insights into the cognitive profiles of children with ADHD and learning disabilities and suggest that working memory, executive functions, and attention are all impaired in children with neurocognitive involvement, regardless of whether they have ADHD or learning disabilities. This information can be used to develop more effective diagnostic and treatment strategies for these disorders.

Keywords: ADHD, attention functions, executive functions, learning disabilities, working memory

Procedia PDF Downloads 79
25828 Antecedents of MNE Performance and Managing Firm-Specific and Country-Specific Advantages: An Empirical Study of Optoelectronics Industry in Taiwan

Authors: Jyh-Yi Shih, Chie-Bein Chen, Kuang-Yi Lin, Yu-Wei Huang

Abstract:

Because of the trend toward globalization, Taiwanese companies have gradually focused more on overseas market operations, and overseas market performance has gradually increased as a proportion of Taiwanese companies' total business revenues. Existing international investment theories cannot explain numerous new phenomena in this domain; opinions are inconsistent, and contradictory positions exist regarding the antecedents of multinational enterprise (MNE) performance. This study applied contemporary internalization theory to establish and extend the approaches adopted by previous relevant studies. In the context of the overseas market, the influence that MNE investment in research and development (R&D) and marketing has on enterprise performance was investigated from the firm-specific advantages (FSAs) and country-specific advantages (CSAs) perspectives. CSAs and internationalization speed were addressed as moderators, and hypotheses regarding how internationalization and performance are achieved through MNE overseas market operation were explored to ensure the completeness of the investigation. The list of enterprises was sourced from the Taiwan Economic Journal. After examining the relevant data, the following conclusions were obtained: (a) the relationship between the level of FSAs in R&D and enterprise performance exhibited an S-shaped curve; (b) the relationship between the level of FSAs in marketing and enterprise performance displayed a U-shaped curve; (c) the extent to which potential CSAs were obtained positively moderated the relationship between enterprise investment in R&D to gain FSAs and MNE performance; and (d) internationalization speed positively moderated the relationship between enterprise investment in R&D and marketing to gain FSAs and MNE performance.

Keywords: multinational corporation, firm-specific advantages, country-specific advantages, internationalization speed

Procedia PDF Downloads 377
25827 Research and Implementation of a Cross-Domain Data Sharing System in a Net-Centric Environment

Authors: Xiaoqing Wang, Jianjian Zong, Li Li, Yanxing Zheng, Jinrong Tong, Mao Zhan

Abstract:

With the rapid development of network and communication technology, a great deal of data is generated in the different domains of a network, and these data show a trend of increasing scale and increasingly complex structure. Therefore, an effective and flexible cross-domain data-sharing system is needed. The Cross-Domain Data Sharing System (CDSS) in a net-centric environment is composed of three sub-systems. The data distribution sub-system provides a data exchange service through publish-subscribe technology that supports asynchronous and many-to-many communication, which suits the needs of a dynamic, large-scale distributed computing environment. The access control sub-system adopts Attribute-Based Access Control (ABAC) technology to uniformly model data attributes such as subject, object, permission and environment; it effectively monitors the activities of users accessing resources and ensures that legitimate users obtain effective access rights within a valid time. The cross-domain access security negotiation sub-system automatically determines the access rights between different security domains during the interactive disclosure of digital certificates and access control policies, through trust policy management and negotiation algorithms, providing an effective means of establishing cross-domain trust relationships and access control in a distributed environment. The CDSS's asynchronous, many-to-many and loosely coupled communication features adapt well to data exchange and sharing in dynamic, distributed and large-scale network environments. In future work, we will extend the CDSS to support mobile computing environments.
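
A minimal sketch of the publish-subscribe pattern the data distribution sub-system relies on, supporting asynchronous, many-to-many delivery; the broker class and topic names are illustrative assumptions, not the CDSS's actual interfaces.

```python
import queue
import threading
from collections import defaultdict

class Broker:
    """Decouples publishers from subscribers: many-to-many, asynchronous."""
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of queues
        self._lock = threading.Lock()

    def subscribe(self, topic):
        q = queue.Queue()
        with self._lock:
            self._subscribers[topic].append(q)
        return q  # the subscriber drains this queue at its own pace

    def publish(self, topic, message):
        with self._lock:
            receivers = list(self._subscribers[topic])
        for q in receivers:  # every subscriber of the topic gets a copy
            q.put(message)

# Usage: two domains subscribe to the same topic; one publisher feeds both.
broker = Broker()
domain_a = broker.subscribe("sensor/updates")
domain_b = broker.subscribe("sensor/updates")
broker.publish("sensor/updates", {"id": 42, "value": 3.14})
print(domain_a.get(), domain_b.get())
```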

Keywords: data sharing, cross-domain, data exchange, publish-subscribe

Procedia PDF Downloads 114
25826 Routing Protocol in Ship Dynamic Positioning Based on WSN Clustering Data Fusion System

Authors: Zhou Mo, Dennis Chow

Abstract:

In a dynamic positioning system (DPS) for vessels, reliable information transmission between nodes relies on wireless protocols. From the perspective of cluster-based routing protocols for wireless sensor networks, a data fusion technique based on a sleep-scheduling mechanism and on remaining energy at the network layer is proposed; it applies the sleep-scheduling mechanism to the routing protocol and considers each node's remaining energy and location information when selecting cluster heads. The problem of uneven distribution of nodes among clusters is addressed by an equilibrium mechanism. At the same time, a classified forwarding mechanism and a redelivery policy are adopted to avoid congestion when transmitting large amounts of data, reduce delivery delay and enhance real-time response. A simulation test of the improved routing protocol shows that it reduces the energy consumption of nodes and increases the efficiency of data delivery.
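
A minimal sketch of the kind of cluster-head selection rule described above, weighting each node's remaining energy against its distance to the sink; the weighting formula and toy partition are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(1)
n_nodes = 50
positions = rng.uniform(0, 100, size=(n_nodes, 2))  # node coordinates (m)
energy = rng.uniform(0.2, 1.0, size=n_nodes)        # remaining energy (J)
sink = np.array([50.0, 50.0])                       # sink node location

def select_cluster_head(idx, alpha=0.7):
    """Score = alpha * normalized remaining energy
             - (1 - alpha) * normalized distance to sink (assumed weighting)."""
    d = np.linalg.norm(positions[idx] - sink, axis=1)
    score = alpha * energy[idx] / energy[idx].max() - (1 - alpha) * d / d.max()
    return idx[np.argmax(score)]

# Split the field into clusters and pick one head per cluster.
cluster_of = (positions[:, 0] > 50).astype(int)  # toy 2-cluster partition
for c in (0, 1):
    members = np.flatnonzero(cluster_of == c)
    head = select_cluster_head(members)
    print(f"cluster {c}: head = node {head}, energy = {energy[head]:.2f}")
```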

Keywords: DPS for vessel, wireless sensor network, data fusion, routing protocols

Procedia PDF Downloads 509
25825 Advanced Data Visualization Techniques for Effective Decision-making in Oil and Gas Exploration and Production

Authors: Deepak Singh, Rail Kuliev

Abstract:

This research article explores the significance of advanced data visualization techniques in enhancing decision-making processes within the oil and gas exploration and production domain. With the oil and gas industry facing numerous challenges, effective interpretation and analysis of vast and diverse datasets are crucial for optimizing exploration strategies, production operations, and risk assessment. The article highlights the importance of data visualization in managing big data, aiding the decision-making process, and facilitating communication with stakeholders. Various advanced data visualization techniques, including 3D visualization, augmented reality (AR), virtual reality (VR), interactive dashboards, and geospatial visualization, are discussed in detail, showcasing their applications and benefits in the oil and gas sector. The article presents case studies demonstrating the successful use of these techniques in optimizing well placement, real-time operations monitoring, and virtual reality training. Additionally, the article addresses the challenges of data integration and scalability, emphasizing the need for future developments in AI-driven visualization. In conclusion, this research emphasizes the immense potential of advanced data visualization in revolutionizing decision-making processes, fostering data-driven strategies, and promoting sustainable growth and improved operational efficiency within the oil and gas exploration and production industry.

Keywords: augmented reality (AR), virtual reality (VR), interactive dashboards, real-time operations monitoring

Procedia PDF Downloads 67
25824 Partnering With Faith-Based Entities to Improve Mental Health Awareness and Decrease Stigma in African American Communities

Authors: Bryana Woodard, Monica Mitchell, Kasey Harry, Ebony Washington, Megan Harris, Marcia Boyd, Regina Lynch, Daphene Baines, Surbi Bankar

Abstract:

Introduction: African Americans experience mental illnesses (i.e., depression, anxiety, etc.) at higher rates than their white counterparts. Despite this, they utilize mental health resources less and have lower mental health literacy, perhaps due to cultural barriers, including but not limited to mistrust. Research acknowledges African Americans' close ties to community networks, identifying these linkages as key to establishing comfort and trust. Similarly, the church has historically been a space that creates unity and community among African Americans. Studies show that longstanding academic-community partnerships with organizations such as churches and faith-based entities can effectively address health and mental health barriers and needs in African American communities. The importance of implementing faith-based approaches is supported in the literature; however, few empirical studies exist. This project describes the First Ladies for Health and Cincinnati Children's Hospital Medical Center (CCHMC) Partnership (FLFH-CCHMC Partnership) and the implementation and assessment of an annual Mental Health Symposium, the overall aim of which was to increase mental health awareness and decrease stigma in African American communities. Methods: The specific goals of the FLFH Mental Health Symposium were to (1) collaborate with trusted partners to build trust with community participants; (2) increase mental health literacy and decrease mental health stigma; (3) understand the barriers to improving mental health and improving trust; and (4) assess the short-term outcomes two months following the symposium. Data were collected through post-event and follow-up surveys using a mixed-methods approach. Results: More than 100 participants attended each year, with over 350 total participants over three years; 98.7% of participants were African American, 86.67% female, 11.6% male, and 11.6% LGBTQ+/non-binary; 10.5% of participants were teens, with the remainder aged 20 to 80 plus. The event was successful in achieving its goals: (1a) eleven different speakers from 8 community and church organizations presented; (1b) 93% of participants rated the overall symposium as very good or excellent; (2a) mental health literacy significantly increased each year, with over 90% of participants reporting improvement in their understanding and awareness of mental health; (2b) participants' personal stigma surrounding mental illness decreased each year, with 92.3% of participants reporting changes in their willingness to talk about and share mental health challenges; (3) barriers to mental health care were identified and included social stigma, lack of trust, and the cost of care, and the data were used to develop priorities and an action plan for the FLFH-CCHMC Mental Health Partnership; (4) follow-up data showed that participants sustained the benefits of the FLFH Symposium and took actionable steps (e.g., meditation, referrals). Additional quantitative and qualitative data will be shared. Conclusions: Lower rates of mental health literacy and higher rates of stigma among participants in this initiative demonstrate the importance of mental health providers building trust and partnerships in communities. Working with faith-based entities provides an opportunity to mitigate and address mental health equity in African American communities.

Keywords: community psychology, faith-based, African American, culturally competent care, mental health equity

Procedia PDF Downloads 14
25823 The Data Quality Model for IoT-Based Real-Time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating these enormous data from multiple sources and transferring them to the appropriate clients is fundamental to IoT development, and handling this huge quantity of devices along with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they sleep and wake periodically and aperiodically depending on the traffic load, in order to reduce energy consumption. Sometimes these devices are disconnected due to battery depletion, and if a node is not available in the network, the IoT network delivers incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile; if the distance of a device from the sink node becomes too great, the connection is lost, and other devices join the network to replace the broken-down and departed devices. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces poor-quality data. Because of this dynamic nature, we do not know the actual cause of abnormal data, and if data are of poor quality, decisions based on them are likely to be unsound. It is therefore highly important to process data and estimate data quality before using them in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic and statistical methods for analysing stored data in the data processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and its impact on data quality. This research comprehensively reviews the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality; DBSCAN clustering and weather sensors are used to build it. An extensive study was conducted on the relationship between the data of weather sensors and the data of sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis of the correlation between the two sets of independent data streams is presented. With the help of this analysis and DBSCAN, a data quality model was prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness, identifies patterns of missing values, checks the accuracy of the data with the help of cluster position, and evaluates consistency through the coefficient of variation (CoV) in a statistical analysis of the clusters formed by DBSCAN.
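
A minimal sketch of the clustering step of the described model, using scikit-learn's DBSCAN to flag outliers in joint weather/water-quality readings and the coefficient of variation to score consistency; the sensor streams, eps, and min_samples values are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

# Illustrative joint stream: [air_temp, water_temp, turbidity] per timestamp.
rng = np.random.default_rng(0)
clean = rng.normal(loc=[20.0, 15.0, 5.0], scale=[2.0, 1.5, 0.8], size=(200, 3))
faulty = rng.normal(loc=[20.0, 40.0, 25.0], scale=[2.0, 5.0, 5.0], size=(8, 3))
readings = np.vstack([clean, faulty])

X = StandardScaler().fit_transform(readings)
labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(X)  # -1 marks outliers

outliers = labels == -1
print(f"flagged {outliers.sum()} readings as outliers")

# Consistency per cluster via the coefficient of variation (CoV = std / mean).
for c in np.unique(labels[labels != -1]):
    cluster = readings[labels == c]
    cov = cluster.std(axis=0) / cluster.mean(axis=0)
    print(f"cluster {c}: CoV per sensor = {np.round(cov, 3)}")
```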

Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)

Procedia PDF Downloads 126
25822 New Security Approach of Confidential Resources in Hybrid Clouds

Authors: Haythem Yahyaoui, Samir Moalla, Mounir Bouden, Skander Ghorbel

Abstract:

Nowadays, cloud environments are becoming a necessity for companies. This technology offers the ability to access data anywhere and at any time, optimized and secured access to resources, and greater security for the data stored on the platform. However, some companies do not trust cloud providers: in their view, providers can access and modify confidential data such as bank accounts. Much work has been done in this context, concluding that the encryption methods applied by providers ensure confidentiality, yet overlooking the fact that the providers themselves can decrypt the confidential resources. The better solution is to apply modifications to the data before sending them to the cloud, with the objective of making them unreadable to the provider. This work aims at enhancing the quality of service of providers and improving the trust of customers.
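
A minimal sketch of the idea of making confidential data unreadable before it reaches the cloud, using symmetric encryption from Python's cryptography package; this illustrates the principle only, not the authors' actual scheme, and assumes the key stays with the customer.

```python
from cryptography.fernet import Fernet

# The customer generates and keeps the key; the provider never sees it.
key = Fernet.generate_key()
cipher = Fernet(key)

confidential = b"bank account: 1234-5678, balance: 10000"

# Encrypt locally before upload: the cloud stores only unreadable ciphertext.
ciphertext = cipher.encrypt(confidential)
upload_to_cloud = ciphertext  # stand-in for the actual upload call

# Only the key holder can recover the plaintext after download.
assert cipher.decrypt(upload_to_cloud) == confidential
```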

Keywords: cloud, confidentiality, cryptography, security issues, trust issues

Procedia PDF Downloads 364
25821 Using the Structural Equation Model to Explain the Effect of Supervisory Practices on Regulatory Density

Authors: Jill Round

Abstract:

In the economic system, the financial sector plays a crucial role as an intermediary between market participants, other financial institutions, and customers. Financial institutions such as banks have to make decisions that satisfy the demands of all participants while keeping abreast of regulatory change. In recent years, progress has been made on frameworks and on the development of rules, standards, and processes to manage risks in the banking sector. The increasing focus of regulators and policymakers on risk management, corporate governance, and organizational culture is of special interest, as it requires well-resourced risk controlling, compliance, and internal audit functions. In the past years, the relevance of these functions, which make up the so-called Three Lines of Defense, has moved from the backroom to the boardroom. The approach of the model can vary with organizational characteristics: owing to intense regulatory requirements, organizations operating in the financial sector have more mature models, while in less regulated industries there is more cloudiness about where tasks are allocated. All parties strive to achieve their objectives through the effective management of risks and serve identical stakeholders. Today, the Three Lines of Defense model is used throughout the world. This research looks at trends and emerging issues in the professions of the Three Lines of Defense within the banking sector; the answers are believed to help explain the increasing regulatory requirements for the banking sector. As the number of supervisory practices increases, risk management requirements intensify and demand more regulatory compliance at the same time. Structural equation modeling (SEM) is applied to surveys conducted in the research field. It aims to describe (i) the theoretical model in terms of the applicable linear relationships, (ii) the causal relationships between multiple predictors (exogenous variables) and multiple dependent variables (endogenous variables), (iii) the unobservable latent variables, and (iv) the measurement errors. The surveys conducted in the research field suggest that the observable variables are caused by various latent variables. The SEM consists of a measurement model and a structural model. A detectable correlation exists in the cause-effect relationship between the performed supervisory practices and the increasing scope of regulation: supervisory practices reinforce regulatory density. In the past, controls were put in place after supervisory practices were conducted or incidents occurred. In further research, it is of interest to examine whether risk management is proactive, reactive to incidents and supervisory practices, or both at the same time.
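
The abstract describes a measurement model (latent constructs measured by survey items) combined with a structural regression among the constructs. Below is a hedged sketch of how such a model might be specified in Python's semopy package, which uses lavaan-style syntax; every variable name and the generated data are invented placeholders, not the study's actual survey items or results.

```python
import numpy as np
import pandas as pd
from semopy import Model

# Invented placeholder data: six observed survey items driven by two factors.
rng = np.random.default_rng(0)
n = 300
practice = rng.normal(size=n)                              # latent driver
density = 0.6 * practice + rng.normal(scale=0.8, size=n)   # latent outcome
data = pd.DataFrame(
    {f"sp{i}": practice + rng.normal(scale=0.5, size=n) for i in range(1, 4)}
    | {f"rd{i}": density + rng.normal(scale=0.5, size=n) for i in range(1, 4)}
)

# Measurement model (=~) plus structural regression (~).
desc = """
SupervisoryPractice =~ sp1 + sp2 + sp3
RegulatoryDensity =~ rd1 + rd2 + rd3
RegulatoryDensity ~ SupervisoryPractice
"""
model = Model(desc)
model.fit(data)
print(model.inspect())  # loadings, the structural path, and their estimates
```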

Keywords: risk management, structural equation model, supervisory practice, three lines of defense

Procedia PDF Downloads 205
25820 A Doctrinal Research and Review of Hashtag Trademarks

Authors: Hetvi Trivedi

Abstract:

Technological escalation cannot be negated, and the same is true for its benefits. However, such escalation has interfered with the traditional theories of protection under intellectual property rights (IPRs). Among the many trends that have disrupted the old-school understanding of IPRs, one is hashtags. What began modestly in 2007 has now earned remarkable status and, coupled with the unprecedented rise of social media, hashtag culture has witnessed monstrous growth. A tiny symbol on the keypad of phones or computers is now a major trend, one that also serves companies as a critical investment measure in establishing their brand in the market. As a result, one section of intellectual property rights, trademarks, is undergoing a huge transformation, with hashtags like #icebucket, #tbt or #smilewithacoke getting trademark protection. So, as the traditional theories of IP take on modern trends, it is necessary to understand the change and challenge at a theoretical and proportional level and, where need be, question the change. Traditionally, IPRs serve the societal need for intellectual productions that ensure holistic development as well as cultural, economic, social and technological progress. In a two-pronged effort at ensuring continuity of creativity, IPRs recognize the investment of individual effort that goes into creation by offering protection. Commonly placed under two major theories, utilitarian and natural, IPRs aim to accord protection and recognition to an individual's creation or invention, which serves as an incentive for further creations or inventions, thus fully protecting the creative, inventive or commercial labour invested. In return for lending the public access to the creation, the creator reaps various benefits. In this way, intellectual property rights form a 'social contract' between the author and society. IPRs are similarly attached to a social function, whereby individual rights must be weighed against competing rights and, to the farthest limit possible, both sets of rights must be treated in a balanced manner. To put it differently, both society and the creator must be put on an equal footing, with neither party's rights subservient to the other's. A close look, through doctrinal research, at the recent trend of trademark protection suggests that the social function of IPRs is moving far from this basic philosophy. Thus, where technology interferes with the philosophies of law, it is important to check such growth and allow it only in moderation, for neither is superior to the other. Human expansionist nature may wish to count and protect as intellectual property everything under the sky that can be tweaked slightly, such as a common-parlance word transformed into a hashtag; however, IP, in order to survive on its philosophies, needs to strike a balance. A unanimous global decision on the judicious use of IPR recognition and protection is the need of the hour.

Keywords: hashtag trademarks, intellectual property, social function, technology

Procedia PDF Downloads 120
25819 A Qualitative Assessment of the Internal Communication of the College of Communication: Basis for a Strategic Communication Plan

Authors: Edna T. Bernabe, Joshua Bilolo, Sheila Mae Artillero, Catlicia Joy Caseda, Liezel Once, Donne Ynah Grace Quirante

Abstract:

Internal communication is significant for an organization to function to its full extent. A strategic communication plan builds an organization's structure and makes it more systematic. Information is a vital part of communication inside the organization, as it shapes every possible outcome, be it positive or negative. It is, therefore, imperative to assess the communication structure of a particular organization to secure a better and more harmonious communication environment. Thus, this research was intended to identify the internal communication channels used in the Polytechnic University of the Philippines-College of Communication (PUP-COC) as an organization, to identify the flow of information specifically in downward, upward, and horizontal communication, to assess the accuracy, consistency, and timeliness of its internal communication channels, and to come up with a proposed strategic communication plan for information dissemination to improve the existing communication flow in the college. The researchers formulated a framework from the Input-Throughput-Output-Feedback-Goal structure of general system theory and gathered data to assess PUP-COC's internal communication. The communication model links the objectives of the study to knowledge of the internal organization of the college. A qualitative approach and a case study as the tradition of inquiry were used to gain a deeper understanding of the internal organizational communication in PUP-COC, with interviews as the primary method. This was supported by quantitative data gathered through a survey of the students of the college. The researchers interviewed 17 participants: the college dean, the 4 chairpersons of the college departments, 11 faculty members and staff, and the acting Student Council president. An interview guide and a standardized questionnaire were formulated as instruments to generate the data. After a thorough analysis, it was found that a two-way communication flow exists in PUP-COC. The type of communication channel the internal stakeholders use varies with whom a particular person is communicating. The members of the PUP-COC community also use different types of communication channels depending on the flow of communication in use. The most common types of internal communication are letters and memoranda for downward communication, while letters, text messages, and interpersonal communication are often used in upward communication; various forms of social media are used in horizontal communication. Accuracy, consistency, and timeliness play a significant role in information dissemination within the college. However, some problems were also found in the communication system, the most common being delays in the dissemination of memoranda and letters and the uneven distribution of information and instructions to faculty, staff, and students. This led the researchers to formulate a strategic communication plan that proposes strategies to solve the communication problems experienced by the internal stakeholders.

Keywords: communication plan, downward communication, internal communication, upward communication

Procedia PDF Downloads 499
25818 The Challenge of Assessing Social AI Threats

Authors: Kitty Kioskli, Theofanis Fotis, Nineta Polemi

Abstract:

Article 9 of the European Union (EU) Artificial Intelligence (AI) Act requires that risk management of AI systems include both technical and human oversight, while the NIST AI RMF (Appendix C) and ENISA AI framework recommendations state that further research is needed to understand the current limitations of social threats and human-AI interaction. AI threats within social contexts significantly affect the security and trustworthiness of AI systems; they are interrelated and trigger technical threats as well. For example, lack of explainability (e.g., the complexity of models can be challenging for stakeholders to grasp) leads to misunderstandings, biases, and erroneous decisions, which in turn impact the privacy, security, and accountability of AI systems. Based on the four fundamental NIST criteria for explainability, explainability threats can be classified into four sub-categories: a) lack of supporting evidence: AI systems must provide supporting evidence or reasons for all their outputs; b) lack of understandability: explanations offered by systems should be comprehensible to individual users; c) lack of accuracy: the provided explanation should accurately represent the system's process of generating outputs; d) out of scope: the system should only function within its designated conditions or when it possesses sufficient confidence in its outputs. Biases may also stem from historical data reflecting undesired behaviors; when present in the data, biases can permeate the models trained on them, thereby influencing the security and trustworthiness of the AI systems. Socially related AI threats are recognized by various initiatives (e.g., the EU Ethics Guidelines for Trustworthy AI), standards (e.g., ISO/IEC TR 24368:2022 on AI ethical concerns, ISO/IEC AWI 42105 on guidance for human oversight of AI systems) and EU legislation (e.g., the General Data Protection Regulation 2016/679, the NIS 2 Directive 2022/2555, the Directive on the Resilience of Critical Entities 2022/2557, the EU AI Act, the Cyber Resilience Act). Measuring social threats, estimating the risks they pose to AI systems and mitigating them is a research challenge. This paper presents the efforts of two European Commission projects (FAITH and THEMIS) from the Horizon Europe programme that analyse social threats by building cyber-social exercises in order to study human behaviour, traits, cognitive ability, personality, attitudes, interests, and other socio-technical profile characteristics. The research in these projects also includes the development of measurements and scales (psychometrics) for human-related vulnerabilities that can be used to estimate vulnerability severity more realistically, enhancing the CVSS 4.0 measurement.

Keywords: social threats, artificial intelligence, mitigation, social experiment

Procedia PDF Downloads 50
25817 Estimation of Chronic Kidney Disease Using Artificial Neural Network

Authors: Ilker Ali Ozkan

Abstract:

In this study, an artificial neural network model was developed to estimate chronic kidney failure, a common disease. The patients' ages, their blood and biochemical values, and various chronic diseases make up the 24 inputs used for the estimation process. The input data were preprocessed because they contain both missing values and nominal values. The 147 patient records obtained from preprocessing were divided into 70% training and 30% testing data. As a result of the study, the artificial neural network with 25 neurons in the hidden layer was found to be the model with the lowest error value. Chronic kidney failure was estimated with an accuracy of 99.3% using this model. The developed artificial neural network thus proved successful for estimating chronic kidney failure from clinical data.
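
A minimal sketch of the workflow described above (preprocessing, a 70/30 split, one hidden layer of 25 neurons), using scikit-learn's MLPClassifier as a stand-in for the authors' network; the synthetic data and the remaining hyperparameters are illustrative assumptions.

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

# Illustrative stand-in for the 24 clinical inputs with some missing values.
rng = np.random.default_rng(0)
X = rng.normal(size=(147, 24))
y = (X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=147) > 0).astype(int)
X[rng.random(X.shape) < 0.05] = np.nan  # inject missingness

# Preprocessing: impute missing values, then standardize.
X = SimpleImputer(strategy="mean").fit_transform(X)
X = StandardScaler().fit_transform(X)

# 70% training / 30% testing, one hidden layer with 25 neurons.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(25,), max_iter=2000, random_state=0)
net.fit(X_tr, y_tr)
print(f"test accuracy: {net.score(X_te, y_te):.3f}")
```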

Keywords: estimation, artificial neural network, chronic kidney failure disease, disease diagnosis

Procedia PDF Downloads 432
25816 Restoring Total Form and Function in Patients with Lower Limb Bony Defects Utilizing Patient-Specific Fused Deposition Modelling: A Neoteric Multidisciplinary Reconstructive Approach

Authors: Divya SY. Ang, Mark B. Tan, Nicholas EM. Yeo, Siti RB. Sudirman, Khong Yik Chew

Abstract:

Introduction: The importance of combining technological and engineering advances with surgical principles of reconstruction cannot be overemphasized. Earlier detection of cancer and the consequences of high-speed living and neglect, such as traumatic injuries and infection, are resulting in increasingly younger patients with bone defects, whose malformations and suboptimal function are more noticeable and more keenly felt in a younger, active demographic. Our team proposes a technique that weaves together multidisciplinary effort, tissue engineering and reconstructive principles. Methods/Materials: Our patient was a competitive footballer in his early 30s diagnosed with submandibular adenoid cystic carcinoma with bony involvement. He was counselled for a right hemimandibulectomy, floor-of-mouth resection, right selective neck dissection, tracheostomy, and free fibula flap reconstruction of his mandible, and required postoperative radiotherapy. Being young and in his prime sporting years, he could not accept the morbidities associated with using his fibula to reconstruct his mandible, despite it being the gold-standard reconstructive option. The fibula is an ideal vascularized bone flap because it is reliable and easily shaped, with relatively minimal impact on functional outcomes. The fibula contributes about 30% of weight-bearing and is the attachment for the lateral compartment muscles, and lateral bending demands are higher in footballers. When harvesting the fibula, the distal 6-8 cm, and up to 10% of the total length, is preserved to maintain the stability of the ankle, thus minimizing the impact on daily activities. Nevertheless, studies have noted postoperative gait variability, so a return to a premorbid competitive level was doubtful. To improve his functional outcome, the decision was made to try to restore the fibula's form and function. Using the concept of fused deposition modelling (FDM), our team, comprising Plastic Surgery, Otolaryngology, Orthopaedics and Radiology, worked with Osteopore to design a 3D bioresorbable implant to regenerate the fibula defect (14.5 cm). Bone marrow was harvested by reaming the contralateral hip prior to the wide resection, and 30 ml of the patient's blood was obtained for extracting platelet-rich plasma. These were packed into the Osteopore 3D-printed bone scaffold, which was then secured into the fibula defect with titanium plates and screws. The flexor hallucis longus and soleus were anchored along the construct and the interosseous membrane, all in a single operation. Results: He was reviewed closely as an outpatient over 10 months postoperatively and reported no discernible loss or difference in ankle function. He is satisfied and back in training, and our team has video and photographic records that substantiate his progress. Conclusion: FDM allows regeneration of long bone defects. However, we also aimed to restore the eversion and inversion that are imperative for footballers and hence reattached the previously dissected muscles along the length of the Osteopore implant. We believe that reattaching the muscles not only stabilizes the construct but also allows optimum muscle tensioning when moving the ankle. This is a simple but effective technique for restoring complete form and function in a young patient whose fine muscle control is imperative to his way of life.

Keywords: fused deposition modelling, functional reconstruction, lower limb bony defects, regenerative surgery, 3D printing, tissue engineering

Procedia PDF Downloads 59
25815 Application of Applied Behavior Analysis Treatment to Children with Down Syndrome

Authors: Olha Yarova

Abstract:

This study is a collaborative project between the American University of Central Asia and 'Sunterra', a parent association of children with Down syndrome, that took place in Bishkek, Kyrgyzstan. The purpose of the study was to explore whether principles and techniques of applied behavior analysis (ABA) could be used to teach children with Down syndrome socially significant behaviors. ABA is considered one of the most effective treatments for children with autism, but little research has been done on the particulars of applying ABA to children with Down syndrome. The data for the study were gathered during clinical observations, work with children with Down syndrome, and interviews with their mothers. The results show that many ABA principles make work with children with Down syndrome more effective. Although such children very rarely demonstrate aggressive behavior, they show many escape-driven and attention-seeking behaviors that are reinforced by their parents and educators. Thus, functional assessment can be done to assess the function of a problem behavior and to determine appropriate treatment. Prompting and prompt fading should be used to develop receptive and expressive language skills and enhance motor development. Even though many children with Down syndrome work for praise, it is still relevant to use tangible reinforcers and to know how to remove them. Based on the results of the study, a training for parents of children with Down syndrome will be developed in Kyrgyzstan, a country where children with Down syndrome are not accepted into regular kindergartens and where doctors in maternity hospitals tell parents that their child will never talk, walk, or recognize them.

Keywords: Down syndrome, applied behavior analysis, functional assessment, problem behavior, reinforcement

Procedia PDF Downloads 261
25814 Impact of Map Generalization in Spatial Analysis

Authors: Lin Li, P. G. R. N. I. Pussella

Abstract:

When spatial data and their attributes are represented on different types of maps, scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometric change processes, such as elimination, simplification, smoothing, exaggeration, displacement, aggregation and size reduction. As a result of these operations at different levels of the data, the geometry of spatial features, such as length, sinuosity, orientation, perimeter and area, is altered. This is worst in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. When GIS users want to analyze a set of spatial data, they typically retrieve a dataset and perform the analysis without considering very important characteristics such as the scale, the purpose of the map and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization, and sometimes they go beyond the scale of the source map using the zoom-in facility, violating the basic cartographic rule that it is not suitable to create a larger-scale map from a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales, 1:10,000, 1:50,000 and 1:250,000, prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was repeated with the data in different combinations; road, river and land-use datasets were used for the study. A simple model, finding the best place for a wildlife park, was used to identify the effects. The results show remarkable effects from the different degrees of generalization: different locations with different geometries were obtained as the outputs of the analysis. The study suggests that there should be sound methods to overcome this effect; as a solution, it is recommended to bring all the datasets to a common scale before doing the analysis.
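
A minimal GeoPandas sketch of the effect described above: running the identical overlay analysis against a detailed and a generalized version of the same feature produces different geometries and areas. The toy geometries and the buffer distance are illustrative assumptions, not the study's actual Sri Lankan data.

```python
import geopandas as gpd
from shapely.geometry import LineString, Polygon

# Toy stand-ins for the same river digitized at two scales: the detailed
# line (1:10,000) meanders; the generalized one (1:250,000) is smoothed.
river_10k = gpd.GeoDataFrame(
    geometry=[LineString([(0, 0), (2, 1), (4, 0), (6, 1), (8, 0)])])
river_250k = gpd.GeoDataFrame(geometry=[LineString([(0, 0), (8, 0.5)])])
landuse = gpd.GeoDataFrame({"class": ["forest"]},
                           geometry=[Polygon([(0, -2), (8, -2), (8, 3), (0, 3)])])

def candidate_sites(rivers, buffer_dist=1.0):
    """Overlay: forest areas within buffer_dist of the river network."""
    buf = gpd.GeoDataFrame(geometry=rivers.buffer(buffer_dist))
    return gpd.overlay(landuse, buf, how="intersection")

# The same analysis gives different answers at different generalization levels.
print("area from 1:10,000 layer:  ", candidate_sites(river_10k).area.sum())
print("area from 1:250,000 layer: ", candidate_sites(river_250k).area.sum())
```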

Keywords: generalization, GIS, scales, spatial analysis

Procedia PDF Downloads 320
25813 Reliability-Based Topology Optimization: An Efficient Method for Material Uncertainty

Authors: Mehdi Jalalpour, Mazdak Tootkaboni

Abstract:

We present a computationally efficient method for reliability-based topology optimization under material-property uncertainty, where the properties are assumed to be lognormally distributed and correlated within the domain. Computational efficiency is achieved by estimating the response statistics with second-order stochastic perturbation, using these statistics to fit an appropriate distribution that follows the empirical distribution of the response, and employing an efficient gradient-based optimizer. The proposed algorithm is used to design new structures, and the changes in the optimized topology are discussed for various levels of target reliability and correlation strength. Predictions were verified through comparison with results obtained using Monte Carlo simulation.
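
A minimal numpy sketch of the Monte Carlo reference computation that such perturbation estimates are verified against: sampling a correlated lognormal material-property field and accumulating response statistics. The exponential correlation kernel, the log-moments, and the toy springs-in-series response are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_elem, n_samples = 64, 10_000
coords = np.linspace(0.0, 1.0, n_elem)

# Correlated Gaussian field -> lognormal material property per element.
corr_len, mu, sigma = 0.2, 1.0, 0.1  # assumed correlation length, log-moments
cov = np.exp(-np.abs(coords[:, None] - coords[None, :]) / corr_len)
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_elem))

g = rng.standard_normal((n_samples, n_elem)) @ L.T
E = np.exp(mu + sigma * g)  # correlated lognormal stiffness samples

# Toy structural response: compliance of springs in series under unit load.
response = (1.0 / E).sum(axis=1)

# Monte Carlo statistics: the reference the perturbation method is checked against.
print(f"mean = {response.mean():.4f}, std = {response.std():.4f}")
threshold = response.mean() + 3.0 * response.std()  # illustrative limit state
print(f"failure probability estimate: {(response > threshold).mean():.4e}")
```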

Keywords: material uncertainty, stochastic perturbation, structural reliability, topology optimization

Procedia PDF Downloads 593
25812 Uptake of Copper by Dead Biomass of Burkholderia cenocepacia Isolated from a Metal Mine in Pará, Brazil

Authors: Ingrid R. Avanzi, Marcela dos P. G. Baltazar, Louise H. Gracioso, Luciana J. Gimenes, Bruno Karolski, Elen A. Perpetuo, Claudio Auguto Oller do Nascimento

Abstract:

In this study, a natural process using a biological system was developed for the uptake of copper, and its possible removal from wastewater, by dead biomass of the strain Burkholderia cenocepacia. Dead and live biomass of Burkholderia cenocepacia were used to analyze the equilibrium and kinetics of copper biosorption by this strain as a function of pH. Living biomass exhibited the highest biosorption capacity for copper, 50 mg g−1, which was achieved within 5 hours of contact at pH 7.0, a temperature of 30°C, and an agitation speed of 150 rpm. The dead biomass of Burkholderia cenocepacia may be considered an efficient bioprocess, fast and low-cost, for the recovery of copper, and also a probable nano-adsorbent of this metal ion in wastewater bioremediation processes.

Keywords: biosorption, dead biomass, biotechnology, copper recovery

Procedia PDF Downloads 328
25811 Identity Verification Based on Multimodal Machine Learning on Red Green Blue (RGB), Red Green Blue-Depth (RGB-D), and Voice Data

Authors: LuoJiaoyang, Yu Hongyang

Abstract:

In this paper, we experimented with a new approach to multimodal identification using RGB, RGB-D and voice data. The multimodal combination of RGB and voice data has been applied in tasks such as emotion recognition with good results and stability, and the same holds for identity recognition tasks. We believe that data from different modalities can enhance the model through mutual reinforcement. We extend the usual two modalities to three and attempt to improve the effectiveness of the network by increasing the number of modalities. We also implemented the single-modality identification systems separately, tested the data of these different modalities under clean and noisy conditions, and compared their performance with the multimodal model. In designing the multimodal model, we tried a variety of fusion strategies and finally chose the one with the best performance. The experimental results show that the performance of the multimodal system is better than that of the single modalities, especially in dealing with noise, and that the multimodal system achieves an average improvement of 5%.
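
A minimal sketch of one common fusion strategy of the kind compared in such experiments: score-level (late) fusion, where per-modality match scores are combined by a weighted average. The weights and toy scores are illustrative assumptions, not the authors' chosen method.

```python
import numpy as np

def fuse_scores(rgb_scores, rgbd_scores, voice_scores, weights=(0.4, 0.3, 0.3)):
    """Weighted average of per-modality identity scores (late fusion).
    Each input: shape (n_identities,) of match scores for one probe."""
    stacked = np.stack([rgb_scores, rgbd_scores, voice_scores])
    w = np.asarray(weights)[:, None]
    return (w * stacked).sum(axis=0)

# Toy probe scored against 5 enrolled identities by three unimodal systems.
rgb = np.array([0.70, 0.10, 0.05, 0.10, 0.05])
rgbd = np.array([0.55, 0.20, 0.10, 0.10, 0.05])
voice = np.array([0.15, 0.05, 0.05, 0.05, 0.70])  # noisy: voice disagrees

fused = fuse_scores(rgb, rgbd, voice)
print("fused scores:", np.round(fused, 3), "-> identity", fused.argmax())
```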

Keywords: multimodal, three modalities, RGB-D, identity verification

Procedia PDF Downloads 60
25810 Optimization of the Multiplicity of Infection for Large-Scale Production of Lytic Bacteriophage pAh6-C

Authors: Sang Guen Kim, Sib Sankar Giri, Jin Woo Jun, Saekil Yun, Hyoun Joong Kim, Sang Wha Kim, Jung Woo Kang, Se Jin Han, Se Chang Park

Abstract:

With the emergence of superbugs, bacteriophages are considered an alternative to antibiotics. As demand for phages increases, economical, large-scale phage production is becoming one of the critical points. For therapeutic use, what matters is eradicating the pathogenic bacteria as fast as possible, so a higher concentration of phages is generally needed for effective therapeutic function. For maximum production, on the contrary, the bacteria work as a phage-producing factory: as a microbial cell factory, the bacteria need to last longer, producing phages without being eradicated. Consequently, killing the bacteria quickly has a negative effect on large-scale production. In this study, the multiplicity of infection (MOI) was manipulated relative to the initial bacterial inoculation, using phage pAh6-C, which has a therapeutic effect against Aeromonas hydrophila. Overnight bacterial culture was inoculated at 1, 5, and 10 percent, and each culture was co-cultured with phage at MOIs of 0.01, 0.0001, and 0.000001, respectively. Simply changing the initial MOI together with the bacterial inoculation concentration regulated the quantity of phage produced, without any other changes to the culture conditions. It is anticipated that this result can be used as foundational data for the mass production of lytic bacteriophages serving as therapeutic biocontrol agents.
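
A small worked example of the MOI arithmetic involved: MOI is the ratio of infecting phage particles to bacterial cells, so the phage dose needed for a target MOI follows directly from the inoculum size. The culture numbers below are illustrative assumptions, not the study's measured titers.

```python
def phage_dose(moi, cells_per_ml, volume_ml):
    """Plaque-forming units needed to hit a target MOI for a given culture."""
    return moi * cells_per_ml * volume_ml

# Illustrative: a 1% inoculum growing to ~1e7 CFU/ml in a 100 ml culture.
for moi in (0.01, 0.0001, 0.000001):
    print(f"MOI {moi:g}: {phage_dose(moi, 1e7, 100):.2e} PFU")
```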

Keywords: bacteriophage, multiplicity of infection, optimization, Aeromonas hydrophila

Procedia PDF Downloads 291
25809 Modeling of Physico-Chemical Characteristics of Concrete for Filling Trenches in Radioactive Waste Management

Authors: Ilija Plecas, Dalibor Arbutina

Abstract:

The leaching rate of 60Co from spent mixed-bed (anion and cation) exchange resins in a cement-bentonite matrix has been studied. The transport phenomena involved in the leaching of radioactive material from a cement-bentonite matrix are investigated using three methods based on theoretical equations: the diffusion equation for a plane source, an equation for diffusion coupled to a first-order equation, and an empirical method employing a polynomial equation. The results presented in this paper are from a 25-year mortar and concrete testing project that will influence the design choices for radioactive waste packaging for a future Serbian radioactive waste disposal center.
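
For reference, a common form of the plane-source diffusion model used in such leaching studies is sketched below; this is the standard semi-infinite-medium result, given here as an assumption, not necessarily the exact equation fitted in the paper. The cumulative fraction leached grows with the square root of time:

```latex
% Cumulative leached fraction under the plane-source diffusion model
% (standard semi-infinite-medium approximation; an assumption, not the
% paper's verbatim equation):
\[
  \frac{\sum_{n} a_n}{A_0} \;=\; 2\,\frac{S}{V}\sqrt{\frac{D_e\,t}{\pi}}
\]
% \sum a_n : cumulative activity leached up to time t
% A_0      : initial activity in the specimen
% S, V     : specimen surface area and volume
% D_e      : effective diffusion coefficient
% t        : cumulative leaching time
```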

Keywords: cement, concrete, immobilization, leaching, permeability, radioactivity, waste

Procedia PDF Downloads 306
25808 A Study of the Difficulties in Understanding Idiomatic Expressions Encountered by Translators, 2021

Authors: Mohamed Elmogbail

Abstract:

The present study aimed at investigating the difficulties that translators encounter in understanding idiomatic expressions between the Arabic and English languages. To achieve this goal, the researcher raised three questions: (1) What are the major difficulties that translators encounter in translating idiomatic expressions? (2) What factors cause such difficulties? (3) What techniques should be followed to overcome these difficulties? To answer these questions, the researcher designed a questionnaire and a translation test. On the second question, concerning the factors that stand behind the challenges translators encounter while translating idiomatic expressions, the questioned translators provided the following factors: 1) because of a lack of exposure to the source culture, they do not know the connotations of cultural words related to the environment, food, and folklore; 2) misusing dictionaries made the participants unable to find a proper target-language idiomatic expression; 3) lack of use of idiomatic expressions in daily life. On the third question, concerning suggestions for handling these challenges, the questioned translators provided the following solutions: 1) translators must be exposed to the source-language culture, including religion, habits, and traditions; 2) translators should also be exposed to source-language idiomatic expressions, by introducing English culture in textbooks and through participation in extensive English culture courses; 3) translators should be familiar with the differences between source- and target-language cultures; 4) translators should avoid literal translation, which in most cases results in wrong or poor translation; 5) schools, universities, and institutions should introduce translators to English culture; 6) translators should participate in cultural workshops at universities; 7) translators should try to use idiomatic expressions in everyday situations; 8) translators should read more books of idiomatic expressions. The researcher also designed a translation test consisting of 40 excerpts, given to a random sample of 100 translators in Khartoum, the capital of Sudan, to translate. After collecting the data for the study, the researcher proceeded to a more detailed analysis. The methodology used in the analysis of the idiomatic expressions is empirical and descriptive; the study is qualitative by nature, but quantitative methods were used in the analysis of the data, with some figures and statistics produced using the Statistical Package for the Social Sciences (SPSS). The researcher calculated the percentage proportion of each translated expression and compared them to each other. The findings showed that most translations were inadequate, as the translators faced difficulties in communicating the meaning; these difficulties were mostly due to their unfamiliarity with idiomatic expressions, which produced improper equivalents, and to their inability to use translation techniques as required, resorting instead to literal translation. The study recommends that more comprehensive studies be carried out on translating idiomatic expressions to enrich the translation field.

Keywords: translation, translators, idioms, expressions

Procedia PDF Downloads 134
25807 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance

Authors: Flora Babongo, Valerie Chavez

Abstract:

Inferring causality from observational data is a fundamental problem, especially in quantitative finance. Most existing work analyses additive noise models under assumptions of linearity, nonlinearity, or Gaussian noise. We fill this gap by providing a nonlinear, non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an accuracy of 0.86 on average. As real data, we considered the causality between financial indices such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.
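
The two-step, location-then-scale idea can be illustrated outside R, where BAMLSS and CAM are implemented. The following Python sketch is not the authors' method: it substitutes gradient-boosted regressors for the BAMLSS/CAM estimators, uses an absolute Spearman correlation as a cheap stand-in for a proper independence test, and the toy multiplicative-noise data-generating process is an assumption for illustration only.

import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import GradientBoostingRegressor

def standardised_residuals(cause, effect):
    # Step 1 (location): flexible regression of the effect on the cause.
    X = cause.reshape(-1, 1)
    mean_model = GradientBoostingRegressor(random_state=0).fit(X, effect)
    resid = effect - mean_model.predict(X)
    # Step 2 (scale): model log|residual| to capture multiplicative noise,
    # then standardise the residuals by the fitted scale.
    scale_model = GradientBoostingRegressor(random_state=0).fit(
        X, np.log(np.abs(resid) + 1e-8))
    return resid / np.exp(scale_model.predict(X))

def direction_score(cause, effect):
    # In the true direction the standardised residuals should be (roughly)
    # independent of the cause; |Spearman rho| with |residual| is a proxy.
    r = standardised_residuals(cause, effect)
    return abs(spearmanr(cause, np.abs(r))[0])

def infer_direction(x, y):
    return "x -> y" if direction_score(x, y) < direction_score(y, x) else "y -> x"

# Toy check on a multiplicative-noise mechanism (illustrative only).
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = (x ** 2 + 1.0) * rng.normal(1.0, 0.3, size=2000)
print(infer_direction(x, y))

The design choice mirrors the abstract: because the noise is multiplicative, modelling the scale as well as the location is what makes the residuals of the correct direction look independent of the cause.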

Keywords: causal inference, DAGs, BAMLSS, financial index

Procedia PDF Downloads 140
25806 Managing Incomplete PSA Observations in Prostate Cancer Data: Key Strategies and Best Practices for Handling Loss to Follow-Up and Missing Data

Authors: Madiha Liaqat, Rehan Ahmed Khan, Shahid Kamal

Abstract:

Multiple imputation with delta adjustment is a versatile and transparent technique for addressing univariate missing data in the presence of various missing mechanisms. This approach allows for the exploration of sensitivity to the missing-at-random (MAR) assumption. In this review, we outline the delta-adjustment procedure and illustrate its application for assessing the sensitivity to deviations from the MAR assumption. By examining diverse missingness scenarios and conducting sensitivity analyses, we gain valuable insights into the implications of missing data on our analyses, enhancing the reliability of our study's conclusions. In our study, we focused on assessing logPSA, a continuous biomarker in incomplete prostate cancer data, to examine the robustness of conclusions against plausible departures from the MAR assumption. We introduced several approaches for conducting sensitivity analyses, illustrating their application within the pattern mixture model (PMM) under the delta adjustment framework. This proposed approach effectively handles missing data, particularly loss to follow-up.
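
A minimal numpy sketch of the delta-adjustment idea follows, under stated assumptions: a single covariate, a linear imputation model, the mean of logPSA as the analysis model, simulated stand-in data, and Rubin's variance rules omitted for brevity. It is an illustration of the pattern-mixture logic, not the authors' pipeline.

import numpy as np

rng = np.random.default_rng(0)

def delta_adjusted_mi(y, x, delta, m=20):
    # Regression-based imputation fitted on complete cases (the MAR model).
    obs = ~np.isnan(y)
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X[obs], y[obs], rcond=None)
    sigma = (y[obs] - X[obs] @ beta).std()
    estimates = []
    for _ in range(m):
        y_imp = y.copy()
        n_mis = int((~obs).sum())
        # Draw imputations under MAR, then shift them by delta: delta = 0
        # recovers MAR, delta != 0 probes departures towards MNAR.
        y_imp[~obs] = X[~obs] @ beta + rng.normal(0, sigma, n_mis) + delta
        estimates.append(y_imp.mean())   # analysis model: mean of logPSA
    return float(np.mean(estimates))     # pooled point estimate

# Simulated stand-in for logPSA with 30% loss to follow-up.
x = rng.normal(size=500)
y = 2.0 + 0.5 * x + rng.normal(0, 0.4, 500)
y[rng.random(500) < 0.3] = np.nan

# Sensitivity analysis: sweep delta over plausible departures from MAR.
for delta in (-0.5, -0.25, 0.0, 0.25, 0.5):
    print(delta, round(delta_adjusted_mi(y, x, delta), 3))

Reading the swept estimates side by side shows how far conclusions drift as the assumed post-dropout shift grows, which is exactly the sensitivity the review describes.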

Keywords: loss to follow-up, incomplete response, multiple imputation, sensitivity analysis, prostate cancer

Procedia PDF Downloads 73
25805 Vibration-Based Data-Driven Model for Road Health Monitoring

Authors: Guru Prakash, Revanth Dugalam

Abstract:

A road’s condition often deteriorates due to harsh loading, such as overloaded trucks, and severe environmental conditions, such as heavy rain, snow load, and cyclic loading. In the absence of proper maintenance planning, this results in potholes, wide cracks, bumps, and increased road roughness. In this paper, a data-driven model is developed to detect such damage using vibration and image signals. The key idea of the proposed methodology is that a road anomaly manifests in these signals and can therefore be detected by training a machine learning algorithm. The use of various machine learning techniques, such as the support vector machine and the random forest method, will be investigated. The proposed model will first be trained and tested with artificially simulated data, and the model architecture will be finalised by comparing the accuracies of the candidate models. Once a model is fixed, a field study will be performed and data collected. The field data will be used to validate the proposed model and to predict the road’s future health condition. The proposed model will help automate road condition monitoring, repair cost estimation, and maintenance planning.
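
The simulated training stage the abstract describes can be sketched with scikit-learn. The feature set (signal energy, peak amplitude, kurtosis) and the simulated anomaly below are illustrative assumptions, not the paper's actual pipeline.

import numpy as np
from scipy.stats import kurtosis
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def vibration_features(windows):
    # windows: (n_segments, n_samples) of accelerometer readings per segment
    return np.column_stack([
        windows.std(axis=1),          # energy of the ride
        np.abs(windows).max(axis=1),  # peak amplitude (bumps, potholes)
        kurtosis(windows, axis=1),    # impulsiveness of shocks
    ])

# Artificially simulated data, as in the paper's first stage:
# label 0 = smooth road, label 1 = anomaly (short high-amplitude burst).
rng = np.random.default_rng(0)
smooth = rng.normal(0.0, 0.1, (200, 256))
rough = rng.normal(0.0, 0.1, (200, 256))
rough[:, 120:130] += rng.normal(0.0, 1.5, (200, 10))
X = vibration_features(np.vstack([smooth, rough]))
y = np.r_[np.zeros(200), np.ones(200)]

# Compare candidate models by cross-validated accuracy to fix the architecture.
for model in (SVC(), RandomForestClassifier(random_state=0)):
    print(type(model).__name__, cross_val_score(model, X, y, cv=5).mean())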

Keywords: SVM, data-driven, road health monitoring, pothole

Procedia PDF Downloads 70
25804 General Architecture for Automation of Machine Learning Practices

Authors: U. Borasi, Amit Kr. Jain, Rakesh, Piyush Jain

Abstract:

Data collection, data preparation, model training, model evaluation, and deployment are the processes in a typical machine learning workflow. Training data needs to be gathered and organised, which often entails collecting a sizeable dataset and cleaning it to remove or correct inaccurate or missing information. Once acquired, the data must be pre-processed for use in the machine learning model, which often involves scaling or normalising the data, handling outliers, selecting appropriate features, reducing dimensionality, and so on. The pre-processed data is then used to train a model with some machine learning algorithm. After the model has been trained, it is assessed on a test dataset using metrics such as accuracy, precision, and recall. Every time a new model is built, both data pre-processing and model training, two crucial steps in the machine learning (ML) workflow, must be carried out. Many algorithms can be paired with each data pre-processing choice, generating a large set of combinations to choose from: for every method of handling missing values (dropping records, replacing with the mean, etc.), every scaling technique, and every combination of selected features, a different algorithm can be used. As a result, to obtain the optimum outcome, these tasks are frequently repeated in different combinations. This paper suggests a simple architecture for organising this large combination set of pre-processing steps and algorithms into an automated workflow, which simplifies the task of evaluating all possibilities.
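
The core idea of enumerating and scoring the combination set can be sketched with scikit-learn pipelines. The names below (the operator pool lists and the schedule function) are hypothetical and chosen for illustration; this is a minimal sketch of the concept, not the paper's actual architecture.

from itertools import product

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Operator pool: the admissible choices at each stage of the workflow.
imputers = [SimpleImputer(strategy="mean"), SimpleImputer(strategy="median")]
scalers = [StandardScaler(), MinMaxScaler()]
models = [LogisticRegression(max_iter=1000),
          RandomForestClassifier(random_state=0)]

def schedule(X, y):
    # Enumerate the combination set of pre-processing steps and algorithms,
    # score every configuration by cross-validation, return the best pipeline.
    scored = []
    for imputer, scaler, model in product(imputers, scalers, models):
        pipe = Pipeline([("impute", imputer), ("scale", scaler),
                         ("model", model)])
        scored.append((cross_val_score(pipe, X, y, cv=5).mean(), pipe))
    return max(scored, key=lambda pair: pair[0])

X, y = make_classification(n_samples=300, n_features=10, random_state=0)
best_score, best_pipe = schedule(X, y)
print(round(best_score, 3), best_pipe.named_steps["model"].__class__.__name__)

Exhaustive enumeration as shown scales exponentially with the number of stages, which is why an automated scheduler over the operator pool, rather than manual repetition, is the point of the proposed architecture.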

Keywords: machine learning, automation, AutoML, architecture, operator pool, configuration, scheduler

Procedia PDF Downloads 42
25803 The Impact of Corporate Finance on Financial Stability in the Western Balkan Countries

Authors: Luan Vardari, Dena Arapi-Vardari

Abstract:

Financial stability is a critical component of economic growth and development, and it has been recognized as a key policy objective in many countries around the world. In the Western Balkans, financial stability has been a key issue in recent years, as the region faces a number of challenges, including high levels of public debt, weak banking systems, and economic volatility. Corporate finance, which refers to the financial management practices of firms, is an important factor that can affect financial stability. This paper investigates the impact of corporate finance on financial stability in the Western Balkan countries using a mixed-methods approach. The study will begin with a comprehensive review of the existing literature on corporate finance and financial stability, focusing on the Western Balkan region. This will be followed by an empirical analysis of regional corporate finance practices using data from various industries and firms. The analysis will explore the relationship between corporate finance practices and financial stability, taking into account factors such as regulatory frameworks, economic conditions, and firm size. The results are expected to identify the key corporate finance practices that contribute to financial stability in the region, as well as the challenges and obstacles that firms face in implementing effective corporate finance strategies, and to yield recommendations for policymakers and firms looking to enhance financial stability and resilience in the region.

Keywords: financial regulation, debt management, investment decisions, dividend policies, economic volatility, banking systems, public debt, prudent financial management, firm size, policy recommendations

Procedia PDF Downloads 62