Search results for: complexity by fact
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4285

4135 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information

Authors: A. Preetha Priyadharshini, S. B. M. Priya

Abstract:

In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is keeping each user's rate outage probability below a given threshold. Such rate outage constraints present significant analytical challenges. Many probabilistic methods have been proposed to solve the transmit optimization problem under imperfect CSI. Here, two convex restriction methods, one based on a decomposition-based large deviation inequality and one based on a Bernstein-type inequality, are used to solve the optimization problem under imperfect CSI. These methods achieve improved outage quality at lower complexity, and they provide safe, tractable approximations of the original rate outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
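The rate outage constraint described above can be made concrete with a small Monte Carlo check: under a Gaussian CSI error model, estimate the probability that a user's achievable rate falls below a target. This is only an illustrative sketch; the antenna count, error variance, and the matched (MRT) beamformer are assumptions, not the paper's setup, and the paper's contribution is precisely the convex restrictions that bound this probability analytically instead of by simulation.

```python
import math
import random

random.seed(0)

nt = 4          # transmit antennas (assumed)
sigma_e = 0.1   # per-entry CSI error standard deviation (assumed)
r_target = 1.0  # target rate in bits/s/Hz (assumed)
eps = 0.10      # allowed outage probability threshold

# Estimated channel (complex Gaussian entries) and an MRT beamformer toward it.
h_hat = [complex(random.gauss(0, 1), random.gauss(0, 1)) / math.sqrt(2)
         for _ in range(nt)]
norm = math.sqrt(sum(abs(h) ** 2 for h in h_hat))
w = [h / norm for h in h_hat]

# Monte Carlo over the CSI error: true channel = estimate + Gaussian error.
trials = 20_000
outages = 0
for _ in range(trials):
    h = [hh + sigma_e * complex(random.gauss(0, 1), random.gauss(0, 1)) / math.sqrt(2)
         for hh in h_hat]
    snr = abs(sum(hi * wi.conjugate() for hi, wi in zip(h, w))) ** 2  # unit power, unit noise
    if math.log2(1.0 + snr) < r_target:
        outages += 1

outage = outages / trials
print(f"estimated outage probability: {outage:.4f} (threshold {eps})")
```

A convex restriction replaces this sampling step with a deterministic constraint that, when satisfied, guarantees the estimated quantity stays below the threshold.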

Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information

Procedia PDF Downloads 800
4134 Collaboration During Planning and Reviewing in Writing: Effects on L2 Writing

Authors: Amal Sellami, Ahlem Ammar

Abstract:

Writing is acknowledged to be a cognitively demanding and complex task. Indeed, the writing process is composed of three iterative sub-processes, namely planning, translating (writing), and reviewing. Not only do second or foreign language learners need to write according to this process, but they also need to respect the norms and rules of language and writing in the text to be produced. Accordingly, researchers have suggested approaching writing as a collaborative task in order to alleviate its complexity. Consequently, collaboration has been implemented during the whole writing process or only during planning or reviewing. Researchers report that implementing collaboration during the whole process can be demanding in terms of time in comparison to individual writing tasks. Consequently, because of time constraints, teachers may avoid it. For this reason, it might be pedagogically more realistic to limit collaboration to one of the writing sub-processes (i.e., planning or reviewing). However, previous research implementing collaboration in planning or reviewing is limited and fails to explore the effects of these conditions on the written text. Consequently, the present study examines the effects of collaboration in planning and collaboration in reviewing on the written text. To reach this objective, quantitative as well as qualitative methods are deployed to examine the written texts holistically and in terms of fluency, complexity, and accuracy. Participants of the study include 4 pairs in each group (n=8). They participated in two experimental conditions: (1) collaborative planning followed by individual writing and individual reviewing, and (2) individual planning followed by individual writing and collaborative reviewing.
The comparative research findings indicate that while collaborative planning resulted in better overall text quality (specifically, better content and organization ratings), better fluency, greater complexity, and fewer lexical errors, collaborative reviewing produced better accuracy and fewer syntactic and mechanical errors. The discussion of the findings suggests the need for more comparative research to further explore the effects of collaboration in planning or in reviewing. Pedagogical implications of the current study include advising teachers to choose between implementing collaboration in planning or in reviewing depending on their students’ needs and what they need to improve.

Keywords: collaboration, writing, collaborative planning, collaborative reviewing

Procedia PDF Downloads 83
4133 Raising Awareness to Health Professionals about Emotional Needs of Families Suffering Perinatal Loss through a Short Documentary

Authors: Elisenda Camprecios, Alicia Macarrila, Montse Albiol, Neus Garriga Garriga

Abstract:

The loss of a child during pregnancy, or shortly after birth, is not a common occurrence, but it is a reality in our society. When this loss happens, life and death walk together. The grief that parents experience following a perinatal loss is devastating. Professionals are aware that the quality of care offered during this first period is crucial to support the families experiencing a perinatal loss and meet their needs. However, it is not always easy for health care professionals to know what to say and what to do in these difficult circumstances. Given the complexity of the painful process that a family must face when affected by such a loss, we believe that the creation of a protocol that pays special attention to the emotional needs of these couples can be a very valuable tool for professionals. The short documentary ‘When the Illusion Vanished’ was created as part of the material of this protocol, which focuses on the emotional needs of families who have suffered a perinatal loss. The video is designed to show the impact of a perinatal death and to raise awareness among professionals working in this field. The methodology is based on interviews with couples who have experienced perinatal death and with professionals who accompany families suffering from perinatal loss. The use of sensitive and empathetic words, encouraging the expression of feelings, respecting grieving time, and appropriate training for professionals are some of the issues reflected in this documentary. We believe that this video has helped health care professionals empathize with these families and understand the need to accompany them with appropriate care, a respectful and empathetic attitude, and professionalism, so that they can start the path to a ‘healthy’ mourning.

Keywords: neonatal loss, midwifery, perinatal bereavement, perinatal loss

Procedia PDF Downloads 138
4132 Second-Order Complex Systems: Case Studies of Autonomy and Free Will

Authors: Eric Sanchis

Abstract:

Although there is no definitive consensus on a precise definition of a complex system, it is generally considered that a system is complex by nature. The presented work illustrates a different point of view: a system becomes complex only with regard to the question posed to it, i.e., with regard to the problem which has to be solved. A complex system is a couple (question, object). Because the number of questions posed to a given object can be potentially substantial, complexity does not present a uniform face. Two types of complex systems are clearly identified: first-order complex systems and second-order complex systems. First-order complex systems physically exist. They are well known because they have been studied by the scientific community for a long time. In second-order complex systems, complexity results from the system's composition and its articulation, which are partially unknown. For some of these systems, there is no evidence of their existence. Vagueness is the keyword characterizing this kind of system. Autonomy and free will, two mental productions of the human cognitive system, can be identified as second-order complex systems. A classification based on the structure of properties makes it possible to discriminate complex properties from the others and to model this kind of second-order complex system. The final outcome is an implementable synthetic property that distinguishes the solid aspects of the actual property from those that are uncertain.

Keywords: autonomy, free will, synthetic property, vaporous complex systems

Procedia PDF Downloads 187
4131 First-Person Point of View in Contrast to Globalisation in Somerset Maugham’s ‘Mr. Know-All’

Authors: Armel Mbon

Abstract:

This paper discusses the first-person point of view in Maugham's 'Mr. Know-All.' It particularly analyses the narrator's position in relation to the story told in this short story, with the intention of disclosing the latter's prejudice against Mr. Kelada, the protagonist, and, consequently, its hindrance to globalisation. It thus underlines the fact that this protagonist and the other travellers are of different colours, but one person on this ship epitomises globalisation. The general attitude of readers is that they are inclined to believe the narrator easily, forgetting that fiction is the work of a teller of tales but, first and foremost, a liar. The audience, whether or not it is disconnected from the setting, also tends to forget that "travellers from afar can lie with impunity." In fact, the nameless narrator in Maugham's short story has a persona that leaves a lot to be desired. He is prejudiced against Mr. Kelada, known as Mr. Know-All, as is evidenced by the scrutiny of his diction. This paper finally purports to show that those who proclaim globalisation loudly are not ready to live together.

Keywords: narrator, persona, point of view, diction, contrast, globalisation

Procedia PDF Downloads 73
4130 Factors Influencing the Adoption of Social Media as a Medium of Public Service Broadcasting

Authors: Seyed Mohammadbagher Jafari, Izmeera Shiham, Masoud Arianfar

Abstract:

The increased usage of social media for different purposes makes it important to develop an understanding of users and their attitudes toward these sites, and, moreover, of the uses of such sites in a broader perspective such as broadcasting. This quantitative study addressed the factors influencing the adoption of social media as a medium of public service broadcasting in the Republic of Maldives. These powerful and increasingly usable tools, accompanied by large public social media datasets, are ushering in a golden age of social science by empowering researchers to measure social behavior on a scale never before possible. The study was conducted by exploring social responses to the use of social media. A research model was developed based on previous models, namely a combined model of TAM, DOI, and trust. It evaluates the influence of perceived ease of use, perceived usefulness, trust, complexity, compatibility, and relative advantage on the adoption of social media. The model was tested on a sample of 365 Maldivians using a questionnaire survey. The results showed that perceived usefulness, trust, relative advantage, and complexity highly influence the adoption of social media.

Keywords: adoption, broadcasting, Maldives, social media

Procedia PDF Downloads 462
4129 The Effects of Consumer Inertia and Emotions on New Technology Acceptance

Authors: Chyi Jaw

Abstract:

Prior literature on innovation diffusion or acceptance has almost exclusively concentrated on consumers’ positive attitudes and behaviors toward new products and services. Consumers’ negative attitudes or behaviors toward innovations have received relatively little marketing attention, although they occur frequently in practice. This study discusses the psychological factors at play when consumers try to learn or use new technologies. According to recent research, technological innovation acceptance has been considered a dynamic or mediated process. This research argues that consumers can experience inertia and emotions in the initial use of new technologies. Given such consumer psychology, the question is whether the inclusion of consumer inertia (routine seeking and cognitive rigidity) and emotions increases the predictive power of a new technology acceptance model. The empirical data show that consumer emotions can change during this process (independently of performance benefits) because of technology complexity and consumer inertia, and that they significantly impact innovative technology use. Finally, the study presents the superior predictability of the hypothesized model, which lets managers better predict and influence the successful diffusion of complex technological innovations.

Keywords: cognitive rigidity, consumer emotions, new technology acceptance, routine seeking, technology complexity

Procedia PDF Downloads 275
4128 Determining the Materiality of an Undisclosed Fact: An Onerous Duty on the Assured

Authors: Adekemi Adebowale

Abstract:

The duty of disclosure in Nigerian insurance law is in need of reform. The materiality of an undisclosed fact (notwithstanding that it was an honest and innocent non-disclosure) currently entitles insurers to avoid insurance policies, leaving an insured with an uncovered loss. While the test of materiality requires an insured to voluntarily disclose facts that will influence an insurer's decision, without proper guidelines from the insurer, the insurer is only expected to prove that the undisclosed fact would have influenced its judgment in fixing the premium or determining whether to accept the risk. This problem places an onerous duty on the assured to volunteer to the insurer every material fact, even though the insured has only a slight idea about the mind of a hypothetical prudent insurer. This paper explores the modern approach to revisiting the problem of an insured’s pre-contractual obligation to determine material facts in Nigerian insurance law. The aim is to build upon the change in the structure of insurance contract obligations in other common law jurisdictions such as the United Kingdom. The doctrinal and comparative methodology captures the burden imposed on the insured under the existing Nigerian insurance law. It finds that the continued application of the law leaves the insured in the weakest position, and he stands to lose in a contract supposedly created for his benefit. It is apparent that if this problem remains unresolved, the overall consequence will be a significant decline in insurance contracting, which may affect the Nigerian economy. The paper aims to evaluate the risks of the continuous application of the traditional law, which does not keep pace with modern insurance practice. It will ultimately produce a legally compliant reform, along with a significant deviation from the archaic structure that exists in Nigerian insurance law.
This paper forms part of ongoing PhD research on "The insured’s pre-contractual duty of utmost good faith". The outcome from the research to date finds that the insured bears the burden of the obligation to act in utmost good faith where it concerns the disclosure of material facts.

Keywords: disclosure, materiality, Nigeria, United Kingdom, utmost good faith

Procedia PDF Downloads 100
4127 Customer Adoption and Attitudes in Mobile Banking in Sri Lanka

Authors: Prasansha Kumari

Abstract:

This paper intends to identify and analyze customer adoption of, and attitudes toward, mobile banking facilities. The study uses six perceived characteristics of innovation that can form a favorable or unfavorable attitude toward an innovation, namely: relative advantage, compatibility, complexity, trialability, risk, and observability. Collected data were analyzed using the Pearson chi-square test. The results showed that mobile banking users were predominantly male. There is a growing trend among young, educated customers toward converting to mobile banking in Sri Lanka. The research outcomes suggested that all six factors are statistically highly significant in influencing mobile banking adoption and attitude formation toward mobile banking in Sri Lanka. The major reasons for adopting mobile banking services are the accessibility and availability of services regardless of time and place. Over 75 percent of the respondents mentioned that savings in time and effort and the low financial costs of conducting mobile banking were advantageous. The issue of security was found to be the most important factor in consumer adoption and attitude formation toward mobile banking. The main barriers to mobile banking were the lack of technological skills, the traditional cash-and-carry banking culture, and the lack of awareness of and insufficient guidance for using mobile banking.
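The Pearson chi-square analysis mentioned above can be illustrated with a minimal worked example. The counts below are hypothetical, not the study's Sri Lankan data; the sketch only shows how the statistic is computed for a 2x2 adoption-by-factor table and compared against the 5% critical value for one degree of freedom.

```python
# Hypothetical 2x2 contingency table: rows = adopted mobile banking (yes/no),
# columns = perceived complexity (low/high). Counts are illustrative only.
observed = [[80, 30],
            [40, 60]]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
n = sum(row_totals)

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected.
chi2 = 0.0
for i in range(2):
    for j in range(2):
        expected = row_totals[i] * col_totals[j] / n
        chi2 += (observed[i][j] - expected) ** 2 / expected

critical = 3.841  # chi-square critical value for df=1 at alpha=0.05
print(f"chi2 = {chi2:.2f}; significant at 5%: {chi2 > critical}")
```

With these made-up counts the statistic far exceeds the critical value, so the association between the factor and adoption would be judged significant, the same form of conclusion the abstract reports for its six factors.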

Keywords: compatibility, complexity, mobile banking, observability, risk

Procedia PDF Downloads 181
4126 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Due to its richness of source materials and the amount of time that test takers are given to prepare for and write their responses (a total of 55 minutes), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash the test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts for the 'Persuasive Essay' task. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt that the test takers respond to is phrased as a yes/no question.
Finally, an analysis of linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article of the 'Persuasive Essay' increases. This last text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools, which, in combination, provide a rubric and a fully-automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 121
4125 Component Interface Formalization in Robotic Systems

Authors: Anton Hristozov, Eric Matson, Eric Dietz, Marcus Rogers

Abstract:

Components are heavily used in many software systems, including robotics systems. The growth in sophistication and diversity of new capabilities for robotic systems presents new challenges to their architectures. Their complexity is growing exponentially with the advent of AI, smart sensors, and the complex tasks they have to accomplish. Such complexity requires a more rigorous approach to the creation, use, and interoperability of software components. The issue is exacerbated because robotic systems are becoming more and more reliant on third-party components for certain functions. In order to achieve this kind of interoperability, including dynamic component replacement, we need a way to standardize their interfaces. A formal approach is needed to specify what an interface of a robotic software component should contain. This study performs an analysis of the issue and presents a universal and generic approach to standardizing component interfaces for robotic systems. Our approach is inspired by well-established robotic architectures such as ROS, PX4, and Ardupilot. The study is also applicable to other software systems that share similar characteristics with robotic systems. We consider the use of JSON or Domain-Specific Language (DSL) development with tools such as ANTLR, together with automatic code and configuration file generation for frameworks such as ROS and PX4. A case study with ROS2 is presented as a proof of concept for the proposed methodology.
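As a toy illustration of the kind of JSON interface description the abstract proposes, the sketch below declares two hypothetical component interfaces and checks that they can be wired together. The schema fields (`publishes`, `subscribes`, `parameters`) and the message type names are invented for illustration; they are not a ROS or PX4 standard, only a sketch of what a formalized interface contract could look like.

```python
import json

# Hypothetical JSON interface description for a sensor-driver component.
lidar_iface = json.loads("""
{
  "name": "lidar_driver",
  "publishes": [{"topic": "/scan", "type": "sensor_msgs/LaserScan", "rate_hz": 10}],
  "subscribes": [],
  "parameters": [{"name": "frame_id", "type": "string", "default": "laser"}]
}
""")

# Hypothetical interface for a component that consumes the driver's output.
mapper_iface = json.loads("""
{
  "name": "mapper",
  "publishes": [{"topic": "/map", "type": "nav_msgs/OccupancyGrid", "rate_hz": 1}],
  "subscribes": [{"topic": "/scan", "type": "sensor_msgs/LaserScan"}],
  "parameters": []
}
""")

def compatible(producer, consumer):
    """Check that every topic the consumer subscribes to is published
    by the producer with a matching message type."""
    pubs = {p["topic"]: p["type"] for p in producer["publishes"]}
    return all(s["topic"] in pubs and pubs[s["topic"]] == s["type"]
               for s in consumer["subscribes"])

print(compatible(lidar_iface, mapper_iface))  # True: types match on /scan
```

A machine-readable contract like this is what enables the dynamic component replacement mentioned above: a replacement driver is acceptable exactly when its declared interface still satisfies every consumer.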

Keywords: CPS, robots, software architecture, interface, ROS, autopilot

Procedia PDF Downloads 71
4124 A Configurational Approach to Understand the Effect of Organizational Structure on Absorptive Capacity: Results from PLS and fsQCA

Authors: Murad Ali, Anderson Konan Seny Kan, Khalid A. Maimani

Abstract:

Based on the theory of organizational design and the theory of knowledge, this study uses complexity theory to explain and better understand the causal impacts of various patterns of organizational structural factors stimulating absorptive capacity (ACAP). Organizational structure can be thought of as heterogeneous configurations where various components are often intertwined. This study argues that the impact of the traditional variables which define a firm’s organizational structure (centralization, formalization, complexity, and integration) on ACAP is better understood in terms of set-theoretic relations rather than correlations. This study uses a sample of 347 firms from multiple industrial sectors in South Korea. The results from PLS-SEM support all the hypothesized relationships among the variables. However, the fsQCA results suggest possible configurations of centralization, formalization, complexity, integration, age, size, industry, and revenue factors that contribute to a high level of ACAP. The results from fsQCA demonstrate the usefulness of configurational approaches in helping understand equifinality in the field of knowledge management. A recent fsQCA procedure based on a modeling subsample and a holdout subsample is used in this study to assess the predictive validity of the model under investigation. The same type of predictive analysis is also made through PLS-SEM. These analyses reveal a good relevance of the causal solutions leading to a high level of ACAP. Overall, the results obtained from combining PLS-SEM and fsQCA are very insightful. In particular, they could help managers link internal organizational structure with ACAP. In other words, managers may comprehend in fine detail how different components of organizational structure can increase the level of ACAP. The configurational approach may trigger new insights that could help managers prioritize selection criteria and understand the interactions between organizational structure and ACAP. The paper also discusses the theoretical and managerial implications arising from these findings.
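The set-theoretic relations that fsQCA evaluates can be illustrated with Ragin's consistency measure for the claim "configuration X is a subset of outcome Y". The membership scores below are invented for illustration; they do not come from the study's Korean sample.

```python
# Fuzzy-set consistency of "configuration X is a subset of outcome Y"
# (Ragin's measure): sum(min(x_i, y_i)) / sum(x_i). Values near 1 support
# the claim that the configuration is sufficient for the outcome.
x = [0.9, 0.7, 0.8, 0.3, 0.6]  # hypothetical membership in a configuration
                               # (e.g., high formalization AND high integration)
y = [0.8, 0.9, 0.7, 0.4, 0.7]  # hypothetical membership in the outcome (high ACAP)

overlap = sum(min(a, b) for a, b in zip(x, y))
consistency = overlap / sum(x)
coverage = overlap / sum(y)  # how much of the outcome the configuration explains
print(f"consistency = {consistency:.3f}, coverage = {coverage:.3f}")
```

Unlike a correlation, this measure is asymmetric: it asks whether high membership in the configuration reliably comes with high membership in the outcome, which is how equifinality (several different configurations all sufficient for high ACAP) can be detected.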

Keywords: absorptive capacity, organizational structure, PLS-SEM, fsQCA, predictive analysis, modeling subsample, holdout subsample

Procedia PDF Downloads 315
4123 Simulation Model for Optimizing Energy in Supply Chain Management

Authors: Nazli Akhlaghinia, Ali Rajabzadeh Ghatari

Abstract:

In today's world, with increasing environmental awareness, firms are facing severe pressure from various stakeholders, including the government and customers, to reduce their harmful effects on the environment. Over the past few decades, the increasing effects of global warming, climate change, waste, and air pollution have drawn the attention of experts worldwide to the issue of the green supply chain and led them to seek optimal solutions for greening it. Green supply chain management (GSCM) plays an important role in motivating the sustainability of the organization. Given increasing environmental concerns, the main objective of the research is to use systems thinking methodology and the Vensim software to design a dynamic system model of a green supply chain and observe its behavior. Using this methodology, we look for the effects of a green supply chain structure on the behavioral dynamics of output variables. We simulate the complexity of GSCM over a period of 30 months and observe the behavior of variables including sustainability, the provision of green products, and energy consumption, and consequently the reduction of pollution.
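A system dynamics model of the kind built in Vensim amounts to stocks updated by flows under numerical (Euler) integration. The structure and coefficients below are illustrative assumptions, not the authors' model; the point is only the shape of the simulation loop over the 30-month horizon the abstract describes.

```python
# Minimal stock-and-flow sketch (Euler integration), loosely in the spirit of a
# Vensim system dynamics model. Structure and coefficients are illustrative only.
months = 30
dt = 1.0

energy = 100.0     # stock: monthly energy consumption (arbitrary units)
green_share = 0.1  # stock: fraction of products that are "green"

history = []
for t in range(months):
    # Flow: green adoption slows as the green share saturates toward 1.
    adoption_rate = 0.05 * (1 - green_share)
    green_share += adoption_rate * dt
    # Flow: a greener product mix cuts energy use proportionally.
    energy_savings = 0.02 * green_share * energy
    energy -= energy_savings * dt
    history.append((t + 1, round(green_share, 3), round(energy, 1)))

print(history[-1])  # state after 30 months
```

In a full model, pollution would be a further stock fed by energy consumption, and feedback loops (e.g., stakeholder pressure responding to pollution) would close the system, which is exactly the behavior-over-time analysis the study performs in Vensim.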

Keywords: supply chain management, green supply chain management, system dynamics, energy consumption

Procedia PDF Downloads 120
4122 Endeavor in Management Process by Executive Dashboards: The Case of the Financial Directorship in Brazilian Navy

Authors: R. S. Quintal, J. L. Tesch Santos, M. D. Davis, E. C. de Santana, M. de F. Bandeira dos Santos

Abstract:

The objective is to identify the contributions of the introduction of a computerized system within the Accounting Department of the Brazilian Navy Financial Directorship and its possible effects on the budgetary and financial execution of the Brazilian Navy. The relevance lies in the fact that the management process is responsible for the continuous improvement of organizational performance through higher levels of quality in its activities. Improvements in organizational processes have direct effects on cost, quality, reliability, flexibility, and speed. The method of this research is the case study. The choice of the case study met, among other demands, a need for greater flexibility to study processes related to a computerized system. The sources of evidence were the literature, documents, and direct observation. Direct observation was carried out by monitoring the implementation of the computerized system in the Division of Management Analysis. The main findings of the study point to the fact that the computerized system may contribute significantly to the standardization of information. There was improvement of internal processes in the Division of Management Analysis, which made possible the consolidation of a standard of management and performance analysis that contributes to global homogeneity in the treatment of information essential to the decision-making process. This study has limitations related to the fact that the results apply exclusively to the case studied, and it is impossible to generalize them to other government bodies.

Keywords: process management, management control, business intelligence, Brazilian Navy

Procedia PDF Downloads 215
4121 Impact of Unbalanced Urban Structure on the Traffic Congestion in Biskra, Algeria

Authors: Khaled Selatnia

Abstract:

Nowadays, traffic congestion is increasingly becoming a chronic problem. Sometimes, the cause is attributed to the recurrent road works that create barriers to efficient movement. But congestion, which usually occurs in cities, can take diverse forms and magnitudes. The case study of the city of Biskra in Algeria and the diagnosis of its road network show that, throughout the micro-regional system, the road network seems at first quite dense. However, this density, although important, does not cover all areas. A major flow is concentrated on the axis Sidi Okba – Biskra – Tolga. The largest movement of people in the Wilaya (prefecture) revolves around these three centers and their areas of influence. The centers farthest from this trio are very poorly served. This fact leads us to ask questions about the extent of congestion in the city of Biskra and its relationship to the imbalance of the urban framework. The objective of this paper is to highlight the impact of the urban framework on traffic congestion.

Keywords: congestion, urban framework, regional, urban and regional studies

Procedia PDF Downloads 606
4120 The Physically Handicapped in the City

Authors: Bekhemmas Youcef

Abstract:

The category of the disabled, like other social groups, is considered to have been struck by fate with a disability that reduces the fulfillment of its social roles or leads to abandoning them completely. Their behavioral methods express much of this complexity and intertwining, yet, despite all that, this category has not yet received appropriate attention from specialized researchers, or even from officials. It is natural that people with disabilities have psychological and social requirements in order to regain their capabilities, or some of them; they also need the environment in which they live to be prepared so that they can integrate into society. Motor disability is one of the most common types of disability in the world, and it is constantly increasing given the growth of its causes, such as traffic accidents. Motor disability often affects individuals psychologically, but it also affects their social surroundings, whether close or extended, and thus it shapes the limits and quality of their way of life, as well as assigning them roles as actors of a special kind within their societies. The methodology follows the organizational framework for the production of scientific knowledge and is based on the premise that sociology is a project that aims to understand and interpret social reality scientifically. Given the nature of the subject studied, the reality of the disabled in the city, and in order to get closer to the daily life of the physically disabled within the urban center, we adopted a qualitative approach, a choice that complies with the spirit of Weberian sociology, especially since Max Weber insists on the need to search for the meaning that the social actor gives to his behavior.
Through the results reached in this study, it was found that the city still suffers from several deficiencies at the level of equipment and urban planning in keeping pace with the number of people with disabilities in the city.

Keywords: physically handicapped, city

Procedia PDF Downloads 55
4119 Scalable Systolic Multiplier over Binary Extension Fields Based on Two-Level Karatsuba Decomposition

Authors: Chiou-Yng Lee, Wen-Yo Lee, Chieh-Tsai Wu, Cheng-Chen Yang

Abstract:

Shifted polynomial basis (SPB) is a variation of polynomial basis representation. SPB has potential for efficient bit-level and digit-level implementations of multiplication over binary extension fields with subquadratic space complexity. For efficient implementation of pairing computation with large finite fields, this paper presents a new SPB multiplication algorithm based on Karatsuba schemes and uses it to derive a novel scalable multiplier architecture. Analytical results show that the proposed multiplier provides a trade-off between space and time complexities. Our proposed multiplier is modular, regular, and suitable for very-large-scale integration (VLSI) implementations. It involves less area complexity compared to multipliers based on traditional decomposition methods. It is, therefore, more suitable for efficient hardware implementation of pairing-based cryptography and elliptic curve cryptography (ECC) in constraint-driven applications.
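The Karatsuba decomposition behind the proposed multiplier can be sketched in software for polynomials over GF(2), encoded as integer bit strings; since addition in GF(2) is XOR, the middle term needs no subtractions. This is an algorithmic illustration only (fully recursive rather than the paper's two-level decomposition), and it does not model the systolic SPB hardware architecture that is the paper's actual contribution.

```python
def clmul(a, b):
    """Schoolbook carry-less multiplication of GF(2) polynomials as bit strings."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def karatsuba_gf2(a, b, n):
    """Karatsuba multiplication of two n-bit GF(2) polynomials.

    One split turns an n-bit product into three n/2-bit products; recursing
    yields the subquadratic space complexity that the multiplier exploits.
    """
    if n <= 8:
        return clmul(a, b)
    h = n // 2
    mask = (1 << h) - 1
    a0, a1 = a & mask, a >> h
    b0, b1 = b & mask, b >> h
    lo = karatsuba_gf2(a0, b0, h)
    hi = karatsuba_gf2(a1, b1, h)
    # Over GF(2), (a0 + a1)(b0 + b1) - lo - hi reduces to an XOR combination.
    mid = karatsuba_gf2(a0 ^ a1, b0 ^ b1, h) ^ lo ^ hi
    return (hi << (2 * h)) ^ (mid << h) ^ lo

# Sanity check against the schoolbook method on a 32-bit example.
a, b = 0xDEADBEEF, 0xC0FFEE
assert karatsuba_gf2(a, b, 32) == clmul(a, b)
print(hex(karatsuba_gf2(a, b, 32)))
```

In hardware, each recursion level trades one multiplier for extra XOR (addition) logic, which is the source of the space/time trade-off reported in the abstract.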

Keywords: digit-serial systolic multiplier, elliptic curve cryptography (ECC), Karatsuba algorithm (KA), shifted polynomial basis (SPB), pairing computation

Procedia PDF Downloads 343
4118 Study of Expatriation as Countermeasure to Citizenship-Based Taxation

Authors: Gabriele Palumbo

Abstract:

This research empirically examines some of the reasons why the number of people giving up their American citizenship for tax purposes has recently increased drastically. The United States jurisdiction represents a unicum in the practice of taxing worldwide income not only of residents of the United States but also of U.S. citizens living abroad. Worldwide income taxation also affects people defined as 'Accidental Americans', who are unaware that they are U.S. citizens. Those people are considered Americans even though they have never been to the United States. Americans resident abroad can rely on United States income tax treaties and some national law provisions, such as the foreign income exclusion and foreign tax credits, which are designed specifically to avoid double taxation. However, this mechanism may prove unsatisfactory for people who no longer have, or have never had, relations with the United States. U.S. citizens who are determined to cut all ties between themselves and the United States, especially those with tax implications, can renounce their U.S. citizenship through the expatriation procedure. The expatriation process represents the extrema ratio and involves several steps which must be followed carefully. This paper shows the complexity of the procedure that a U.S. citizen who is resident in a foreign country would have to follow to relinquish U.S. citizenship for tax purposes. The mechanism is intended to discourage people from renouncing. Going beyond the question of whether U.S. tax regulation is fair or not, this principle is nowadays a popular topic that many scholars and lawyers are discussing. The outcome provides interesting implications that could induce Congress to rethink the definition of citizenship for both fiscal and nationality law purposes. Indeed, even if a sort of checks and balances has the task of mitigating the renunciation of U.S. citizenship, more and more U.S. citizens desire to get rid of their citizenship.

Keywords: double taxation, expatriation tax, international taxation, relinquishment of United States citizenship

Procedia PDF Downloads 86
4117 Not Three Gods but One: Why Reductionism Does Not Serve Our Theological Discourse

Authors: Finley Lawson

Abstract:

The triune nature of God is one of the most complex doctrines of Christianity, and its complexity is further compounded when one considers the incarnation. However, many of the difficulties and paradoxes associated with our idea of the divine arise from our adherence to reductionist ontology. In order to move our theological discourse forward, in respect to divine and human nature, a holistic interpretation of our profession of faith is necessary. The challenge of a holistic interpretation is that it questions our ability to make any statement about the genuine, ontological individuation of persons (both divine and human), and in doing so raises the issue of whether we are, ontologically, bound to descend into a form of pan(en)theism. In order to address this ‘inevitable’ slide into pan(en)theism, the impact of two forms of holistic interpretation, Boolean and Non-Boolean, on our concept of personhood will be examined. Whilst a Boolean interpretation allows for a greater understanding of the relational nature of the Trinity, it is the Non-Boolean interpretation which has greater ontological significance. A Non-Boolean ontology, grounded in our scientific understanding of the nature of the world, shows that our quest for individuation rests not in ontological fact but in epistemic need, and that it is our limited epistemology that drives our need to divide that which is ontologically indivisible. This discussion takes place within a ‘methodological’, rather than ‘doctrinal’, approach to science and religion - examining assumptions and methods that have shaped our language and beliefs about key doctrines, rather than seeking to reconcile particular Christian doctrines with particular scientific theories. Concluding that Non-Boolean holism is the more significant for our doctrine is, in itself, not enough. A world without division appears far removed from the distinct places of man and the divine as espoused in our creedal affirmations. To this end, several possible interpretations for understanding Non-Boolean human-divine relations are tentatively put forward for consideration.

Keywords: holism, individuation, ontology, Trinitarian relations

Procedia PDF Downloads 232
4116 Hardware Implementation of Local Binary Pattern Based Two-Bit Transform Motion Estimation

Authors: Seda Yavuz, Anıl Çelebi, Aysun Taşyapı Çelebi, Oğuzhan Urhan

Abstract:

Nowadays, demand for devices capable of real-time video transmission is ever-increasing, and high-resolution videos have made efficient video compression techniques an essential component of capturing and transmitting video data. Motion estimation plays a critical role in encoding raw video; hence, various motion estimation methods have been introduced to compress video efficiently. Low bit-depth representation based motion estimation methods simplify the computation of matching criteria and thus yield a small hardware footprint. In this paper, a hardware implementation of a two-bit transform based low-complexity motion estimation method using the local binary pattern approach is proposed. Image frames are represented in two-bit depth instead of full depth by using the local binary pattern as a binarization approach, and the binarization part of the hardware architecture is explained in detail. Experimental results demonstrate how the proposed hardware architecture compares with the architectures of well-known low-complexity motion estimation methods in terms of important aspects such as resource utilization and energy and power consumption.
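To make the binarization idea concrete, the following sketch (a hypothetical software model, not the paper's hardware description) derives a two-bit-per-pixel representation using one mean comparison and one LBP-style neighbour comparison, and matches two such representations with a cheap XOR-and-count criterion:

```python
import numpy as np

def two_bit_transform_lbp(block):
    """Hypothetical 2-bit/pixel representation of an image block.
    Bit 0: pixel vs. the block mean (the classic low-bit-depth idea).
    Bit 1: pixel vs. its right neighbour (an LBP-style comparison)."""
    mean = block.mean()
    bit0 = (block >= mean).astype(np.uint8)
    right = np.roll(block, -1, axis=1)  # wrap at the border for simplicity
    bit1 = (block >= right).astype(np.uint8)
    return bit0, bit1

def nnmp(bits_a, bits_b):
    """Number of Non-Matching Points: an XOR-and-count matching criterion
    that replaces the sum of absolute differences of full-depth pixels."""
    return int(np.count_nonzero(bits_a[0] ^ bits_b[0]) +
               np.count_nonzero(bits_a[1] ^ bits_b[1]))
```

In hardware, each comparison maps to a single comparator and the matching criterion to a popcount tree, which is where the small footprint of low bit-depth methods comes from.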

Keywords: binarization, hardware architecture, local binary pattern, motion estimation, two-bit transform

Procedia PDF Downloads 286
4115 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry generates large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory results, pharmacy records, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties: structured, semi-structured, and unstructured. Despite this complexity, if the trends and patterns that exist within the big data are uncovered and analyzed, higher quality healthcare can be provided at lower cost. Hadoop is an open source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. This paper analyzes how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
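As an illustration of the MapReduce model described above, the following minimal sketch counts diagnosis codes across patient records. The CSV record format (`patient_id,diagnosis_code,timestamp`) is a hypothetical example, and the shuffle phase that Hadoop would perform between the two steps is simulated locally by a sort:

```python
from itertools import groupby

# Hypothetical line format: patient_id,diagnosis_code,timestamp
def mapper(lines):
    """Map step: emit (diagnosis_code, 1) for each record."""
    for line in lines:
        fields = line.strip().split(",")
        if len(fields) >= 2:
            yield fields[1], 1

def reducer(pairs):
    """Reduce step: sum counts per diagnosis code. Pairs must arrive
    grouped by key, which Hadoop's shuffle/sort phase guarantees."""
    for code, group in groupby(pairs, key=lambda kv: kv[0]):
        yield code, sum(count for _, count in group)

def run_job(lines):
    """Local simulation of the map -> shuffle/sort -> reduce pipeline."""
    mapped = sorted(mapper(lines))
    return dict(reducer(mapped))
```

On a real cluster, `mapper` and `reducer` would run as separate processes over HDFS-resident data (for example via Hadoop Streaming), with Hadoop distributing the input splits and performing the sort between the two phases.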

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 387
4114 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. Augmenting Black-Scholes models with jumps makes it possible to account for market movements. However, such models can only be solved numerically. Furthermore, not all numerical methods solve these models efficiently, because the payoffs are non-smooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton's and Kou's jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of Padé schemes is used to overcome the complexity of inverting polynomials of matrices. Together, these two tools yield efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
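The FFT speed-up rests on the Toeplitz structure of the discretized jump integral. A minimal sketch (illustrative, not the authors' code) of the O(M log M) Toeplitz matrix-vector product via circulant embedding:

```python
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    """Multiply an M x M Toeplitz matrix by a vector in O(M log M) via
    circulant embedding, instead of the O(M^2) dense product. The
    discretized jump integral in Merton/Kou models has this structure."""
    m = len(x)
    # Embed the Toeplitz matrix in a 2M-point circulant whose first
    # column is [first_col, 0, reversed tail of first_row].
    c = np.concatenate([first_col, [0.0], first_row[:0:-1]])
    # Circulant matvec = ifft( fft(c) * fft(zero-padded x) )
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(m)])))
    return y[:m].real
```

Embedding the M×M Toeplitz matrix in a 2M×2M circulant lets the whole product be computed with three FFTs, which is what brings the cost down from O(M²) to O(M log M).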

Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model

Procedia PDF Downloads 132
4113 Investigating a Modern Accident Analysis Model for Textile Building Fires through Numerical Reconstruction

Authors: Mohsin Ali Shaikh, Weiguo Song, Rehmat Karim, Muhammad Kashan Surahio, Muhammad Usman Shahid

Abstract:

Fire investigations face challenges due to the complexity of fire development, and real-world accidents lack repeatability, making it difficult to apply standardized approaches. The unpredictable nature of fires and the unique conditions of each incident add to this complexity, requiring innovative methods and tools for effective analysis and reconstruction. This study proposes a modern accident analysis model based on numerical reconstruction for fire investigation in textile buildings. The method employs computer simulation to enhance the overall effectiveness of textile-building investigations: materials and evidence collected from past incidents are used to reconstruct fire occurrence, progression, and catastrophic processes. The approach is demonstrated through a case study involving a tragic textile factory fire in Karachi, Pakistan, which claimed 257 lives. The reconstruction method proves invaluable for determining fire origins, assessing losses, establishing accountability, and, significantly, providing preventive insights for complex fire incidents.

Keywords: fire investigation, numerical simulation, fire safety, fire incident, textile building

Procedia PDF Downloads 49
4112 The Material-Process Perspective: Design and Engineering

Authors: Lars Andersen

Abstract:

The development of design and engineering in large construction projects is characterized by an increased flattening of formal structures, extended use of parallel and integrated processes (‘Integrated Concurrent Engineering’), and an increased number of expert disciplines. The integration process is based on ongoing collaborations, dialogues, intercommunication, and comments on each other's work (iterations). This process of reciprocal communication between actors and disciplines triggers value creation. However, communication between equals is not in itself sufficient to create effective decision making. The complexity of the process and time pressure contribute to an increased risk of a decision deficit and loss of process control. The paper refers to a study that aims to develop a resilient decision-making system that does not conflict with communication processes based on equality between the disciplines in the process. The study covers the construction of a hospital through the design, engineering, and physical building phases. The research method is a combination of formative process research, process tracking, and phenomenological analyses. The study traced challenges and problems in the building process back to the projection substrates (drawings and models) and further to the organization of the engineering and design phase. A comparative analysis of traditional and new ways of organizing the projecting made it possible to uncover an implicit material order, or structure, in the process. This uncovering implied the development of a material-process perspective. According to this perspective, the complexity of the process is rooted in material-functional differentiation. This differentiation presupposes a structuring material (the skeleton of the building) that coordinates the other types of material. Each expert discipline's competence is related to one or a set of materials. The architect, the consulting structural engineer, and so on have their competencies related to the structuring material and, inherent in this, coordination competence. When dialogues between the disciplines concerning coordination do not result in agreement, the disciplines with responsibility for the structuring material decide the interface issues. Based on these premises, this paper develops a self-organized, expert-driven, interdisciplinary decision-making system.

Keywords: collaboration, complexity, design, engineering, materiality

Procedia PDF Downloads 202
4111 Enhancing Disaster Response Capabilities in Asia-Pacific: An Explorative Study Applied to Decision Support Tools for Logistics Network Design

Authors: Giuseppe Timperio, Robert de Souza

Abstract:

Logistics operations in the context of disaster response are characterized by a high degree of complexity due to the combined effect of the large number of stakeholders involved, time pressure, uncertainties at various levels, massive deployment of goods and personnel, and the gigantic financial flows to be managed. Disaster response also requires several autonomous parties, such as government agencies, militaries, NGOs, UN agencies, and the private sector, to name a few, to adopt a highly collaborative approach, especially in the critical phase of the immediate response. This is particularly true in the context of L3 emergencies, the most severe, large-scale humanitarian crises. Decision-making processes in disaster management are thus extremely difficult due to the presence of multiple decision-makers and the complexity of the tasks being tackled. Hence, in this paper, we look at applying ICT-based solutions to enable speedy and effective decision making in the golden window of humanitarian operations. A high-level view of ICT-based solutions in the context of logistics operations for humanitarian response in Southeast Asia is presented, and their viability is explored in a real-life case concerning logistics network design.

Keywords: decision support, disaster preparedness, humanitarian logistics, network design

Procedia PDF Downloads 153
4110 A Mixed Integer Programming Model for Optimizing the Layout of an Emergency Department

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

In recent years, demand for healthcare services has dramatically increased. As the demand for healthcare services increases, so does the necessity of constructing new healthcare buildings and redesigning and renovating existing ones. Increasing demand necessitates the use of optimization techniques to improve the overall service efficiency in healthcare settings. However, the high complexity of care processes remains the major challenge to accomplishing this goal. This study proposes a method based on process mining results to address the high complexity of care processes and to find the optimal layout of the various medical centers in an emergency department (ED). The ProM framework is used to discover clinical pathway patterns and relationships between activities. A sequence-clustering plug-in is used to remove infrequent events and to derive the process model in the form of a Markov chain. The process mining results serve as input for the next phase, the development of the optimization model. Comparison of the current ED design with the one obtained from the proposed method indicates that a carefully designed layout can significantly decrease the distances that patients must travel.
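The Markov-chain step can be illustrated with a small sketch (hypothetical activity names; the paper performs this step with ProM's sequence-clustering plug-in): first-order transition probabilities are estimated from patient traces in the event log, and the resulting frequencies can then weight the patient-travel distances in the layout optimization model.

```python
from collections import defaultdict

def transition_matrix(traces):
    """Estimate first-order Markov transition probabilities between
    activities from a set of patient traces (an event log)."""
    counts = defaultdict(lambda: defaultdict(int))
    for trace in traces:
        # Count each observed activity-to-activity transition.
        for a, b in zip(trace, trace[1:]):
            counts[a][b] += 1
    # Normalize each row of counts into probabilities.
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}
```

For example, if half the patients go from triage to X-ray and half to the lab, the estimated transition probabilities from triage are 0.5 each, and a layout model would accordingly weight the triage-to-X-ray and triage-to-lab distances equally.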

Keywords: mixed integer programming, facility layout problem, process mining, healthcare operations management

Procedia PDF Downloads 324
4109 TALENT GAMING©: The Innovative Methodology to Explore Talents and Empower Teams by Using Board Games

Authors: Susana F. Casla

Abstract:

Talent Gaming is an innovative methodology based on years of research into how tabletop board games can be used to empower teams. The methodology was developed with two aims in mind: the efficiency of facilitating team coaching sessions and the importance of bringing out the best in individuals working as a team. The fact that more senses are involved in playing a board game, combined with the psychological element of space and “permission to play”, helps us travel back to earlier stages of our lives when our authenticity was at its height. Focused on playing, individuals do not direct their consciousness in any particular way; they simply concentrate on winning the game. In doing so, their inner talents and authenticity surface, and the involvement of all the senses has an enormous impact on their behaviors and attitudes. All of this combines into an arena where our talents show up and our decision-making process is not affected by other elements, such as appearances, status, or hierarchy.

Keywords: talent, team, board game, business psychology, coaching teams at work

Procedia PDF Downloads 356
4108 The Effects of Three Levels of Contextual Interference among Adult Athletes

Authors: Abdulaziz Almustafa

Abstract:

Considering the critical role that laboratory and field research plays in predictions related to the contextual interference effect, this study sought to determine whether the paradigm of the effect depends on the complexity of the skill during the acquisition and transfer phases. The purpose of the present study was to investigate the effects of contextual interference (CI) by extending previous laboratory and field research with adult athletes through the acquisition and transfer phases. Male athletes (n = 60), aged 18-22 years, were chosen randomly from Eastern Province clubs and assigned to complete blocked, random, or serial practice. A repeated-measures multivariate analysis of variance (MANOVA) indicated that the results did not support the notion of CI: there were no significant differences between the blocked, serial, and random practice groups in the acquisition phase, and no major differences between the practice groups during the transfer phase. Apparently, due to the task complexity, participants were confused and unable to exploit the advantages of contextual interference. This is another result contradicting contextual interference effects in acquisition and transfer phases in sport settings. One major factor that can influence the contextual interference effect is task characteristics, in particular the level of difficulty of the sport-related skill.

Keywords: contextual interference, acquisition, transfer, task difficulty

Procedia PDF Downloads 448
4107 Characterising Stable Model by Extended Labelled Dependency Graph

Authors: Asraful Islam

Abstract:

The extended dependency graph (EDG) is a state-of-the-art isomorphic graph representation of normal logic programs (NLPs) that can characterize the consistency of an NLP through graph analysis. To construct the vertices and arcs of an EDG, additional renaming atoms and rules beyond those the given program provides are used, resulting in higher space complexity compared to the corresponding traditional dependency graph (TDG). In this article, we propose an extended labelled dependency graph (ELDG) to represent an NLP that shares an equal number of nodes and arcs with the TDG, and we prove that it is isomorphic to the domain program. The numbers of nodes and arcs used in the underlying dependency graphs are formulated to compare space complexity; the results show that the ELDG uses less memory to store nodes, arcs, and cycles than the EDG. To exhibit the desirability of the ELDG: firstly, the stable models of the kernel form of an NLP are characterized by an admissible coloring of the ELDG; secondly, a relation is established between the stable models of a kernel program and the handles of the minimal odd cycles appearing in the corresponding ELDG; thirdly, to the best of our knowledge, an inverse transformation from a dependency graph to the NLP it represents has been defined w.r.t. the ELDG for the first time, enabling analytical results to be transferred from the graph to the program straightforwardly.
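For readers unfamiliar with dependency graphs of NLPs, the following sketch shows the plain TDG-style construction: nodes are the program's atoms, and labelled arcs run from body atoms to rule heads. This is only the baseline construction the ELDG shares its node and arc counts with, not the paper's labelling scheme itself:

```python
def dependency_graph(rules):
    """Build a labelled dependency graph from a normal logic program.
    Each rule is (head, positive_body_atoms, negative_body_atoms); arcs
    run from body atoms to the head, labelled '+' for positive and '-'
    for negated (default-negation) dependencies."""
    nodes, arcs = set(), set()
    for head, pos, neg in rules:
        nodes.add(head)
        for a in pos:
            nodes.add(a)
            arcs.add((a, head, "+"))
        for a in neg:
            nodes.add(a)
            arcs.add((a, head, "-"))
    return nodes, arcs
```

For the two-rule program p :- not q. q :- not p., the construction yields two nodes and two negative arcs forming an even cycle; no renaming atoms are needed, which is the space advantage the ELDG retains over the EDG.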

Keywords: normal logic program, isomorphism of graph, extended labelled dependency graph, inverse graph transformation, graph colouring

Procedia PDF Downloads 195
4106 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate areas that were previously very difficult to automate. This paper describes the introduction of generative AI into introductory computer and data science courses and analyzes the effect of that introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create prompt templates for generating tasks and for grading students' work, including feedback on submitted assignments. For students, we introduce basic prompt engineering, which is in turn used to generate test cases from problem descriptions, to generate code snippets for single-block programming problems, and to partition average-complexity programming problems into such blocks. The classes are run using Large Language Models, and feedback from instructors and students, along with course outcomes, is collected. The analysis shows a statistically significant positive effect and a preference among both stakeholder groups.
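As a concrete example of the student-facing workflow, the sketch below builds a prompt for generating test cases from a problem description. The template wording, placeholder names, and `build_prompt` helper are illustrative assumptions, not taken from the course materials:

```python
# Hypothetical prompt template for generating test cases from a
# problem description, as in the student-facing workflow above.
TEST_CASE_PROMPT = """\
You are a teaching assistant for an introductory programming course.
Problem description:
{description}

Write {n} test cases as Python assert statements for a function named
{func_name}. Cover typical inputs and at least one edge case.
"""

def build_prompt(description, func_name, n=3):
    """Fill the template; the result would then be sent to an LLM."""
    return TEST_CASE_PROMPT.format(description=description.strip(),
                                   func_name=func_name, n=n)
```

The filled-in prompt is what actually goes to the LLM; instructor-side templates for task generation and grading follow the same pattern, with rubric and submission placeholders instead.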

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education

Procedia PDF Downloads 44