Search results for: singleton review spam detection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7703

5393 The Flipped Education Case Study on Teacher Professional Learning Community in Technology and Media Implementation

Authors: Juei-Hsin Wang, Yen-Ting Chen

Abstract:

The paper examines teacher professional learning community theory and its implementation using technology and media tools in Taiwan. From the literature review, the researchers identified five elements of teacher professional learning community theory: 'sharing vision and values', 'collaborative cooperation', 'supportive conditions', 'shared practice', and 'attention to student learning effectiveness', applied through technology and media in flipped education. A teacher professional learning community is one model of teacher professional development in flipped education. Because Taiwanese education culture has no summative evaluation for teachers, teacher professional learning communities currently take many forms in educational practice. This study used literature review and qualitative analysis to connect theory and practice and discussed official and non-official strategies for teacher professional learning communities using technology and media in flipped education. The tablet is used as a camera tool with which students record problem-solving in class: students can instantly see the work, and operating the tablet enables the whole class to watch the discussion. This allows teachers and students to focus on discussing the substance of subjects; in particular, bottom-up, non-official cases from teachers have become an important influence in Taiwan.

Keywords: professional learning community, collaborative cooperation, flipped education, technology application, media application

Procedia PDF Downloads 133
5392 Relationship between Learning Methods and Learning Outcomes: Focusing on Discussions in Learning

Authors: Jaeseo Lim, Jooyong Park

Abstract:

Although there is ample evidence that student involvement enhances learning, college education is still mainly centered on lectures. In recent years, however, the effectiveness of discussions and the use of collective intelligence have attracted considerable attention. This study examines the empirical effects of discussions on learning outcomes under various conditions. Eighty-eight college students participated in the study and were randomly assigned to three groups. Group 1 was told to review material after a lecture, as in a traditional lecture-centered class: students were given time to review the material for themselves after watching the lecture in a video clip. Group 2 participated in a discussion in groups of three or four after watching the lecture. Group 3 participated in a discussion after studying on their own; unlike the previous two groups, students in Group 3 did not watch the lecture. The participants in the three groups were tested after studying. The test questions consisted of memorization, comprehension, and application problems. The results showed that the groups in which students participated in discussions had significantly higher test scores. Moreover, the group in which students studied on their own did better than the one in which students watched a lecture. Thus, discussions are shown to be effective for enhancing learning. In particular, discussions seem to play a role in preparing students to solve application problems. This is a preliminary study, and other age groups and various academic subjects need to be examined in order to generalize these findings. We also plan to investigate what kind of support is needed to facilitate discussions.

Keywords: discussions, education, learning, lecture, test

Procedia PDF Downloads 167
5391 Groundwater Contamination Assessment and Mitigation Strategies for Water Resource Sustainability: A Concise Review

Authors: Khawar Naeem, Adel Elomri, Adel Zghibi

Abstract:

Contaminant leakage from municipal solid waste (MSW) landfills is a serious environmental challenge that threatens interconnected ecosystems. It not only contaminates the surrounding soil but also percolates downward through the subsurface and contaminates the groundwater (GW). In this concise literature review, an effort is made to understand the environmental hazards this contamination poses to soil and groundwater, the types of contamination, and the possible solutions proposed in the literature. In the study's second phase, MSW management practices are explored, since the dump rate and the type of MSW entering a landfill site depend directly on the MSW management strategies in place. Case studies from multiple developed and underdeveloped countries are presented, and the complex MSW management system is investigated from an operational perspective to minimize GW contamination. One of the significant tools found in the literature is System Dynamics Modeling (SDM), a simulation-based approach to studying stakeholder interactions. By employing the SDM approach, the risk of GW contamination can be reduced by devising effective MSW management policies, ultimately resulting in water resource sustainability and regional sustainable development.
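System-dynamics models of this kind are usually built in dedicated tools, but the underlying stock-and-flow idea can be illustrated in a few lines. The sketch below is purely hypothetical (all parameter values are invented for illustration, not taken from the review): MSW accumulates in a landfill stock, a fixed fraction leaches each year into a cumulative groundwater-contamination stock, and a recycling fraction acts as the policy lever linking MSW management to GW contamination.

```python
# Toy stock-and-flow model (hypothetical parameters, illustration only).
dt = 1.0                 # time step (years)
inflow = 1000.0          # MSW generated per year (tons)
recycle_frac = 0.3       # policy lever: fraction diverted from the landfill
leach_rate = 0.02        # fraction of the landfill stock leaching per year

landfill = 0.0           # stock: waste mass in the landfill
gw_load = 0.0            # stock: cumulative contaminant load reaching GW
history = []
for year in range(50):
    dumped = inflow * (1 - recycle_frac)   # flow into the landfill
    leached = landfill * leach_rate        # flow from landfill to groundwater
    landfill += (dumped - leached) * dt
    gw_load += leached * dt
    history.append((year, landfill, gw_load))
```

Raising `recycle_frac` lowers the equilibrium landfill stock (`dumped / leach_rate`) and hence the long-run contaminant flow to groundwater, which is the kind of policy trade-off an SDM study explores.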

Keywords: groundwater contamination, environmental risk, municipal solid waste management, system dynamic modeling, water resource sustainability, sustainable development

Procedia PDF Downloads 55
5390 Use of Galileo Advanced Features in Maritime Domain

Authors: Olivier Chaigneau, Damianos Oikonomidis, Marie-Cecile Delmas

Abstract:

GAMBAS (Galileo Advanced features for the Maritime domain: Breakthrough Applications for Safety and security) is a project funded by the European Union Agency for the Space Programme (EUSPA) that aims to identify the search-and-rescue and ship-security-alert-system needs of maritime users (including operators and fishing stakeholders) and to develop operational concepts that answer these needs. The general objective of GAMBAS is to support the deployment of Galileo's exclusive features in the maritime domain in order to improve safety and security at sea, the detection of illegal activities and the associated surveillance means, and resilience to natural and human-induced emergencies, and to develop, integrate, demonstrate, standardize, and disseminate the new associated capabilities. Specifically, the project aims to demonstrate: improvement of SAR (Search and Rescue) and SSAS (Ship Security Alert System) detection and response to maritime distress, through integration of new features into the SSAS beacon in terms of cost optimization, user-friendliness, reception of Galileo OS NMA (Open Service Navigation Message Authentication) for improved authenticated localization performance and reliability, and at-sea triggering capabilities; optimization of the responsiveness of RCCs (Rescue Co-ordination Centres) to distress situations affecting vessels; and adaptation of the MCCs (Mission Control Centres) and the MEOLUT (Medium Earth Orbit Local User Terminal) to the distribution of SSAS alert data.

Keywords: Galileo new advanced features, maritime, safety, security

Procedia PDF Downloads 82
5389 Rapid Building Detection in Population-Dense Regions with Overfitted Machine Learning Models

Authors: V. Mantey, N. Findlay, I. Maddox

Abstract:

The quality and quantity of global satellite data have been increasing exponentially in recent years as spaceborne systems become more affordable and the sensors themselves become more sophisticated. This is a valuable resource for many applications, including disaster management and relief. However, while more information can be valuable, the volume of data available is impossible to examine manually. Therefore, the question becomes how to extract as much information as possible from the data with limited manpower. Buildings are a key feature of interest in satellite imagery, with applications including telecommunications, population models, and disaster relief. Machine learning tools are fast becoming one of the key resources to solve this problem, and models have been developed to detect buildings in optical satellite imagery. However, by and large, most models focus on affluent regions where buildings are generally larger and constructed further apart. This work is focused on the more difficult problem of detection in densely populated regions. The primary challenge with detecting small buildings in densely populated regions is both the spatial and spectral resolution of the optical sensor. Densely packed buildings with similar construction materials will be difficult to separate due to a similarity in color and because the physical separation between structures is either non-existent or smaller than the spatial resolution. This study finds that training models until they are overfitting the input sample can perform better in these areas than a more robust, generalized model. An overfitted model takes less time to fine-tune from a generalized pre-trained model and requires less input data. The model developed for this study has also been fine-tuned using existing, open-source, building vector datasets. This is particularly valuable in the context of disaster relief, where information is required in a very short time span.
Leveraging existing datasets means that little to no manpower or time is required to collect data in the region of interest. The training period itself is also shorter for smaller datasets. Requiring less data means that only a few quality areas are necessary, and so any weaknesses or underpopulated regions in the data can be skipped over in favor of areas with higher quality vectors. In this study, a landcover classification model was developed in conjunction with the building detection tool to provide a secondary source to quality check the detected buildings. This has greatly reduced the false positive rate. The proposed methodologies have been implemented and integrated into a configurable production environment and have been employed for a number of large-scale commercial projects, including continent-wide DEM production, where the extracted building footprints are being used to enhance digital elevation models. Overfitted machine learning models are often considered too specific to have any predictive capacity. However, this study demonstrates that, in cases where input data is scarce, overfitted models can be judiciously applied to solve time-sensitive problems.

Keywords: building detection, disaster relief, mask-RCNN, satellite mapping

Procedia PDF Downloads 159
5388 A Comprehensive Characterization of Cell-free RNA in Spent Blastocyst Medium and Quality Prediction for Blastocyst

Authors: Huajuan Shi

Abstract:

Background: Biopsy of the preimplantation embryo may increase potential risk and raise concerns about embryo viability. Clinically discarded spent embryo medium (SEM) has therefore attracted researchers' interest, sparking work on noninvasive embryo screening. However, a major restriction is the extremely low quantity of cf-RNA, which is difficult to amplify efficiently and without bias using traditional methods. Hence, there is an urgent need for an efficient, low-bias amplification method that can comprehensively and accurately capture cf-RNA information and truly reveal the state of SEM cf-RNA. Result: In this study, we established an agarose PCR amplification system that significantly improved amplification sensitivity and efficiency by ~90-fold and 9.29%, respectively. We applied agarose to sequencing library preparation (named AG-seq) to quantify and characterize cf-RNA in SEM. The number of detected cf-RNAs (3533 vs. 598) and the coverage of the 3' end were significantly increased, and the noise of low-abundance gene detection was reduced. An increased percentage of 5'-end adenine and alternative splicing (AS) events in short fragments (< 400 bp) were discovered by AG-seq. Further, the profiles and characterization of cf-RNA in spent cleavage medium (SCM) and spent blastocyst medium (SBM) indicated that 4-mer end motifs of cf-RNA fragments could remarkably differentiate embryo development stages. Significance: This study established an efficient and low-cost SEM amplification and library preparation method. Moreover, we successfully described the characteristics of SEM cf-RNA of preimplantation embryos using AG-seq, including abundance features and fragment lengths. AG-seq facilitates the study of cf-RNA as a noninvasive embryo screening biomarker and opens up potential clinical utility for trace samples.

Keywords: cell-free RNA, agarose, spent embryo medium, RNA sequencing, non-invasive detection

Procedia PDF Downloads 48
5387 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code

Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader

Abstract:

In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming code, which uses the concept of parity and parity bits to protect against single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and the process of generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version generated was Hamming(16, 11, 4), implemented in MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), and discusses the limitations of this scheme. This version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, it has proved to be fast, with a checking time of 5.669 nanoseconds; it has a relatively higher code rate and lower bit overhead than the other versions and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
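The Hamming(16, 11, 4) scheme named in the abstract is an extended (SECDED) Hamming code: 11 data bits, parity bits at the power-of-two positions, plus one overall parity bit for double-error detection. The following is a minimal sketch of that idea in Python, not the authors' MATLAB implementation:

```python
def hamming_encode(data_bits):
    """Encode 11 data bits into a 16-bit extended Hamming (SECDED) codeword.

    Index 0 holds an overall parity bit; indices 1..15 form Hamming(15, 11)
    with parity bits at the power-of-two positions 1, 2, 4, 8.
    """
    assert len(data_bits) == 11
    code = [0] * 16
    d = iter(data_bits)
    for pos in range(1, 16):
        if pos & (pos - 1):                 # not a power of two: data position
            code[pos] = next(d)
    for p in (1, 2, 4, 8):                  # parity over positions covered by p
        for pos in range(1, 16):
            if pos & p and pos != p:
                code[p] ^= code[pos]
    code[0] = sum(code[1:]) % 2             # overall parity: double-error detection
    return code

def hamming_decode(code):
    """Return (data_bits, status); corrects single errors, detects double ones."""
    syndrome = 0
    for pos in range(1, 16):
        if code[pos]:
            syndrome ^= pos                 # XOR of set positions = error location
    overall = sum(code) % 2
    if syndrome and overall:                # single-bit error at `syndrome`
        code = code[:]
        code[syndrome] ^= 1
        status = "corrected"
    elif syndrome:                          # overall parity still even: two errors
        status = "double error detected"
    else:
        status = "ok"
    data = [code[pos] for pos in range(1, 16) if pos & (pos - 1)]
    return data, status
```

Flipping one codeword bit yields `"corrected"` with the original data recovered; flipping two yields `"double error detected"`, matching the SECDED guarantee described in the abstract.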

Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset

Procedia PDF Downloads 115
5386 AI Ethical Values as Dependent on the Role and Perspective of the Ethical AI Code Founder: A Mapping Review

Authors: Moshe Davidian, Shlomo Mark, Yotam Lurie

Abstract:

With the rapid development of technology and the concomitant growth in the capability and power of Artificial Intelligence (AI) systems, the ethical challenges involved in these systems are also evolving and increasing. In recent years, various organizations, including governments, international institutions, professional societies, civic organizations, and commercial companies, have chosen to address these challenges by publishing ethical codes for AI systems. However, despite the apparent agreement that AI should be "ethical," there is debate about the definition of "ethical artificial intelligence." This study investigates the various AI ethical codes and their key ethical values. From the vast collection of codes that exist, it analyzes and compares 25 ethical codes found to be representative of different types of organizations. In addition, as part of its literature review, the study surveys data collected in three recent reviews of AI codes. The results of the analyses demonstrate a convergence around seven key ethical values. The key finding, however, is that AI ethical codes ultimately reflect the type of organization that designed them; i.e., the organization's role as regulator, user, or developer affects its view of what ethical AI is. The results show a relationship between the organization's role and the dominant values in its code. The main contribution of this study is the development of a list of key values for all AI systems, together with specific values that should shape the development and design of particular AI systems, while allowing for differences according to the organization for which the system is being developed. This will allow an analysis of AI values in relation to stakeholders.

Keywords: artificial intelligence, ethical codes, principles, values

Procedia PDF Downloads 87
5385 Drowning: An Emergency Department Guideline

Authors: Thomas P. Jones

Abstract:

Overview: Drowning is an important cause of accidental death, particularly in children and young people. Although many survive drowning incidents, it is a relatively rare presenting complaint in Emergency Departments. When cases do present, they can be complex and unpredictable. For patients to receive the best care, it is important that their management is standardized and evidence based; however, this can be difficult in a topic area with limited studies and inconsistencies in case reporting. Objectives: To review recent cases to assess the performance of Manchester Royal Infirmary Emergency Department in the management of near drowning, and to produce an evidence-based guideline on the management of drowning victims in the ED. Methods: Emergency department records were searched for patients with a diagnosis of 'fatal drowning' or 'near drowning', and two relevant sets of case notes were reviewed. To produce the guideline, a literature review was conducted and a series of structured shortcut systematic reviews, known as BestBETs, was carried out. This information was used to produce a clear treatment pathway. Results: The case studies emphasized the variety in presentation of drowning victims while highlighting inconsistencies in management and documentation. An evidence-based guideline is presented as a flowchart, which illustrates the relevant investigations and treatment that victims of a drowning incident should receive, based on the best available evidence. Conclusion: It is hoped that, when put into practice, the guideline will improve and standardize patient care in cases of near drowning. An audit is recommended to assess its effectiveness.

Keywords: drowning, near drowning, non fatal drowning, fatal drowning

Procedia PDF Downloads 192
5384 A Microwave and Millimeter-Wave Transmit/Receive Switch Subsystem for Communication Systems

Authors: Donghyun Lee, Cam Nguyen

Abstract:

Multi-band systems offer a great deal of benefit in modern communication and radar systems. In particular, multi-band antenna-array radar systems, with their extended frequency diversity, provide numerous advantages in detecting, identifying, locating, and tracking a wide range of targets, including enhanced detection coverage, accurate target location, reduced survey time and cost, increased resolution, improved reliability, and richer target information. Accurate calibration is a critical issue in antenna array systems. Amplitude and phase errors in multi-band and multi-polarization antenna array transceivers result in inaccurate target detection, deteriorated resolution, and reduced reliability. Furthermore, a digital beamformer without RF-domain phase shifting is less immune to unfiltered interference signals, which can lead to receiver saturation in array systems. Therefore, an integrated front-end architecture that can support a calibration function with low insertion loss and a filtering function at the farthest end of an array transceiver is of great interest. We report a dual K/Ka-band T/R/Calibration switch module with a quasi-elliptic dual-bandpass filtering function implemented with a Q-enhanced metamaterial transmission line. A unique dual-band frequency response is incorporated in the reception and calibration paths of the proposed switch module, utilizing a composite right/left-handed metamaterial transmission line coupled with a Colpitts-style negative-resistance generation circuit. The fabricated, fully integrated T/R/Calibration switch module in 0.18-μm BiCMOS technology exhibits insertion loss of 4.9-12.3 dB and isolation of more than 45 dB in the reception, transmission, and calibration modes of operation. In the reception and calibration modes, the dual-band frequency response centered at 24.5 and 35 GHz exhibits out-of-band rejection of more than 30 dB relative to the pass bands, below 10.5 GHz and above 59.5 GHz.
The rejection between the pass bands reaches more than 50 dB. In all modes of operation, the input 1-dB compression point is between 4 and 11 dBm. Acknowledgement: This paper was made possible by NPRP grant # 6-241-2-102 from the Qatar National Research Fund (a member of Qatar Foundation). The statements made herein are solely the responsibility of the authors.

Keywords: microwaves, millimeter waves, T/R switch, wireless communications

Procedia PDF Downloads 149
5383 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images

Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj

Abstract:

Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part, because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is to quantify the cellular concentrations of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates to the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescent microscopy FISH images. In this work, the initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods, allowing subtraction of spurious signals and non-biological fluorescent substrata. The method provides a robust, user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
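The abstract does not name a specific thresholding algorithm; as one common choice for separating bright biofilm regions from background, Otsu's method picks the intensity threshold that maximizes between-class variance. A sketch on a synthetic image (the image and its values are invented stand-ins, not the paper's FISH data):

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Return the intensity threshold maximizing between-class variance."""
    counts, edges = np.histogram(image.ravel(), bins=nbins)
    p = counts / counts.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                        # probability of the lower class
    w1 = 1.0 - w0
    cum_mean = np.cumsum(p * centers)
    mu0 = cum_mean / np.where(w0 == 0, 1, w0)          # lower-class mean
    mu1 = (cum_mean[-1] - cum_mean) / np.where(w1 == 0, 1, w1)  # upper-class mean
    between = w0 * w1 * (mu0 - mu1) ** 2     # between-class variance per threshold
    return centers[np.argmax(between)]

# synthetic stand-in for a counterstained FISH image: dim background
# plus one bright square patch playing the role of a biofilm region
rng = np.random.default_rng(0)
img = rng.normal(20, 5, (64, 64))
img[16:48, 16:48] += 80
t = otsu_threshold(img)
mask = img > t                               # binary "biofilm" mask
```

In a real pipeline the binary mask would then be cleaned (e.g., by removing small spurious components) before extracting per-region intensity statistics.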

Keywords: image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization

Procedia PDF Downloads 121
5382 Recognizing Customer Preferences Using Review Documents: A Hybrid Text and Data Mining Approach

Authors: Oshin Anand, Atanu Rakshit

Abstract:

The vast growth of e-commerce ventures makes this area a prominent research stream. Besides several quantified parameters, the textual content of reviews is a storehouse of information that can educate companies and help them earn profit. This study is an attempt in that direction. The article categorizes reviews based on a computed metric that quantifies their influencing capacity, rendering two categories: highly influential and less influential reviews. Each of these documents is then studied to derive several product-feature categories. Each category, along with the computed metric, is converted to linguistic identifiers and used in an association mining model. The article makes a novel attempt to combine feature extraction with the quantified metric to categorize review text and finally provide frequent patterns that depict customer preferences. Frequent mentions in highly influential reviews depict customer likes, or preferred features in the product, whereas prominent patterns in less influential reviews highlight what is not important to customers. This is achieved using a hybrid approach of text mining for feature and term extraction, sentiment analysis, a multi-criteria decision-making technique, and an association mining model.
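As a loose illustration of the frequent-pattern step, here is a minimal itemset counter over hypothetical "linguistic identifier" transactions. The identifiers and the support threshold are invented for illustration; the paper's actual model is an association-mining model over its extracted features and influence metric:

```python
from collections import Counter
from itertools import combinations

# Hypothetical transactions: each review reduced to linguistic identifiers
# (feature:sentiment plus an influence bucket); names invented for illustration.
reviews = [
    {"battery:good", "price:low", "influence:high"},
    {"battery:good", "screen:good", "influence:high"},
    {"battery:good", "price:low", "influence:high"},
    {"price:high", "screen:bad", "influence:low"},
    {"price:high", "influence:low"},
]

def frequent_itemsets(transactions, min_support, max_size=2):
    """Count itemsets up to max_size and keep those meeting min_support."""
    counts = Counter()
    for t in transactions:
        for k in range(1, max_size + 1):
            for combo in combinations(sorted(t), k):
                counts[combo] += 1
    n = len(transactions)
    return {items: c / n for items, c in counts.items() if c / n >= min_support}

freq = frequent_itemsets(reviews, min_support=0.4)
```

A pair such as `("battery:good", "influence:high")` surviving the support threshold is the kind of pattern the abstract describes: a product feature that co-occurs with highly influential reviews, i.e., a likely customer preference.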

Keywords: association mining, customer preference, frequent pattern, online reviews, text mining

Procedia PDF Downloads 376
5381 Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models

Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig

Abstract:

This study applied traffic speed and occupancy data to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at 15-minute intervals from the Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. Its influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed thresholds using CR revealed that 47 mph and 48 mph are the thresholds between the congested and transitional conditions during the morning and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
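A full Bayesian Dirichlet-process or change-point model is beyond a short sketch, but the core change-point idea, splitting speed-occupancy data into two linear regimes at an estimated threshold, can be illustrated with a simple least-squares grid search over synthetic data. All numbers below are hypothetical, not the study's I-295 data:

```python
import numpy as np

# Hypothetical speed (mph) vs. occupancy (%) with a regime change near 15%.
rng = np.random.default_rng(1)
occ = np.sort(rng.uniform(2, 40, 200))
speed = np.where(occ < 15, 65 - 0.2 * occ, 68 - 1.3 * occ) + rng.normal(0, 1.5, 200)

def segment_sse(x, y):
    """Sum of squared errors of a straight-line fit to one segment."""
    coef = np.polyfit(x, y, 1)
    return float(((np.polyval(coef, x) - y) ** 2).sum())

# grid-search the occupancy value that best splits the two linear regimes
candidates = occ[5:-5]                       # keep at least 5 points per segment
change_point = min(
    candidates,
    key=lambda c: segment_sse(occ[occ <= c], speed[occ <= c])
                + segment_sse(occ[occ > c], speed[occ > c]),
)
```

The Bayesian CR model in the study does this probabilistically (with credible intervals on the threshold) rather than by minimizing squared error, but the quantity being estimated is analogous.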

Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection

Procedia PDF Downloads 279
5380 Performance of Osmotic Microbial Fuel Cell in Wastewater Treatment and Electricity Generation: A Critical Review

Authors: Shubhangi R. Deshmukh, Anupam B. Soni

Abstract:

Clean water and electricity are vital services needed in all communities. Biodegradation of wastewater contaminants and desalination technologies are among the best alternatives for addressing the global shortage of fresh water. The osmotic microbial fuel cell (OMFC) is a versatile technology that uses microorganisms (for biodegradation of organic waste) and membrane technology (for water purification) to treat wastewater and generate energy simultaneously. The technology combines the microbial fuel cell (MFC) and forward osmosis (FO) processes. An OMFC can produce more electricity and clean water than an MFC with a regular proton exchange membrane. FO offers many improvements, such as high contaminant removal, lower operating energy, and a higher proton flux than other pressure-driven membrane technologies. Lower concentration polarization reduces membrane fouling, allowing osmotic water recovery without extra cost. In this review paper, we discuss the principle, mechanism, limitations, and applications of OMFC technology reported to date. We also interpret experimental data from various literature on the water recovery and electricity generation achieved by different components of the OMFC. The area of producing electricity using OMFCs has further scope for research and seems a promising route to wastewater treatment.

Keywords: forward osmosis, microbial fuel cell, osmotic microbial fuel cell, wastewater treatment

Procedia PDF Downloads 169
5379 The Role of Extrovert and Introvert Personality in Second Language Acquisition

Authors: Fatma Hsain Ali Suliman

Abstract:

Personality plays an important role in acquiring a second language. For second language learners to make maximum progress with their own learning styles, their individual differences must be recognized and attended to. Personality is considered to be a pattern of unique characteristics that give a person's behavior a kind of consistency and individuality. The enclosed study, entitled 'The Role of Personality in Second Language Acquisition: Extroversion and Introversion', therefore sheds light on the relationship between learners' personalities and the second language acquisition process. In other words, it aims to draw attention to how students' individual differences as extroverts or introverts can affect the language acquisition process. As a literature review, this paper discusses the results of studies on this issue as well as the viewpoints of researchers and scholars who have focused on the effect of extrovert and introvert personality on acquiring a second language. To accomplish the goals of this study, which is divided into five chapters (introduction, review of related literature, research method and design, results and discussion, and conclusions and recommendations), 20 students of the English Department, Faculty of Arts, Misurata University, Libya were given a questionnaire to gauge the effect of their personalities on the learning process. Finally, to further verify the role of personality in the second language acquisition process, the same students who were given the questionnaire were observed in their ESL classes.

Keywords: second language acquisition, personality, extroversion, introversion, individual differences, language learning strategy, personality factors, psycho linguistics

Procedia PDF Downloads 630
5378 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms

Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga

Abstract:

Today's Websites contain very interesting applications, but there are only a few methodologies for analyzing user navigation through a Website and determining whether the Website is being put to correct use. Web logs are usually consulted only when a major attack or malfunction occurs, yet they record a great deal of interesting information about the users of a system. Analyzing web logs has become a challenge due to the huge log volume: finding interesting patterns is hard because of the size and distribution of the logs and the importance of minor details in each entry. Web logs thus contain very important data about users and the site that is not being put to good use. Retrieving interesting information from logs gives an idea of what users need, groups users according to their various needs, and helps improve the site into an effective and efficient one. The model we built is able to detect attacks or malfunctioning of the system and to perform anomaly detection. Logs become more complex as the volume of traffic and the size and complexity of the web site grow. Unsupervised techniques are used in this fully automated solution; expert knowledge is used only in validation. In our approach, we first clean and purify the logs to bring them to a common platform with a standard format and structure. After the cleaning module, the web-session builder is executed. It outputs two files: a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices; the Web Sessions file lists the indices of each web session. Then the DBSCAN and EM algorithms are used iteratively and recursively to get the best clustering of the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, these algorithms self-evaluate in order to choose better parametric values for the next run. If a cluster is found to be too large, micro-clustering is used.
Using Cluster Signature Module the clusters are annotated with a unique signature called finger-print. In this module each cluster is fed to Associative Rule Learning Module. If it outputs confidence and support as value 1 for an access sequence it would be a potential signature for the cluster. Then the access sequence occurrences are checked in other clusters. If it is found to be unique for the cluster considered then the cluster is annotated with the signature. These signatures are used in anomaly detection, prevent cyber attacks, real-time dashboards that visualize users, accessing web pages, predict actions of users and various other applications in Finance, University Websites, News and Media Websites etc.
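The fingerprinting step can be sketched in a few lines: within a cluster, an access pattern (here simplified to consecutive URL-index pairs) whose support and confidence are both 1 appears in every session of the cluster, and it becomes the cluster's signature only if it occurs in no other cluster. The function names and the bigram simplification below are illustrative assumptions, not the authors' implementation.

```python
def session_bigrams(session):
    """Consecutive URL-index pairs appearing in one web session."""
    return set(zip(session, session[1:]))

def candidate_signatures(cluster):
    """Patterns with support = confidence = 1 inside a cluster,
    i.e. bigrams that occur in every session of the cluster."""
    sets = [session_bigrams(s) for s in cluster]
    return set.intersection(*sets) if sets else set()

def all_bigrams(cluster):
    """Every bigram seen anywhere in the cluster's sessions."""
    return set().union(*(session_bigrams(s) for s in cluster))

def fingerprint(clusters):
    """Annotate each cluster with a bigram that is universal within it
    and absent from every other cluster, when such a bigram exists."""
    signatures = {}
    for i, cluster in enumerate(clusters):
        other_bigrams = set()
        for j, other in enumerate(clusters):
            if j != i:
                other_bigrams |= all_bigrams(other)
        unique = candidate_signatures(cluster) - other_bigrams
        if unique:
            signatures[i] = min(unique)  # deterministic choice
    return signatures
```

For example, two clusters of sessions over indexed URLs, `[[1, 2, 3], [0, 1, 2]]` and `[[4, 5], [4, 5, 6]]`, receive the signatures `(1, 2)` and `(4, 5)` respectively.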

Keywords: anomaly detection, clustering, pattern recognition, web sessions

Procedia PDF Downloads 275
5377 A Review Investigating the Potential of Zooxanthellae to Be Genetically Engineered to Combat Coral Bleaching

Authors: Anuschka Curran, Sandra Barnard

Abstract:

Coral reefs are among the most diverse and productive ecosystems on the planet, but due to the impact of climate change, these ecosystems are dying off, primarily through coral bleaching. Coral bleaching can be described as the process by which zooxanthellae (algal endosymbionts) are expelled from the gastrodermal cavity of the coral host, causing increased coral whitening. The general consensus is that mass coral bleaching is due to the dysfunction of photosynthetic processes in the zooxanthellae as a result of the combined action of elevated temperature and light stress. The question, then, is whether zooxanthellae have the potential to play a key role in the future of coral reef restoration through genetic engineering. The aim of this study is, firstly, to review the different zooxanthellae taxa and their traits with respect to environmental stress and, secondly, to review the information available on the protective mechanisms present in zooxanthellae cells during temperature fluctuations, concentrating specifically on heat shock proteins and the antioxidant stress response of zooxanthellae. The eight clades (A-H) previously recognized have been redefined into seven genera. Different zooxanthellae taxa exhibit different traits, such as their photosynthetic stress responses to light and temperature. Zooxanthellae can determine the amount and type of heat shock proteins (Hsps) present during a heat response, regulating both the respective host's Hsps and their own. Hsps generally found in genotype C3 zooxanthellae, such as Hsp70 and Hsp90, contribute to the thermal stress response of the coral host. Antioxidant activity, found both within exposed coral tissue and in zooxanthellae cells, can prevent coral hosts from expelling their endosymbionts. The up-regulation of gene expression, which may mitigate the thermal stress induction of any of the physiological aspects discussed, presents a viable alternative strategy for preserving reefs amidst climate change. In conclusion, despite the unusual molecular design of these organisms, genetic engineering is a useful tool for understanding and manipulating variables and systems within zooxanthellae and therefore offers a solution that can ensure stable coral-zooxanthellae symbiosis in the future.

Keywords: antioxidant enzymes, genetic engineering, heat-shock proteins, Symbiodinium

Procedia PDF Downloads 173
5376 Unlocking the Power of Social Media for Tourism Marketing: How Travel Bloggers Shape Destination Trust and Travel Intention, with the Moderating Role of Trustworthiness of Social Media Posts

Authors: Saad Saif

Abstract:

Tourism promotion in the digital age is significantly influenced by social media, particularly in developing travel markets such as Pakistan. This study examines how travel bloggers use social media to inspire people to plan journeys and to increase trust in destinations, and it considers how trustworthiness acts as a moderator that enhances the legitimacy of social media posts. The aim is to understand the dynamics of social media's influence on the travel and tourism industry. Specifically, the study investigates how travel bloggers' content, with a focus on tone (positive/negative) and emotional intensity (strong/weak), affects prospective Pakistani travelers' travel intentions and their trust toward a particular destination. An experimental design was used to validate the hypotheses. The results indicate that bloggers' tone and emotive content influence travel intentions and that destination trust mediates this relationship. Interestingly, variations in the emotional intensity of positive and negative reviews are not always accompanied by changes in destination trust and travel intention. In addition, the influence of a blogger's review tone on travel intention and destination trust is moderated by the credibility of online reviews, whereas the influence of emotional intensity on these outcomes is unaffected by review credibility.

Keywords: tourism marketing, destination trust, travel intention, trustworthiness

Procedia PDF Downloads 58
5375 The Retrospective Investigation of the Impacts of Alien Taxa on Human Health: A Case Study of Two Poison Information Centers

Authors: Moleseng Claude Moshobane

Abstract:

Alien species cause considerable negative impacts on biodiversity, the economy, and public health. The impacts of alien species on public health have received a degree of attention worldwide, largely in developed countries, but remain scarcely studied in developing countries. Here, we provide a review of human exposure and poisoning cases involving native and alien plant species reported to poison information centers. A retrospective review of the Tygerberg Poison Information Centre (TPIC) and the Poisons Information Centre (PIC) at Red Cross War Memorial Children's Hospital (RCWMCH) was conducted over an approximately two-year period (1 June 2015 through 6 March 2017). Combined, the TPIC and PIC handled 626 cases during this period. Toxicity cases were most abundant in Gauteng (47.1%), followed by the Western Cape (29.4%). The primary mechanism of injury was ingestion (96.7%), and the cases were predominantly accidental. Most reported cases involved infants (20.6%), with few cases involving fully grown adults (5.8%). Adults presented with minor to moderate toxicity, while infants presented with none to minor toxicity. We conclude that reported toxicity cases affecting human health are biased towards a few alien species and that several cases relate to unknown species of mushrooms. Public awareness is essential to reducing poisoning incidents.

Keywords: alien species, poisoning, invasive species, public health

Procedia PDF Downloads 167
5374 Design and Simulation of a Radiation Spectrometer Using Scintillation Detectors

Authors: Waleed K. Saib, Abdulsalam M. Alhawsawi, Essam Banoqitah

Abstract:

The aim of this research is to design a radiation spectrometer using an LSO scintillation detector coupled to a C-series SiPM (silicon photomultiplier). The device detects gamma and X-ray radiation and is also designed to estimate the activity of source contamination. The SiPM detects light in the visible range above a threshold and reads the events as counts. Three gamma sources with various activities were used in these experiments: Cs-137, Am-241, and Co-60. These sources were used in four experiments: operating the SiPM as a spectrometer, energy resolution, pile-up rejection, and efficiency. The SiPM was connected to an MCA to act as a spectrometer, and cerium-doped lutetium silicate (Lu₂SiO₅), with a light yield of 26,000 photons/MeV, was coupled to the SiPM. As a result, all the main spectral features of Cs-137, Am-241, and Co-60 were identified in the MCA. The experiment shows how photon energy and interaction probability are inversely related: total attenuation decreases as photon energy increases. An analytical calculation was made to obtain the FWHM resolution for each gamma source. The FWHM resolution is 28.75% for Am-241 (59 keV), 7.85% for Cs-137 (662 keV), 4.46% for Co-60 (1173 keV), and 3.70% for Co-60 (1332 keV). Moreover, the experiment shows that the dead time and the number of counts decreased when pile-up rejection was disabled, and the FWHM decreased when it was enabled. Efficiencies were calculated at four distances from the detector face: 2, 4, 8, and 16 cm. The detection efficiency was observed to decline exponentially with increasing distance from the detector face. In conclusion, the SiPM board operated with an LSO scintillator crystal as a spectrometer, and its energy resolution for the three gamma sources compared well with that of conventional PMTs.
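The quoted percentages follow directly from the definition of relative energy resolution, FWHM divided by the photopeak energy; for a Gaussian photopeak, the FWHM is about 2.355 times the standard deviation. A minimal illustration (the numeric example is only a consistency check against the Cs-137 value reported above, not data from the study):

```python
import math

def energy_resolution_percent(peak_energy_kev, fwhm_kev):
    """Relative energy resolution in percent: FWHM / E0 * 100."""
    return 100.0 * fwhm_kev / peak_energy_kev

def fwhm_from_sigma(sigma):
    """FWHM of a Gaussian photopeak: 2 * sqrt(2 * ln 2) * sigma ~= 2.355 * sigma."""
    return 2.0 * math.sqrt(2.0 * math.log(2.0)) * sigma

# The 7.85% figure for Cs-137 implies a photopeak FWHM of roughly
# 0.0785 * 662 keV ~= 52 keV.
```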

Keywords: PMT, radiation, radiation detection, scintillation detectors, silicon photomultiplier, spectrometer

Procedia PDF Downloads 142
5373 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering

Authors: Hong Yu, Ion Matei

Abstract:

Carbon fiber reinforced composites (CFRP) used in aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure is damaged by the excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. Assessing internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model relates the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the material. A fault detection and identification (FDI) module uses the physics-based model and a particle filtering algorithm to identify the damage mode and to calculate the probability of structural failure. Extensive simulation results substantiate the proposed fault diagnosis methodology for both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/epoxy laminate under a simulated lightning strike.
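As a rough sketch of the particle-filtering idea behind such an FDI module: a population of particles representing hypothesized damage states is propagated through a process model, weighted by how well each particle explains a measurement, and resampled. The random-walk process model and the direct resistance-like measurement model below are placeholder assumptions standing in for the resistor-network model, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(measurements, n_particles=1000,
                    process_std=0.02, meas_std=0.1):
    """Bootstrap (SIR) particle filter tracking a hidden damage state x_k
    from noisy measurements y_k = x_k + noise. Returns the posterior-mean
    estimate after each measurement."""
    particles = rng.normal(0.0, 1.0, n_particles)  # broad initial belief
    estimates = []
    for y in measurements:
        # propagate: random-walk degradation model (assumed)
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # weight each particle by the measurement likelihood
        w = np.exp(-0.5 * ((y - particles) / meas_std) ** 2)
        w /= w.sum()
        # resample particles in proportion to their weights
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]
        estimates.append(float(particles.mean()))
    return estimates
```

Fed a sequence of measurements near a constant damage level, the posterior mean converges to that level; in the real module, the measurement likelihood would come from the resistor-network model's predicted transient resistance.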

Keywords: carbon composite, fault detection, fault identification, particle filter

Procedia PDF Downloads 181
5372 Functions and Pathophysiology of the Ventricular System: Review of the Underlying Basic Physics

Authors: Mohamed Abdelrahman Abdalla

Abstract:

Apart from their function in producing CSF, the brain ventricles have long been regarded as mere remnants of the embryological neural tube with no clear role. The lack of a proper definition of the function of the brain ventricles and the central spinal canal has made it difficult to ascertain the pathophysiology of their different disease conditions or to treat them. This study aims to review the simple physics that could explain the basic function of the CNS ventricular system and to suggest new ways of approaching its pathology. There are probably more physical factors to consider than pressure alone. The Monro-Kellie hypothesis focuses on volume, and subsequently pressure, to direct surgical management in different disease conditions. However, the enlarged volume of the ventricles in normal pressure hydrocephalus does not displace any blood or brain outside the skull. Likewise, in idiopathic intracranial hypertension, the very high intracranial pressure rarely causes brain herniation. On this note, the continuity of the intracranial cavity with the spinal canal makes them a single unit, exposing a defect in the theory. This study adds further factors to the equation, such as brain and CSF density and the position of the brain in space, in addition to volume and pressure, to identify how the ventricles contribute to CNS homeostasis. Increasing the number of variables we analyze when treating CSF pathologies should likewise increase our understanding, and hence the accuracy of treatment, of such conditions.

Keywords: communicating hydrocephalus, functions of the ventricles, idiopathic intracranial hypertension, physics of CSF

Procedia PDF Downloads 81
5371 Advanced Magnetic Resonance Imaging in Differentiation of Neurocysticercosis and Tuberculoma

Authors: Rajendra N. Ghosh, Paramjeet Singh, Niranjan Khandelwal, Sameer Vyas, Pratibha Singhi, Naveen Sankhyan

Abstract:

Background: Tuberculoma and neurocysticercosis (NCC) are the two most common intracranial infections in developing countries. They often mimic each other on neuroimaging and, in the absence of typical imaging features, cause significant diagnostic dilemmas. Differentiation is extremely important to avoid empirical exposure to antitubercular medications or nonspecific treatment that allows disease progression. Purpose: To better characterize and differentiate CNS tuberculoma and NCC using morphological and multiple advanced functional MRI sequences. Material and Methods: Fifty untreated patients (20 tuberculoma and 30 NCC) were evaluated using conventional sequences and advanced sequences including CISS, SWI, DWI, DTI, magnetization transfer (MT), T2 relaxometry (T2R), perfusion, and spectroscopy. rCBV, ADC, FA, T2R, and MTR values and metabolite ratios were calculated from the lesion and from normal parenchyma. Diagnosis was confirmed by typical biochemical, histopathological, and imaging features. Results: CISS was the most useful sequence for scolex detection (90% on CISS vs. 73% on routine sequences). SWI also showed higher scolex detection ability. Mean values of ADC, FA, and T2R from the core, and of rCBV from the wall of the lesion, were significantly different between tuberculoma and NCC (P < 0.05). Mean values of rCBV, ADC, FA, and T2R for tuberculoma vs. NCC were (3.36 vs. 1.3), (1.09×10⁻³ vs. 1.4×10⁻³), (0.13×10⁻³ vs. 0.09×10⁻³), and (88.65 ms vs. 272.3 ms), respectively. Tuberculomas showed a high lipid peak, more choline, and lower creatine, with a Ch/Cr ratio > 1. The T2R value was the most significant parameter for differentiation, and cut-off values for each significant parameter have been proposed. Conclusion: Quantitative MRI combined with conventional sequences can better characterize and differentiate similar-appearing tuberculoma and NCC and may be incorporated into routine protocols, potentially avoiding brain biopsy and empirical therapy.

Keywords: advanced functional MRI, differentiation, neurocysticercosis, tuberculoma

Procedia PDF Downloads 547
5370 An Equitable Strategy to Amend Zero-Emission Vehicles Incentives for Travelers: A Policy Review

Authors: Marie Louis

Abstract:

Even though many stakeholders are doing their best to promote public transportation around the world, many areas remain inaccessible by public transportation. Travelers purchasing and driving private vehicles can be considered a threat to all three aspects of sustainability (economic, social, and environmental). However, most studies that have simultaneously considered all three aspects of sustainability when planning and designing public transportation for a corridor have found trade-offs among them. One such trade-off was identified by examining tipping points in travel demand to question whether transit agencies and/or transportation policymakers should operate smaller buses or instead provide incentives to purchase Leadership in Energy and Environmental Design (LEED)-qualified low-emission or greener vehicles (e.g., hybrids). But how and when should the department of environmental protection (DEP) and the department of revenue (DOR) determine how large an incentive to give each traveler who lives in a zone considered accessible or inaccessible by public transportation? To answer this policy question, this study compares the greenhouse gas (GHG) emissions produced when hybrid and conventional cars are used to access public transportation stops/stations. Additionally, the study reviews the states that have already adopted low-emission vehicles (LEVs) or zero-emission vehicles (ZEVs) to reduce daily GHG emissions.

Keywords: LEED-qualified vehicles, public transit accessibility, hybrid vehicles incentives, sustainability trade-offs

Procedia PDF Downloads 183
5369 A Theoretical Framework for Conceptualizing Integration of Environmental Sustainability into Supplier Selection

Authors: Tonny Ograh, Joshua Ayarkwa, Dickson Osei-Asibey, Alex Acheampong, Peter Amoah

Abstract:

Theories are used to improve the conceptualization of research ideas; they provide valuable elucidation that helps us grasp the meaning of research findings. Nevertheless, the use of theories to underpin studies of green supplier selection in procurement decisions has attracted little attention. With the emergence of sustainable procurement, public procurement practitioners in Ghana have yet to acquire relevant knowledge on green supplier selection, owing to insufficient knowledge and the lack of appropriate frameworks. The glaring consequences of public procurers' failure to integrate environmental considerations into supplier selection motivate the adoption of a multi-theory approach to comprehend the dynamics of integrating green criteria into supplier selection. In this paper, the practicality of three theories for improving understanding of the factors that enable the integration of environmental sustainability into supplier selection is reviewed: Resource-Based Theory, Human Capital Theory, and Absorptive Capacity Theory. The review identified knowledge management, top management commitment, and environmental management capabilities as important elements for integrating environmental sustainability into supplier selection in public procurement. The theoretical review yielded a framework that conceptualizes the knowledge and capabilities practitioners need in order to incorporate environmental sustainability into supplier selection in public procurement.

Keywords: environmental sustainability, supplier selection, environmental procurement, sustainable procurement

Procedia PDF Downloads 161
5368 The Virtues and Vices of Leader Empathy: A Review of a Misunderstood Construct

Authors: John G. Vongas, Raghid Al Hajj

Abstract:

In recent years, there has been a surge in research on empathy across disciplines ranging from management and psychology to philosophy and neuroscience. In organizational behavior in particular, scholars have become interested in leader empathy given the rise of workplace diversity and the growing perception of leaders as managers of group emotions. It would appear that the current zeitgeist in behavioral and philosophical science is that empathy is a cornerstone of morality and that our world would be better off if only more people, and by extension more leaders, were empathic. In spite of these claims, however, researchers have used different terminologies to explore empathy, confusing it at times with related constructs such as emotional intelligence and compassion. Moreover, the extant research specifying what empathic leaders do, and how their behavior affects organizational stakeholders including the leaders themselves, does not derive from a unifying theoretical framework. These problems plague knowledge development in this important research domain. This paper therefore provides, to the authors' best knowledge, the first comprehensive review and synthesis of the literature on leader empathy, drawing on disparate yet complementary fields of inquiry. First, it distinguishes empathy from related constructs. Second, it presents a theoretical model that elucidates the mechanisms by which a leader's empathy translates into behaviors that could be either beneficial or harmful to the leaders themselves, as well as to their followers and groups. Third, it specifies the boundary conditions under which a leader's empathy will become manifest. Finally, it suggests ways in which training could improve empathy in practice while remaining skeptical of empathy's conceptualization as a moral, or even effective, guide in human affairs.

Keywords: compassion, empathy, leadership, group outcomes

Procedia PDF Downloads 119
5367 A Qualitative Review and Meta-Analyses of Published Literature Exploring Rates and Reasons Behind the Choice of Elective Caesarean Section in Pregnant Women With No Contraindication to Trial of Labor After One Previous Caesarean Section

Authors: Risheka Suthantirakumar, Eilish Pearson, Jacqueline Woodman

Abstract:

Background: Previous research has found a variety of rates of, and reasons for, choosing medically unindicated elective repeat cesarean section (ERCS). Understanding the frequency and reasoning behind ERCS, especially when unwarranted, could help healthcare professionals better tailor their advice and services. Our study therefore conducted meta-analyses and qualitative analyses to identify the worldwide rates of, and reasons for, choosing this procedure over trial of labor after cesarean (TOLAC), also referred to in the published literature as vaginal birth after cesarean (VBAC). Methods: We conducted a systematic review of the published literature available on PubMed, EMBASE, and science.gov, with a blinded peer-review process to assess eligibility. Search terms were created in collaboration with experts in the field, and inclusion and exclusion criteria were established before the articles were reviewed. Included studies were limited to those published in English due to author constraints, although no geographic boundaries were applied in the search. No time limit was used, in order to capture changes over time. Results: Our qualitative analyses found five consistent themes across international studies: socioeconomic and cultural differences, previous cesarean experience, perceptions of the risks of vaginal birth, patients' perceptions of future benefits, and medical advice and information. Our meta-analyses found variable rates of ERCS across international borders and within national populations; the average rate across all studies was 44% (95% CI 36-51). Discussion: The studies included in our qualitative analysis demonstrated similar recurring themes, which lends validity to the findings across the included studies. We consider the rate variation across and within national populations to be partly the result of differing inclusion criteria and eligibility assessments between studies, and we argue that a proforma should be used so that future research is comparable.
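The pooled 44% figure is the kind of output a standard proportion meta-analysis produces. As a hedged illustration, the sketch below pools raw study proportions with a fixed-effect inverse-variance model; the actual study may well have used a random-effects model and a logit or arcsine transform, so this is illustrative only.

```python
import math

def pooled_proportion(events, totals, z=1.96):
    """Fixed-effect inverse-variance pooling of study proportions.
    Returns (pooled_rate, ci_lower, ci_upper). Assumes each study's
    proportion is strictly between 0 and 1 (otherwise the binomial
    variance below is zero and a transform would be needed)."""
    weights, rates = [], []
    for e, n in zip(events, totals):
        p = e / n
        var = p * (1 - p) / n  # binomial variance of the estimate
        weights.append(1.0 / var)
        rates.append(p)
    w_sum = sum(weights)
    pooled = sum(w * p for w, p in zip(weights, rates)) / w_sum
    se = math.sqrt(1.0 / w_sum)  # standard error of the pooled rate
    return pooled, pooled - z * se, pooled + z * se
```

For a single hypothetical study of 50 ERCS choices among 100 eligible women, this returns a rate of 0.50 with a 95% CI of roughly (0.40, 0.60).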

Keywords: elective cesarean section, VBAC, TOLAC, maternal choice

Procedia PDF Downloads 104
5366 Automatic Detection of Traffic Stop Locations Using GPS Data

Authors: Areej Salaymeh, Loren Schwiebert, Stephen Remias, Jonathan Waddell

Abstract:

Extracting information from new data sources has emerged as a crucial task in many traffic planning processes, such as identifying traffic patterns, route planning, traffic forecasting, and locating infrastructure improvements. Given the advanced technologies used to collect Global Positioning System (GPS) data from dedicated GPS devices, GPS-equipped phones, and navigation tools, intelligent data analysis methodologies are necessary to mine this raw data. In this research, an automatic detection framework is proposed to identify and classify the locations of stopped GPS waypoints into two main categories: signalized intersections and highway congestion. Delaunay triangulation is used to perform this assessment in the clustering phase: while most existing clustering algorithms require assumptions about the data distribution, the effectiveness of Delaunay triangulation rests on triangulating geographical data points without such assumptions. Our proposed method starts by cleaning noise from the data and normalizing it. Next, the framework identifies stoppage points by calculating the traveled distance. The last step uses clustering to form groups of waypoints corresponding to signalized traffic and highway congestion. A binary classifier, which uses the length of each cluster, is then applied to distinguish highway congestion from signalized stop points. The proposed framework identifies the stop positions and congestion points correctly in around 99.2% of trials, showing that it is possible, even with limited GPS data, to distinguish the two categories with high accuracy.
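The clustering phase can be sketched as follows: triangulate the stopped waypoints, drop triangulation edges longer than a distance threshold, and take the connected components of the remaining graph as clusters. This sketch uses SciPy's Delaunay routine; the threshold-based edge pruning and component extraction are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_clusters(points, max_edge):
    """Cluster 2-D points: build a Delaunay triangulation, discard edges
    longer than `max_edge`, and return the connected components of the
    remaining graph as sorted index lists."""
    tri = Delaunay(points)
    adj = {i: set() for i in range(len(points))}
    for simplex in tri.simplices:
        for a in range(3):  # the three edges of each triangle
            i, j = simplex[a], simplex[(a + 1) % 3]
            if np.linalg.norm(points[i] - points[j]) <= max_edge:
                adj[i].add(j)
                adj[j].add(i)
    # depth-first search over the pruned graph
    seen, clusters = set(), []
    for start in range(len(points)):
        if start in seen:
            continue
        stack, comp = [start], []
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            comp.append(node)
            stack.extend(adj[node] - seen)
        clusters.append(sorted(comp))
    return clusters
```

Two well-separated groups of waypoints then emerge as two clusters, with no assumption about the points' distribution.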

Keywords: Delaunay triangulation, clustering, intelligent transportation systems, GPS data

Procedia PDF Downloads 261
5365 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required to evaluate and categorize vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretation and thereby reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out because it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: Neuroimaging Informatics Technology Initiative (NIfTI) and Digital Imaging and Communications in Medicine (DICOM). Purpose: This study aims to employ DL to classify COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512 but exhibited varying slice counts. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved by resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was labeled 1, while normal images were labeled 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to a potential loss of information during the conversion of the original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest area under the curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
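The resampling, HU clipping, normalization, and zero-centering steps described above can be sketched as follows; the function name, the (z, y, x) spacing convention, and the use of linear interpolation are illustrative assumptions rather than the authors' exact pipeline.

```python
import numpy as np
from scipy import ndimage

def preprocess_ct(volume, spacing, target_spacing=(1.0, 1.0, 1.0),
                  hu_window=(-1000.0, 400.0)):
    """Resample a CT volume to isotropic voxels, clip intensities to the
    HU window used in the study, scale to [0, 1], and zero-center.
    `spacing` is the source (z, y, x) voxel size in mm."""
    zoom_factors = [s / t for s, t in zip(spacing, target_spacing)]
    resampled = ndimage.zoom(volume.astype(np.float32),
                             zoom_factors, order=1)  # linear interpolation
    lo, hi = hu_window
    clipped = np.clip(resampled, lo, hi)
    normalized = (clipped - lo) / (hi - lo)  # scale HU window to [0, 1]
    return normalized - normalized.mean()    # zero-center
```

A 4 × 4 × 4 volume with 2 mm slices and 1 mm in-plane resolution, for instance, resamples to 8 × 4 × 4 at 1 mm isotropic before normalization.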

Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format

Procedia PDF Downloads 63
5364 Constraints to Partnership Based Financing in Islamic Banks: A Systematic Review of Literature

Authors: Muhammad Nouman, Salim Gul, Karim Ullah

Abstract:

Partnership has been understood as the essence of Islamic banking. In practice, however, the non-partnership paradigm dominates the operations of Islamic banks. Islamic banks adopt partnership contracts for the deposit scheme, especially for term deposit accounts, but they do not adopt partnership contracts (i.e., Musharakah and Mudarabah) as the main financing scheme; in practice, non-partnership contracts, including Murabahah and Ijara, are widely used for financing. Many authors have offered different explanations for the limited use of partnership contracts as a financing scheme, yet a typology of the constraints remains missing. The extant literature is scattered, with diverse studies focused on different dimensions of the issue, so there is no unified understanding of the constraints on the application of partnership contracts. This paper aims to highlight the major factors hindering the application of partnership contracts and to produce a coherent view by synthesizing the explanations provided in studies conducted around the globe. The present study draws insights from the extant literature using a systematic review and provides academia, practitioners, and policymakers with a holistic framework to name and make sense of what makes partnership contracts a less attractive option for Islamic banks. A total of 84 relevant publications, including 11 books, 14 chapters of edited books, 48 journal articles, 8 conference papers, and 3 IMF working papers, were selected using a systematic procedure. Analysis of these publications followed three steps: (i) the constraints explicitly appearing in the set of 84 publications were extracted; (ii) 27 factors hindering the application of partnership contracts were identified from these constraints, with overlapping items either eliminated or combined; and (iii) the identified factors were classified into three distinct categories. Our intention was to develop a typology of constraints by connecting rather abstract concepts into broader sets of constraints for better conceptualization and clearer policy implications. Our framework highlights three main facets of the low preference for partnership contracts in financing. First, several factors in contemporary business settings, the prevailing social setting, and the bank's internal environment create uncertainty about the success of partnership contracts for financing. Second, partnership contracts face lower demand: entrepreneurs prefer non-partnership contracts to finance their ventures because of the inherently restrictive characteristics of partnership contracts. Finally, certain factors in the regulatory framework restrain the extensive use of partnership contracts for financing by Islamic banks. The present study contributes to the Islamic banking literature in many ways: it clarifies the heavily criticized operations of Islamic banks, integrates the scattered literature, provides a holistic framework for better conceptualization of the key constraints on the application of partnership contracts along with policy implications, and demonstrates an application of systematic review in Islamic banking research.

Keywords: Islamic banking, Islamic finance, Mudarabah, Musharakah, partnership, systematic review

Procedia PDF Downloads 257