Search results for: constrained clustering
508 Multidimensional Item Response Theory Models for Practical Application in Large Tests Designed to Measure Multiple Constructs
Authors: Maria Fernanda Ordoñez Martinez, Alvaro Mauricio Montenegro
Abstract:
This work presents a statistical methodology for measuring and establishing constructs in Latent Semantic Analysis. The approach combines the strengths of Factor Analysis for binary data with the interpretability offered by Item Response Theory. More precisely, we propose first reducing dimensionality by applying Principal Component Analysis to the linguistic data and then producing group axes from a clustering analysis of the semantic data. This allows the user to give meaning to the resulting clusters and to uncover the real latent structure present in the data. The methodology is applied to a set of real semantic data, with strong results in terms of coherence, speed, and precision.
Keywords: semantic analysis, factorial analysis, dimension reduction, penalized logistic regression
Procedia PDF Downloads 444
507 Resource Constrained Time-Cost Trade-Off Analysis in Construction Project Planning and Control
Authors: Sangwon Han, Chengquan Jin
Abstract:
Time-cost trade-off (TCTO) analysis is one of the most significant parts of construction project management. Despite this significance, current TCTO analysis, based on the Critical Path Method (CPM), does not consider resource constraints and accordingly sometimes generates schedules that are impractical and/or infeasible in terms of resource availability. Therefore, resource constraints need to be considered when performing TCTO analysis. In this research, a genetic algorithm (GA) based optimization model is created in order to find the optimal schedule. This model is used to compare four distinct scenarios (i.e., 1) the initial CPM schedule, 2) TCTO without resource constraints, 3) resource allocation after TCTO, and 4) TCTO with resource constraints) in terms of duration, cost, and resource utilization. The comparison identifies 'TCTO with resource constraints' as the scenario that generates the optimal schedule with respect to duration, cost, and resources. This verifies the need to consider resource constraints when performing TCTO analysis. The proposed model is expected to produce more feasible and optimal schedules.
Keywords: time-cost trade-off, genetic algorithms, critical path, resource availability
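A GA of this kind can be sketched in miniature. The toy below is hypothetical (made-up activities, crash costs, and deadline; a serial activity chain, so project duration is simply the sum of activity durations rather than a CPM network), and a penalty term stands in for the feasibility checks a real model would perform:

```python
import random

# Hypothetical activities: (normal_days, crash_limit_days, normal_cost, crash_cost_per_day)
ACTIVITIES = [(10, 6, 100, 30), (8, 5, 80, 25), (12, 8, 150, 40)]
DEADLINE = 24          # the project must finish within 24 days
DELAY_PENALTY = 1000   # heavy penalty per day late (stand-in for infeasibility)

def fitness(genes):
    # genes[i] = number of days activity i is crashed
    duration = sum(n - g for (n, _, _, _), g in zip(ACTIVITIES, genes))
    cost = sum(nc + g * cc for (_, _, nc, cc), g in zip(ACTIVITIES, genes))
    return cost + max(0, duration - DEADLINE) * DELAY_PENALTY

def evolve(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    span = [n - c for n, c, _, _ in ACTIVITIES]  # max crashable days per activity
    pop = [[rng.randint(0, s) for s in span] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                 # lower penalized cost is better
        survivors = pop[:pop_size // 2]       # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(span))  # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(len(span))       # point mutation
            child[i] = rng.randint(0, span[i])
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
```

With the numbers above, the cheapest way to meet the 24-day deadline is to crash the cheapest activities first; the GA discovers this without enumerating the schedule space.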
Procedia PDF Downloads 188
506 Dissimilarity-Based Coloring for Symbolic and Multivariate Data Visualization
Authors: K. Umbleja, M. Ichino, H. Yaguchi
Abstract:
In this paper, we propose a coloring method for multivariate data visualization using parallel coordinates, based on dissimilarity and tree-structure information gathered during hierarchical clustering. The proposed method extends proximity-based coloring, which suffers from a few undesired side effects when the hierarchical tree structure is not a balanced tree. We describe the algorithm for assigning colors based on dissimilarity information, show the application of the proposed method on three commonly used datasets, and compare the results with proximity-based coloring. We found the proposed method to be especially beneficial for symbolic data visualization, where many individual objects have already been aggregated into a single symbolic object.
Keywords: data visualization, dissimilarity-based coloring, proximity-based coloring, symbolic data
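The core idea, letting dissimilarity rather than mere leaf order drive the color spread, can be illustrated minimally. This sketch is not the authors' algorithm: it uses made-up 1-D feature values and spreads hues so that the hue gap between neighbouring objects is proportional to their dissimilarity:

```python
import colorsys

# Toy 1-D feature values for eight objects (hypothetical data).
values = [0.0, 0.1, 0.15, 0.5, 0.55, 0.9, 0.95, 1.0]

def dissimilarity_hues(vals):
    """Assign each object a hue so that the hue gap between neighbours
    in sorted order is proportional to their dissimilarity (distance)."""
    order = sorted(range(len(vals)), key=lambda i: vals[i])
    gaps = [abs(vals[order[k + 1]] - vals[order[k]]) for k in range(len(order) - 1)]
    total = sum(gaps) or 1.0
    hues = {}
    acc = 0.0
    for k, idx in enumerate(order):
        hues[idx] = 0.8 * acc / total   # cap at 0.8 so the hue circle doesn't wrap
        if k < len(gaps):
            acc += gaps[k]
    return hues

hues = dissimilarity_hues(values)
rgb = {i: colorsys.hsv_to_rgb(h, 0.9, 0.9) for i, h in hues.items()}
```

Objects that are nearly identical receive nearly identical colors, while a large dissimilarity produces a visible hue jump, which is the property proximity-based coloring loses on unbalanced trees.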
Procedia PDF Downloads 171
505 Discovering the Effects of Meteorological Variables on the Air Quality of Bogota, Colombia, by Data Mining Techniques
Authors: Fabiana Franceschi, Martha Cobo, Manuel Figueredo
Abstract:
Bogotá, the capital of Colombia, is its largest city and one of the most polluted in Latin America due to fast economic growth over the last ten years. Bogotá has been affected by high pollution events leading to concentrations of PM10 and NO2 that exceed the local 24-hour legal limits (100 and 150 µg/m³, respectively). The most important pollutants in the city are PM10 and PM2.5 (which are associated with respiratory and cardiovascular problems), and it is known that their atmospheric concentrations depend on local meteorological factors. Therefore, it is necessary to establish relationships between the meteorological variables and the concentrations of atmospheric pollutants such as PM10, PM2.5, CO, SO2, NO2, and O3. This study aims to determine the interrelations between meteorological variables and air pollutants in Bogotá using data mining techniques. Data from 13 monitoring stations were collected from the Bogotá Air Quality Monitoring Network for the period 2010-2015. The Principal Component Analysis (PCA) algorithm was applied to obtain primary relations between all the parameters, and afterwards, the K-means clustering technique was implemented to corroborate the relations found previously and to find patterns in the data. PCA was also applied on a per-shift basis (morning, afternoon, night, and early morning) to check for variation of the previous trends, and on a per-year basis to verify that the identified trends remained throughout the study period. Results demonstrated that wind speed, wind direction, temperature, and NO2 are the factors with the most influence on PM10 concentrations. Furthermore, it was confirmed that high-humidity episodes increased PM2.5 levels. O3 levels were directly proportional to wind speed and radiation, and inversely related to humidity.
Concentrations of SO2 increase with the presence of PM10 and decrease with wind speed and wind direction. The results also showed a decreasing trend in pollutant concentrations over the last five years, and in rainy periods (March-June and September-December) some precipitation-related trends were stronger. Results obtained with K-means demonstrated that it was possible to find patterns in the data, showing similar conditions and data distributions among the Carvajal, Tunal, and Puente Aranda stations, and between Parque Simón Bolívar and Las Ferias. Applying the same technique per year verified that the aforementioned trends prevailed during the study period. It was concluded that the PCA algorithm is useful for establishing preliminary relationships among variables, and K-means clustering for finding patterns in the data and understanding their distribution. The discovered clusters can serve as input to an Artificial Neural Network prediction model.
Keywords: air pollution, air quality modelling, data mining, particulate matter
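The clustering step of such a pipeline can be sketched minimally. The data below are made up (two hypothetical regimes: low wind with high PM10 versus high wind with low PM10), not the study's monitoring records, and the implementation is a bare-bones K-means in pure Python:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's algorithm on 2-D points; returns (centers, labels)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # assignment step: each point joins its nearest centre
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            groups[j].append(p)
        # update step: move each centre to the mean of its group
        for j, g in enumerate(groups):
            if g:
                centers[j] = (sum(p[0] for p in g) / len(g),
                              sum(p[1] for p in g) / len(g))
    labels = [min(range(k),
                  key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
              for p in points]
    return centers, labels

# Hypothetical (wind speed m/s, PM10 µg/m³) readings: two distinct regimes.
data = [(1.0, 95), (1.2, 90), (0.8, 99), (6.0, 30), (6.3, 28), (5.8, 35)]
centers, labels = kmeans(data, 2)
```

In a real application the inputs would be the PCA-reduced station measurements, and each resulting cluster would group stations with similar meteorological/pollution conditions.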
Procedia PDF Downloads 259
504 A Novel Guided Search Based Multi-Objective Evolutionary Algorithm
Authors: A. Baviskar, C. Sandeep, K. Shankar
Abstract:
Solving multi-objective optimization problems requires faster convergence and better spread. Though existing Evolutionary Algorithms (EAs) are able to achieve this, the computational effort can be further reduced by hybridizing them with innovative strategies. This study focuses on converging to the Pareto front faster while adopting the advantages of Strength Pareto Evolutionary Algorithm-II (SPEA-II) for a better spread. Two different approaches based on optimizing the objective functions independently are implemented. In the first method, the decision variables corresponding to the optima of individual objective functions are strategically used to guide the search towards the Pareto front. In the second method, boundary points of the Pareto front are calculated and their decision variables are seeded into the initial population. Both methods are applied to different constrained and unconstrained multi-objective test functions. It is observed that the proposed guided search based algorithm gives better convergence and diversity than several well-known existing algorithms (such as NSGA-II and SPEA-II) in considerably fewer iterations.
Keywords: boundary points, evolutionary algorithms (EAs), guided search, strength pareto evolutionary algorithm-II (SPEA-II)
Procedia PDF Downloads 277
503 Seismicity and Source Parameter of Some Events in Abu Dabbab Area, Red Sea Coast
Authors: Hamed Mohamed Haggag
Abstract:
Prior to 12 November 1955, no earthquakes had been reported from the Abu Dabbab area in the International Seismological Centre (ISC) catalogue. The largest earthquake in the Abu Dabbab area occurred on November 12, 1955 with magnitude Mb 6.0. The closest station to the epicenter was at Helwan (about 700 km to the north), so the depth of this event is not constrained, and no foreshocks or aftershocks were recorded. Two other earthquakes of magnitude Mb 4.5 and 5.2 took place in the same area on March 02, 1982 and July 02, 1984, respectively. Since the installation of the Aswan Seismic Network stations in 1982 (250-300 km to the south-west of the Abu Dabbab area), followed by the Egyptian National Seismic Network stations, it has been possible to record activity from the Abu Dabbab area. The earthquakes recorded at Abu Dabbab from 1982 to 2014 show that the epicenters are distributed along the main fault trends of the area, which run parallel to the Red Sea coast. Spectral analysis was performed for some earthquakes, and the source parameters, seismic moment (Mo), source dimension (r), stress drop (Δδ), and apparent stress (δ), were determined for these events. The spectral analysis was completed using the MAG software program.
Keywords: Abu Dabbab, seismicity, seismic moment, source parameter
Procedia PDF Downloads 462
502 A Linear Programming Approach to Assist Roster Construction Under a Salary Cap
Authors: Alex Contarino
Abstract:
Professional sports leagues often have a “free agency” period, during which teams may sign players with expiring contracts. To promote parity, many leagues operate under a salary cap that limits the amount teams can spend on players' salaries in a given year. Similarly, in fantasy sports leagues, salary cap drafts are a popular method for selecting players. In order to sign a free agent in either setting, teams must bid against one another to buy the player's services while ensuring the sum of their players' salaries stays below the salary cap. This paper models the bidding process for a free agent as a constrained optimization problem that can be solved using linear programming. The objective is to determine the largest bid that a team should offer the player, subject to the constraint that the value of signing the player must exceed the value of using the salary cap elsewhere. Iteratively solving this optimization problem for each available free agent provides teams with an effective framework for maximizing the talent on their rosters. The utility of this approach is demonstrated for team sport roster construction and fantasy sport drafts, using recent data sets from both settings.
Keywords: linear programming, optimization, roster management, salary cap
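The paper's LP itself is not reproduced here, but the binding constraint (the value of signing the player must exceed the value of spending the same cap dollars elsewhere) yields a simple bid ceiling in a deliberately reduced setting. All valuations below are hypothetical, and a single best-alternative rate stands in for a full linear program:

```python
# Hypothetical valuations: expected wins added per player and their asking salaries ($M).
alternatives = [
    {"name": "Player B", "value": 4.0, "salary": 8.0},
    {"name": "Player C", "value": 2.5, "salary": 4.0},
]

def max_bid(player_value, cap_room, alternatives):
    """Largest salary worth offering: the bid at which signing the player
    stops beating the best alternative use of the same cap dollars."""
    # best value per cap dollar among alternatives that fit under the remaining cap
    rates = [a["value"] / a["salary"] for a in alternatives if a["salary"] <= cap_room]
    best_rate = max(rates, default=0.0)
    if best_rate == 0.0:
        return cap_room                     # no alternative: bid up to the cap room
    # bid * best_rate must not exceed player_value, and the bid must fit the room
    return min(cap_room, player_value / best_rate)

bid = max_bid(player_value=6.0, cap_room=12.0, alternatives=alternatives)
```

Here Player C offers 0.625 wins per $M, so bidding more than 6.0 / 0.625 = $9.6M on a 6-win player would destroy value relative to spending the cap elsewhere. The actual LP generalizes this by optimizing over whole roster combinations at once.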
Procedia PDF Downloads 111
501 Using Technology to Deliver and Scale Early Childhood Development Services in Resource Constrained Environments: Case Studies from South Africa
Authors: Sonja Giese, Tess N. Peacock
Abstract:
South African based Innovation Edge is experimenting with technology to drive positive behavior change, enable data-driven decision making, and scale quality early years services. This paper uses five case studies to illustrate how technology can be used in resource-constrained environments: first, to encourage parenting practices that build early language development (using a stage-based mobile messaging pilot, ChildConnect); second, to improve the quality of ECD programs (using a mobile application, CareUp); third, to affordably scale services for the early detection of visual and hearing impairments (using a mobile tool, HearX); fourth, to build a transparent and accountable system for the registration and funding of ECD (using a blockchain-enabled platform, Amply); and finally, to enable rapid data collection and feedback to facilitate quality enhancement of programs at scale (the Early Learning Outcomes Measure, ELOM). ChildConnect and CareUp were both developed using a design-based iterative research approach, and their usage and uptake were evaluated with qualitative and quantitative methods. Actual child outcomes were not measured in the initial pilots. Although parents who used and engaged with either platform felt more supported and informed, parent engagement and usage remain a challenge. This is in contrast to ECD practitioners using CareUp, who showed both sustained engagement and knowledge improvement. HearX is an easy-to-use tool to identify hearing loss and visual impairment. The tool was tested with 10,000 children in an informal settlement, demonstrating the feasibility of cost-effectively decentralising screening services, although practical and financial barriers remain with respect to parental consent and successful referrals. Amply uses mobile and blockchain technology to increase the impact and accountability of public services.
In the pilot project, Amply is being used to replace an existing paper-based system to register children for a government-funded pre-school subsidy in South Africa. ELOM defines what it means for a child to be developmentally ‘on track’ at age 50-69 months. ELOM administration is enabled via a tablet, which allows for easy and accurate data collection, transfer, analysis, and feedback, and it is being used extensively to drive quality enhancement of ECD programs across multiple modalities. ECD services in South Africa are in large part provided by disconnected private individuals or non-governmental organizations (in contrast to basic education, which is publicly provided by the government). It is a disparate sector, which makes scaling successful interventions that much harder. All five interventions show the potential of technology to support and enhance a range of ECD services, but pathways to scale are still being tested.
Keywords: assessment, behavior change, communication, data, disabilities, mobile, scale, technology, quality
Procedia PDF Downloads 135
500 Convex Restrictions for Outage Constrained MU-MISO Downlink under Imperfect Channel State Information
Authors: A. Preetha Priyadharshini, S. B. M. Priya
Abstract:
In this paper, we consider the MU-MISO downlink scenario under imperfect channel state information (CSI). The main issue under imperfect CSI is to keep the outage probability of each user's achievable rate below a given threshold level. Such rate outage constraints present significant analytical challenges. Many probabilistic methods are used to solve the transmit optimization problem under imperfect CSI. Here, decomposition-based large deviation inequality and Bernstein-type inequality convex restriction methods are used to perform the optimization under imperfect CSI. These methods achieve improved output quality with lower complexity, providing a safe tractable approximation of the original rate outage constraints. Based on implementations of these methods, performance has been evaluated in terms of feasible rate and average transmission power. The simulation results show that both methods offer significantly improved outage quality and lower computational complexity.
Keywords: imperfect channel state information, outage probability, multiuser multi-input single-output, channel state information
Procedia PDF Downloads 814
499 Design of Personal Job Recommendation Framework on Smartphone Platform
Authors: Chayaporn Kaensar
Abstract:
Recently, job recommender systems have gained much attention in industry since they solve the problem of information overload on recruiting websites. We therefore propose an Extended Personalized Job System that is capable of providing appropriate jobs for job seekers and recommending suitable information for them using data mining techniques and a dynamic user profile. On the other hand, companies can also interact with the system to publish and update job information. The system supports multiple platforms, including a web application and an Android mobile application. In this paper, user profiles, implicit user actions, user feedback, and clustering techniques from the WEKA libraries are employed and implemented for this application. In addition, open source tools such as the Yii web application framework, the Bootstrap front-end framework, and Android mobile technology were also applied.
Keywords: recommendation, user profile, data mining, web and mobile technology
Procedia PDF Downloads 313
498 The Role of Initiator in the Synthesis of Poly(Methyl Methacrylate)-Layered Silicate Nanocomposites through Bulk Polymerization
Authors: Tsung-Yen Tsai, Naveen Bunekar, Ming Hsuan Chang, Wen-Kuang Wang, Satoshi Onda
Abstract:
The structure-property relationship and the effect of initiator on bulk-polymerized poly(methyl methacrylate) (PMMA)-organomodified layered silicate nanocomposites were investigated. In this study, we used 2,2'-azobis(4-methoxy-2,4-dimethylvaleronitrile) and benzoyl peroxide initiators for bulk polymerization. The morphology of the bulk-polymerized nanocomposites was investigated by X-ray diffraction and transmission electron microscopy. The type of initiator strongly influences the physicochemical properties of the polymer nanocomposite. The thermal degradation of PMMA in the presence of the nanofiller was studied: the 5 wt% weight loss temperature (T5d) increased as compared to pure PMMA, and the peak degradation temperature increased for the nanocomposites. Differential scanning calorimetry and dynamic mechanical analysis were performed to investigate the glass transition temperature and the nature of the constrained region as the reinforcement mechanism, respectively. Furthermore, optical properties of the nanocomposites, such as UV-Vis transmission and total luminous transmission, were examined.
Keywords: initiator, bulk polymerization, layered silicates, methyl methacrylate
Procedia PDF Downloads 292
497 Extracting Actions with Improved Part of Speech Tagging for Social Networking Texts
Authors: Yassine Jamoussi, Ameni Youssfi, Henda Ben Ghezala
Abstract:
With the growing interest in social networking, the interaction of social actors has evolved into a source of knowledge on which it becomes possible to perform context-aware reasoning. Information extraction from social networks, especially Twitter and Facebook, is one of the open problems in this area. To extract text from social networks, we need several lexical features and large-scale word clustering. We attempt to expand an existing tokenizer and to develop our own tagger in order to support the misspelled and non-standard words common on Facebook and Twitter. Our goal in this work is to benefit from the lexical features developed for Twitter and online conversational text in previous works, and to develop an extraction model for constructing a large knowledge base of actions.
Keywords: social networking, information extraction, part-of-speech tagging, natural language processing
Procedia PDF Downloads 305
496 Authentication Based on Hand Movement by Low Dimensional Space Representation
Authors: Reut Lanyado, David Mendlovic
Abstract:
Most biometric methods for authentication require special equipment, and some of them are easy to fake. We propose a method for authentication based on hand movement while typing a sentence, captured with a regular camera. This technique uses the full video of the hand, which is harder to fake. In the first phase, we track the hand joints in each frame. Next, we represent a single frame for each individual using our Pose Agnostic Rotation and Movement (PARM) dimensional space. Then, we represent a full video of hand movement in a fixed low-dimensional space using Fixed Dimension Video by Interpolation Statistics (FDVIS). Finally, we identify each individual from the FDVIS representation using unsupervised clustering and supervised methods. Accuracy exceeds 96% for 80 individuals when using supervised KNN.
Keywords: authentication, feature extraction, hand recognition, security, signal processing
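The final supervised step can be illustrated with a minimal KNN classifier. The 2-D "FDVIS-style" descriptors below are made up for illustration; real descriptors would be the fixed-dimension video representations:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote among k nearest."""
    nearest = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical low-dimensional descriptors for typing videos of two users.
train = [((0.10, 0.20), "alice"), ((0.15, 0.22), "alice"), ((0.12, 0.18), "alice"),
         ((0.90, 0.80), "bob"),   ((0.88, 0.85), "bob"),   ((0.93, 0.79), "bob")]

user = knn_predict(train, (0.11, 0.21))
```

A query video is accepted as a given identity only if its descriptor's nearest neighbours vote for that identity, which is how the reported 96% accuracy figure would be computed over held-out videos.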
Procedia PDF Downloads 129
495 Capacitated Multiple Allocation P-Hub Median Problem on a Cluster Based Network under Congestion
Authors: Çağrı Özgün Kibiroğlu, Zeynep Turgut
Abstract:
This paper considers a hub location problem where the network service area is partitioned into predetermined zones (represented by given node clusters) and the capacity levels of potential hub nodes are determined a priori as a hub selection criterion, in order to investigate the effect of congestion on the network. The objective is to design the hub network by determining all required hub locations in the node clusters and allocating non-hub nodes to hubs such that the total cost, including transportation cost, the opening cost of hubs, and a penalty cost for exceeding capacity levels at hubs, is minimized. A mixed integer linear programming model is developed by introducing additional constraints to the traditional model of the capacitated multiple allocation hub location problem, and it is empirically tested.
Keywords: hub location problem, p-hub median problem, clustering, congestion
Procedia PDF Downloads 494
494 Diagnose of the Future of Family Businesses Based on the Study of Spanish Family Businesses Founders
Authors: Fernando Doral
Abstract:
Family businesses are a key phenomenon within the business landscape. Nevertheless, the term involves two concepts (“family” and “business”) that are nowadays rapidly evolving. Consequently, it is not easy to diagnose whether the family business will be a growing or a declining phenomenon, which is the objective of this study. For that purpose, a sample of 50 established Spanish companies from various sectors was taken. Different factors were identified for each enterprise, related to the profile of the founders, such as age, the number of sons and daughters, or the support received from the family when starting up. That information was used as input to a clustering method to identify groups, which could help define the founders' profiles. This characterization served as a basis for identifying three factors whose evolution should be analyzed: family structures, the business landscape, and entrepreneurs' motivations. The analysis of the evolution of these three factors seems to indicate a negative tendency for family businesses. The consequent diagnosis of this study is therefore to consider the family business a declining phenomenon.
Keywords: business diagnose, business trends, family business, family business founders
Procedia PDF Downloads 209
493 Data Mining Techniques for Anti-Money Laundering
Authors: M. Sai Veerendra
Abstract:
Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved from the cliché of drug trafficking to financing terrorism, not forgetting personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud activities. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaboration project on developing a new data mining solution for AML units in an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also give an overview of the important factors in building data mining solutions for AML activities.
Keywords: data mining, clustering, money laundering, anti-money laundering solutions
Procedia PDF Downloads 539
492 Simulation Approach for a Comparison of Linked Cluster Algorithm and Clusterhead Size Algorithm in Ad Hoc Networks
Authors: Ameen Jameel Alawneh
Abstract:
A mobile ad hoc network (MANET) is a collection of wireless mobile hosts that dynamically form a temporary network without the aid of a system administrator; it has neither fixed infrastructure nor pre-established sessions. A transmission inherently reaches several nodes, and each node functions as both a host and a router. The network may be represented as a set of clusters, each managed by a clusterhead. The cluster size is not fixed and depends on the movement of nodes. We propose a clusterhead size algorithm (CHSize), a clustering algorithm that can be used by several routing algorithms for ad hoc networks. An elected clusterhead is assigned for communication with all other clusters. Analysis and simulation of the algorithm were implemented using the GloMoSim network simulator, MATLAB, and Maple 11, and showed that the proposed algorithm achieves its goals.
Keywords: simulation, MANET, ad hoc, clusterhead size, linked cluster algorithm, loss and dropped packets
Procedia PDF Downloads 396
491 Optimal Economic Restructuring Aimed at an Optimal Increase in GDP Constrained by a Decrease in Energy Consumption and CO2 Emissions
Authors: Alexander Vaninsky
Abstract:
The objective of this paper is to find a way of economic restructuring (that is, a change in the shares of sectoral gross outputs) that results in the maximum possible increase in the gross domestic product (GDP) combined with decreases in energy consumption and CO2 emissions. It uses an input-output model for GDP and factorial models for energy consumption and CO2 emissions to determine the projections of the gradient of GDP, and of the antigradients of energy consumption and CO2 emissions, respectively, onto a subspace formed by the structure-related variables. Since the gradient (antigradient) provides the direction of the steepest increase (decrease) of an objective function, and the projections retain this property for the function's restriction to the subspace, each of the three directional vectors solves a particular problem of optimal structural change. In the next step, a type of factor analysis is applied to find a convex combination of the projected gradient and antigradients having the maximal possible positive correlation with each of the three. This convex combination provides the desired direction of structural change. The national economy of the United States is used as an example of application.
Keywords: economic restructuring, input-output analysis, Divisia index, factorial decomposition, E3 models
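The projection step described above can be written compactly. Assuming, for illustration only, that the structure-related subspace is the null space of a linear constraint matrix $A$ (the paper defines it via its structure-related variables), the projected directions are

```latex
P = I - A^{\top}\left(A A^{\top}\right)^{-1} A, \qquad
d_{\mathrm{GDP}} = P\,\nabla f_{\mathrm{GDP}}, \quad
d_{E} = -P\,\nabla f_{E}, \quad
d_{\mathrm{CO_2}} = -P\,\nabla f_{\mathrm{CO_2}},
```

and the desired direction of structural change is the convex combination $d = \alpha_1 d_{\mathrm{GDP}} + \alpha_2 d_{E} + \alpha_3 d_{\mathrm{CO_2}}$ with $\alpha_i \ge 0$, $\sum_i \alpha_i = 1$, chosen to maximize the positive correlation of $d$ with each of the three component directions.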
Procedia PDF Downloads 314
490 Performance Prediction Methodology of Slow Aging Assets
Authors: M. Ben Slimene, M.-S. Ouali
Abstract:
Asset management of urban infrastructure faces a multitude of challenges that need to be overcome to obtain a reliable measurement of performance. Predicting the performance of slowly aging systems is one of those challenges; it helps the asset manager investigate specific failure modes and undertake the appropriate maintenance and rehabilitation interventions to avoid catastrophic failures as well as to optimize maintenance costs. This article presents a methodology for modeling the deterioration of slowly degrading assets based on an operating history. It consists of extracting degradation profiles by grouping together assets that exhibit similar degradation sequences, using an unsupervised classification technique derived from artificial intelligence. The obtained clusters are used to build the performance prediction models. The methodology is applied to a sample stormwater drainage culvert dataset.
Keywords: artificial intelligence, clustering, culvert, regression model, slow degradation
Procedia PDF Downloads 112
489 Unsupervised Learning with Self-Organizing Maps for Named Entity Recognition in the CONLL2003 Dataset
Authors: Assel Jaxylykova, Alexander Pak
Abstract:
This study utilized a Self-Organizing Map (SOM) for unsupervised learning on the CONLL-2003 dataset for Named Entity Recognition (NER). The process involved encoding words into 300-dimensional vectors using FastText. These vectors were input into a SOM grid, where training adjusted node weights to minimize distances. The SOM provided a topological representation for identifying and clustering named entities, demonstrating its efficacy without labeled examples. Results showed an F1-measure of 0.86, highlighting the SOM's viability. Although some methods achieve higher F1-measures, the SOM eliminates the need for labeled data, offering a scalable and efficient alternative. The SOM's ability to uncover hidden patterns provides insights that could enhance existing supervised methods. Further investigation into potential limitations and optimization strategies is suggested to maximize benefits.
Keywords: named entity recognition, natural language processing, self-organizing map, CONLL-2003, semantics
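The SOM training loop can be sketched in miniature. This is not the study's setup (which used a 2-D grid over 300-dimensional FastText vectors); it is a hypothetical 1-D map over four 2-D points standing in for two embedding clusters:

```python
import math
import random

def train_som(data, grid=4, iters=200, seed=0):
    """Tiny 1-D SOM: `grid` nodes, each a 2-D weight vector, trained on `data`."""
    rng = random.Random(seed)
    weights = [[rng.random(), rng.random()] for _ in range(grid)]
    for t in range(iters):
        x = data[t % len(data)]
        lr = 0.5 * (1 - t / iters)                       # decaying learning rate
        radius = max(1.0, grid / 2 * (1 - t / iters))    # shrinking neighbourhood
        # best-matching unit: node whose weights are closest to the input
        bmu = min(range(grid), key=lambda i: math.dist(weights[i], x))
        for i in range(grid):
            h = math.exp(-((i - bmu) ** 2) / (2 * radius ** 2))  # neighbourhood kernel
            weights[i][0] += lr * h * (x[0] - weights[i][0])
            weights[i][1] += lr * h * (x[1] - weights[i][1])
    return weights

# Two hypothetical embedding clusters (e.g. person-like vs location-like vectors).
data = [(0.1, 0.1), (0.12, 0.08), (0.9, 0.9), (0.88, 0.92)]
weights = train_som(data)
```

After training, distinct map nodes specialize on distinct clusters, which is the topological organization that lets entity types be read off the grid without labels.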
Procedia PDF Downloads 50
488 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis
Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai
Abstract:
The purpose of this study is to forecast the influences of Information and Communication Technology (ICT) on structural changes in the Japanese economy based on Leontief Input-Output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of input-output coefficients, and the statistical significance of the model is then tested by the Likelihood Ratio Test (LRT). In our model, ICT is represented by two explanatory variables, i.e., computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that they had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The future Japanese economic structure projected from the above forecast exhibits the differentiated direct and indirect outcomes of ICT penetration.
Keywords: forecast, ICT, industrial structural changes, statistical analysis
Procedia PDF Downloads 375
487 Exploring the Nature and Meaning of Theory in the Field of Neuroeducation Studies
Authors: Ali Nouri
Abstract:
Neuroeducation is one of the most exciting research fields and is continually evolving. However, there is a need to develop its theoretical bases in connection with practice. The present paper is a starting attempt in this regard to provide a space from which to think about neuroeducational theory and to invoke more investigation in this area. Accordingly, a comprehensive theory of neuroeducation could be defined as a grouping or clustering of concepts and propositions that describe and explain the nature of human learning, providing valid interpretations and implications useful for educational practice in relation to philosophical aspects or values. While it should originate from the philosophical foundations of the field and explain its normative significance, it needs to be testable in terms of rigorous evidence in order to fundamentally advance contemporary educational policy and practice. There is thus a pragmatic need to include a course on neuroeducational theory in the curriculum of the field. In addition, there is a need to articulate and disseminate considerable discussion of the subject within professional journals and academic societies.
Keywords: neuroeducation studies, neuroeducational theory, theory building, neuroeducation research
Procedia PDF Downloads 449
486 Good Banks, Bad Banks, and Public Scrutiny: The Determinants of Corporate Social Responsibility in Times of Financial Volatility
Authors: A. W. Chalmers, O. M. van den Broek
Abstract:
This article examines the relationship between the global financial crisis and the corporate social responsibility (CSR) activities of financial services firms. It challenges the general consensus in existing studies that firms, when faced with economic hardship, tend to jettison CSR commitments. Instead, building on recent insights into the institutional determinants of CSR, it is argued that firms are constrained in their ability to abandon CSR by the extent to which they are subject to intense public scrutiny by regulators and the news media. This argument is tested in the context of the European sovereign debt crisis, drawing on a unique dataset of 170 firms in 15 different countries over a six-year period. Controlling for a battery of alternative explanations and comparing financial service providers to firms operating in other economic sectors, the results provide considerable evidence supporting the main argument: rather than abandoning CSR during times of economic hardship, financial industry firms ramp up their CSR commitments in order to manage their public image and foster public trust in light of intense public scrutiny.
Keywords: corporate social responsibility (CSR), public scrutiny, global financial crisis, financial services firms
Procedia PDF Downloads 307
485 Smart Beta Portfolio Optimization
Authors: Saud Al Mahdi
Abstract:
Traditionally, portfolio managers have been discouraged from timing the market. This means, for example, that equity managers have been forced to adhere strictly to a benchmark with static or relatively stable components, such as the S&P 500 or the Russell 3000. The portfolio's exposures to all risk factors should then mimic as closely as possible the corresponding exposures of the benchmark. The main risk factor, of course, is the market itself; effectively, a long-only portfolio would be constrained to have a beta of 1. More recently, however, managers have been given greater discretion to adjust their portfolio's risk exposures (in particular, the beta of their portfolio) dynamically to match their beliefs about the future performance of the risk factors themselves. This freedom translates into the manager's ability to adjust the portfolio's beta dynamically, and these strategies have come to be known as smart beta strategies. Adjusting beta dynamically amounts to attempting to "time" the market: to increase exposure when one anticipates that the market will rise, and to decrease it when one anticipates that the market will fall. Traditionally, market timing has been believed to be impossible to perform effectively and consistently. Moreover, if a majority of market participants attempt it, their combined actions could destabilize the market. The aim of this project is to investigate so-called smart beta strategies to determine whether they really can add value, or whether they are merely marketing gimmicks used to sell dubious investment strategies.
Keywords: beta, alpha, active portfolio management, trading strategies
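The central quantity here, the portfolio beta, can be sketched in a few lines. The returns below are simulated and the "bearish" target of 0.8 is an arbitrary illustration, not a strategy from the paper; the sketch only shows how scaling a position rescales the estimated beta to a dynamic target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily returns (hypothetical): market factor plus idiosyncratic noise.
market = rng.normal(0.0005, 0.01, 1000)
asset = 1.3 * market + rng.normal(0.0, 0.005, 1000)   # constructed with beta near 1.3

# Estimated beta = Cov(asset, market) / Var(market), with matching ddof.
beta = np.cov(asset, market)[0, 1] / np.var(market, ddof=1)

# Dynamic overlay: scale the position so the portfolio beta hits a chosen
# target (0.8 here, e.g. when the manager is bearish); the remainder of the
# capital is assumed to sit in cash, which has beta 0.
target_beta = 0.8
weight = target_beta / beta
portfolio = weight * asset

portfolio_beta = np.cov(portfolio, market)[0, 1] / np.var(market, ddof=1)
assert abs(portfolio_beta - target_beta) < 1e-9
```

Because covariance is linear in each argument, scaling the position by `weight` scales the beta exactly, which is why the overlay hits the target without re-estimation.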
Procedia PDF Downloads 357
484 A Light in the Road of Protection of Civilians: Responsibility to Protect
Authors: Zeynep Selin Acar
Abstract:
In a world of wars, political elites seek ways to protect civilians. Current threats may come from the edges of conventional security concerns: uncontrollable terrorist groups, unanticipated government-supported armed groups or separatists, and unforeseen mergers of these with foreign supporters or opponents, all flaws of the international state system. These threats have transformed the inter-state system into a world system with distinctive actors and brought changes in the strategic plans of political and military bodies, as well as adaptations of the principles framing those strategies in terms of applicable international law constrained by ethical considerations. This paper analyses one of these principles, the Responsibility to Protect (RtoP), whose criteria aim to regulate military interventions by taking the protection of civilians both as the reason for intervention, jus ad bellum or the right to war, and as a duty during the intervention, jus in bello or how to conduct the war. In addition, it discusses the rise of its bindingness through the Responsibility Not to Veto (RNtoV), the Franco-Mexican Political Declaration opened for signature by UN member states in September 2015.
Keywords: civilian protection, protection as responsibility, responsibility to protect, responsibility not to veto
Procedia PDF Downloads 262
483 The Shape Memory Recovery Properties under Load of a Polymer Composite
Authors: Abdul Basit, Gildas Lhostis, Bernard Durand
Abstract:
Shape memory polymers (SMPs) are replacing shape memory alloys (SMAs) in many applications, as SMPs have certain properties superior to those of SMAs. However, SMAs possess some properties that SMPs lack, such as recovery under stress: SMPs cannot achieve complete recovery even under a small load. An SMP is initially heated close to its transition temperature (the glass transition temperature or the melting temperature). Force is then applied to deform the heated SMP into a specific position, and the SMP is allowed to cool while held deformed. After cooling, the SMP retains this temporary shape, which can be recovered by reheating it to the same temperature used initially; as a result, it recovers its original position. An SMP can perform unconstrained and constrained recovery; under load, however, it recovers only partially. In this work, the recovery under load of an asymmetrical shape memory composite called CBCM-SMPC has been investigated. It is found to have the ability to recover under different loads, showing powerful, complete recovery with respect to the initial position. This property can be utilized in many applications.
Keywords: shape memory, polymer composite, thermo-mechanical testing, recovery under load
Procedia PDF Downloads 439
482 A QoE-driven Cross-layer Resource Allocation Scheme for High Traffic Service over Open Wireless Network Downlink
Authors: Liya Shan, Qing Liao, Qinyue Hu, Shantao Jiang, Tao Wang
Abstract:
In this paper, a Quality of Experience (QoE)-driven cross-layer resource allocation scheme for high traffic service over an Open Wireless Network (OWN) downlink is proposed, covering all users in the cell, including those in the overlap regions of different cells. A method for calculating the Mean Opinion Score (MOS) value for high traffic service is introduced, adopting assessment models for the Best Effort service and a no-reference assessment algorithm for video service. The cross-layer architecture jointly considers parameters in the application layer, the media access control layer, and the physical layer. Based on this architecture and the MOS value, the Binary Constrained Particle Swarm Optimization (B_CPSO) algorithm is used to solve the cross-layer resource allocation problem. In addition, simulation results show that the proposed scheme significantly outperforms other schemes in terms of maximizing the average users' MOS value for the whole system as well as maintaining fairness among users.
Keywords: high traffic service, cross-layer resource allocation, QoE, B_CPSO, OWN
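The abstract does not spell out the B_CPSO details, so the following is a generic binary PSO sketch in the style of Kennedy and Eberhart's discrete variant, with a toy OneMax objective standing in for the MOS-based fitness; the dimensions, coefficients, and objective are all assumptions for illustration, not the paper's algorithm.

```python
import numpy as np

def binary_pso(fitness, n_bits, n_particles=20, iters=60, seed=0):
    """Minimal binary PSO: velocities are real-valued, and each bit is
    re-sampled with probability sigmoid(velocity)."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, (n_particles, n_bits))   # bit positions
    V = np.zeros((n_particles, n_bits))             # real-valued velocities
    pbest = X.copy()
    pbest_val = np.array([fitness(x) for x in X])
    gbest = pbest[pbest_val.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        # Inertia plus cognitive and social pulls toward personal/global bests.
        V = 0.7 * V + 1.5 * r1 * (pbest - X) + 1.5 * r2 * (gbest - X)
        prob = 1.0 / (1.0 + np.exp(-V))             # sigmoid -> bit probability
        X = (rng.random(X.shape) < prob).astype(int)
        vals = np.array([fitness(x) for x in X])
        improved = vals > pbest_val
        pbest[improved] = X[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmax()].copy()
    return gbest, int(pbest_val.max())

# Toy stand-in objective: maximize the number of set bits ("OneMax").
best, score = binary_pso(lambda x: x.sum(), n_bits=16)
print(score)
```

In the paper's setting the bit string would encode a resource allocation and the fitness would be the system-wide MOS value, with constraint handling added on top.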
Procedia PDF Downloads 541
481 Robust ResNets for Chemically Reacting Flows
Authors: Randy Price, Harbir Antil, Rainald Löhner, Fumiya Togashi
Abstract:
Chemically reacting flows are common in engineering applications such as hypersonic flow, combustion, explosions, manufacturing processes, and environmental assessments. The number of reactions in combustion simulations can exceed 100, placing a large number of flow and combustion problems beyond the capabilities of current supercomputers. Motivated by this, deep neural networks (DNNs) are introduced with the goal of eventually replacing the existing chemistry software packages. The DNNs used in this paper are motivated by the Residual Neural Network (ResNet) architecture: in the continuum limit, ResNets become an optimization problem constrained by an ODE, a feature that allows the use of ODE control techniques to enhance the DNNs. In this work, DNNs are constructed which update the species uⁿ at the nᵗʰ timestep to uⁿ⁺¹ at the (n+1)ᵗʰ timestep. Parallel DNNs are trained for each species, taking uⁿ as input and outputting one component of uⁿ⁺¹. These DNNs are applied to multiple species and reactions common in chemically reacting flows, such as H₂-O₂ reactions. Experimental results show that the DNNs are able to accurately replicate the dynamics in various situations and in the presence of errors.
Keywords: chemical reacting flows, computational fluid dynamics, ODEs, residual neural networks, ResNets
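The link between a ResNet layer and an ODE timestep can be made concrete with a small sketch. The "residual block" below is not the paper's trained network; it is a hand-coded linear residual f(u) = -k*u (rate constant and step size are hypothetical), chosen so that the layer update uⁿ⁺¹ = uⁿ + h*f(uⁿ) is exactly one forward-Euler step of the decay reaction u' = -k*u.

```python
import math

# Hypothetical rate constant and timestep; not values from the paper.
k, h = 2.0, 0.01

def residual_block(u):
    # The learned residual mapping f is replaced here by the known reaction
    # right-hand side f(u) = -k * u, so one layer = one forward-Euler step.
    return u + h * (-k * u)

u = 1.0                     # initial species concentration
for _ in range(100):        # compose 100 layers/timesteps, reaching t = 1.0
    u = residual_block(u)

# The stacked residual updates approximate the exact solution exp(-k * t).
exact = math.exp(-k * 1.0)
assert abs(u - exact) < 0.01
```

This identity skip-connection structure is what lets the continuum limit be read as an ODE, and hence lets ODE control techniques regularize the trained network.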
Procedia PDF Downloads 121
480 A Literature Review on the Role of Local Potential for Creative Industries
Authors: Maya Irjayanti
Abstract:
Local creativity utilization has been a strategic investment to be expanded into a creative industry due to its significant contribution to national gross domestic product. Many developed and developing countries look toward creative industries as an agenda for economic growth. This study aims to identify the role of local potential for creative industries across various empirical studies. The method involves a review of peer-reviewed journal articles and conference papers addressing local potential and creative industries. The literature review analysis includes several steps: material collection, descriptive analysis, category selection, and material evaluation. The expected outcome is a clustering of creative industries based on the local potential of various nations. In addition, the findings of this study will serve as a reference for future research exploring particular areas with well-known local potential for creative industry products.
Keywords: business, creativity, local potential, local wisdom
Procedia PDF Downloads 388
479 The Role of Flowering Pesticidal Plants for Sustainable Pest Management
Authors: Baltazar Ndakidemi
Abstract:
Resource-constrained farmers, especially those in sub-Saharan Africa, encounter significant agricultural challenges, notably diseases and pests. Sustainable means of pest management are not well known to farmers, and as a result some use synthetic pesticides, whose environmental impacts, effects on human health, and harm to natural enemies have created a pressing need for more sustainable pest management. Pesticidal plant resources can replace synthetic pesticides because their secondary metabolites exhibit insecticidal activities such as deterrence, repellence, and pest mortality. Additionally, the volatiles from these plants can attract populations of natural enemies. Pesticidal plants can be grown as field margin plants or in strips to support natural enemy populations; however, this has not been determined in practice. Hence, there is a need to investigate the roles played by pesticidal plants in supporting natural enemies of pests and their applications in different cropping systems, such as legumes. This study investigates different pesticidal plants with a high potential for pest control in agricultural fields. The information sheds light on potential plants that can be used against different crop pests.
Keywords: natural enemies, biological control, synthetic pesticides, pesticidal plants, predators, parasitoids
Procedia PDF Downloads 68