Since 2003, members of DATEXIS have contributed to a variety of funded projects.

NOHATE: Overcoming crises in public communication about refugees, migration, foreigners (2018-2020). When divisive issues come up in social media or in online comment sections, discussions increasingly turn into so-called hate speech – offensive and hateful posts by individual users. Without sufficient moderation, such hate speech can quickly escalate or stifle discussions. Operators of these platforms are therefore advised to identify and moderate hateful communication. Due to the large amounts of data and the speed of communication, however, this proves difficult. In particular, the discussion about the accommodation of refugees since 2015 has strengthened the assumption that hate speech in social media poses a threat not only to individuals but also to society as a whole, since hateful communication can be linked to the political advancement of right-wing extremist parties, political apathy, and racist crimes.

The three-year joint project NOHATE aims to analyse hateful communication on social media platforms, in online forums and in comment sections in order to identify underlying causes and dynamics, as well as to develop methods and software for the (early) recognition of hateful communication and potential strategies for de-escalation. A case study will offer a multidimensional perspective on displacement and migration and provide data for software development.
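To illustrate the kind of recognition component such a project targets, hateful comments can be flagged with a simple text classifier. The sketch below is a minimal bag-of-words Naive Bayes classifier in plain Python; the tiny training examples, labels, and tokenization are purely illustrative assumptions, not NOHATE's actual models or data:

```python
# Minimal bag-of-words Naive Bayes sketch for flagging hateful comments.
# All examples and labels below are illustrative placeholders, not part
# of the NOHATE project's actual corpora or pipeline.
import math
from collections import Counter

def tokenize(text):
    # Naive whitespace tokenization; a real system would normalize punctuation.
    return text.lower().split()

def train(examples):
    """examples: list of (text, label) pairs with label in {"hate", "ok"}."""
    counts = {"hate": Counter(), "ok": Counter()}
    docs = Counter()
    for text, label in examples:
        docs[label] += 1
        counts[label].update(tokenize(text))
    vocab = set(counts["hate"]) | set(counts["ok"])
    return counts, docs, vocab

def score(text, counts, docs, vocab):
    """Return the more likely label under add-one (Laplace) smoothing."""
    total = sum(docs.values())
    best_label, best_lp = None, float("-inf")
    for label in counts:
        lp = math.log(docs[label] / total)          # class prior
        denom = sum(counts[label].values()) + len(vocab)
        for tok in tokenize(text):
            lp += math.log((counts[label][tok] + 1) / denom)
        if lp > best_lp:
            best_label, best_lp = label, lp
    return best_label

examples = [
    ("go back where you came from", "hate"),
    ("they should all be thrown out", "hate"),
    ("thanks for sharing this article", "ok"),
    ("interesting point i agree", "ok"),
]
model = train(examples)
print(score("throw them all out", *model))  # → hate
```

In practice such a word-count model only serves as a baseline; the project description implies far harder problems (irony, implicit hate, platform-specific slang) that call for richer linguistic features or learned representations.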

The partners in the joint project are Freie Universität Berlin, Beuth University of Applied Sciences Berlin, and VICO Research & Consulting. The project is funded by the Federal Ministry of Education and Research (BMBF) within the context of the funding initiative "Strengthening solidarity in times of crises and change".

You can find further information about the project here.

H2020: FashionBrain - Understanding Europe's Fashion Data Universe (2017-2019). The primary goal of every retailer is to “understand your customers”. Our interviews with retailers show a strong demand from the retail industry for predicting a customer's next demand. Surprisingly, even a complete record of past purchases (and returns) is not sufficient to understand how items in a company's catalog do or do not connect with the customer's general tastes, lifestyle and aspirations. Moreover, from a business perspective, any efficiency gains in the logistics of supplier management, shipping and handling are rather minor compared to the gains one could obtain from a better understanding of customers’ personalities and habits. Given that customer demand triggers proactive stocking and fashion production, this appears as a logical consequence.

In this project, we want to consolidate and extend existing European technologies in the areas of database management, data mining, machine learning, image processing, information retrieval, and crowdsourcing to strengthen the position of European fashion retailers among their worldwide competitors. Our choice of the fashion sector is a conscious one: i) as a multi-billion euro industry, the fashion sector is extremely important for the European economy; ii) Europe already holds a solid position on the world fashion stage, but to maintain this position and keep up with its competitors, the European fashion industry needs the help of advanced technology; and iii) the European fashion industry provides an excellent exercise for new technologies, because it is multi-sectorial (i.e., it poses challenging data integration issues), has a short life-cycle (i.e., it requires timely reaction to current events), and involves diverse languages and cultures.

The main outcome of the FashionBrain project is an improved fashion-industry value chain, achieved through novel online shopping experiences, the detection of influencers, and the prediction of upcoming fashion trends. Tangible outcomes will include software, demonstrators, and novel algorithms for a data-driven fashion industry.

See more at CORDIS or on the FashionBrain homepage.

Smart-MD (BMWi, 2019/2020). The aim of Smart-MD is to convert medical case data from clinics into legally secure data products. For this purpose, a platform is being developed that prepares the case data using AI technologies and makes it available anonymously. The platform is to be used for Ada DX, a diagnostic support system for doctors. With Ada, the time to a correct diagnosis can be shortened and misdiagnoses avoided. This leads to a higher quality of treatment and at the same time helps to conserve resources in the clinics. To demonstrate the clinical benefit, the application is being evaluated at Charité and in the Helios clinics.

Partners: Ada Health GmbH, Charité, Helios Kliniken GmbH, Berliner Hochschule für Technik

Medical Allround-Care Service Solution (MACSS, 2016-2019). The goal of MACSS is a prototypical health platform hosted at Charité Berlin. The platform manages patient-related data after kidney surgery. Currently, this data is often still managed "on paper" and in silos. The MACSS platform will integrate vital-sign indicators from medical devices with text data from the patient's diary and the anamnesis. As a result, doctors at Charité, doctors in the field, and medical staff at a kidney dialysis center receive a fresh and comprehensive picture of the patient's condition and can support the patient with asynchronous and prompt therapy adjustments. From 2016 to 2019, DATEXIS will focus on interactive text data management in a shared-memory database. The research group DATEXIS acquired funding for one project in the Smart Service World program. [More]

Germany: Berlin Big Data Center (BMBF, 2013-2018). The Federal Ministry of Education and Research (BMBF) is establishing new research on Big Data and IT security in Germany. The Technical University of Berlin (Prof. Volker Markl) leads one of the two competence centers founded by the BMBF. The Beuth University of Applied Sciences is a full member of this initiative, called the "Berlin Big Data Center" (BBDC). It is the only university of applied sciences in Germany that was selected to participate in this highly visible BMBF initiative. [more information]

Knowledge Web for German Industry (2015-2018). The Smart Data Web project is led by DFKI Berlin (Hans Uszkoreit). The vision is to establish a Knowledge Web, similar to Freebase or DBpedia, but suitable for data value chains in Germany's core industries, such as automotive, engineering and chemistry. Alexander Löser and Petra Sauer from the Beuth University are principal investigators in this prestigious project from 2015 to 2018. DATEXIS focuses on scalable in-database text mining in a shared-nothing database.

ExCELL - Improving B2B Logistics in German Cities with Data (2015-2018). The ExCELL project (led by FeldM consulting) has the goal of optimizing free logistics resources, in particular for transporting goods and for small and medium-sized carriers and customers. The team will analyze and unravel common patterns in data from mobile devices, from communities, and from public traffic sensors with the goal of providing new logistics services. Petra Sauer and Alexander Löser from the Beuth University are principal investigators in this prestigious project from 2015 to 2018.

BITKOM: Excellence in Big Data (2016). Germany's industry and government want to promote digitization in Germany and increase Germany's attractiveness for digital technologies. This means presenting Germany's expertise in key technology areas internationally. To this end, Bitkom, Smart Data Forum and Germany Trade & Invest have jointly presented the report "Germany – Excellence in Big Data". The report is aimed at an international audience and presents more than 30 scientific organizations, over 60 technology vendors and more than 40 Big Data users with their research priorities, projects and strategies, or products and services. Industry overviews complete the picture. This report provides interested parties with the most comprehensive overview of the Big Data landscape in Germany.

The Data Science Group at Beuth University of Applied Sciences is featured in the report's list of individual researchers and research groups at universities working on Big Data.

Germany: Smart Data Program (BMWi) Study (2015). The development of ICT for big data management is a logical priority of the Federal Ministry of Economics and Technology, and one which particularly merits support. This study evaluates this interplay in order to move, via management and analysis, from big data to smart data. Alexander Löser is one of the core authors of this study. [more information] [download here]

Austria: Conquering Data - Study for BMVIT (2015). The aim of this FFG- and BMVIT-sponsored project is to conduct a roadmap study in the area of Intelligent Data Analytics in order to provide the grounds for the short-, medium- and long-term focus of the FFG funding scheme "IKT der Zukunft: Daten durchdringen" ("ICT of the Future: Conquering Data").

EU: ERCIM Expert Group "Big Data Analytics" (2014). The ERCIM consortium has approved an ERCIM Expert Group on "Big Data Analytics", aimed at gathering experts on the subject and compiling, with their help, a white paper that fosters a future agenda for European research in this direction. Alexander Löser is a member of this expert group.

Global Corporations: IBM Global Business Services. IBM Global Services is the world's largest business and technology services provider. It employs over 190,000 people across more than 160 countries. IBM Global Services started in the spring of 1991 with the aim of helping companies manage their IT operations and resources. Alexander Löser conducted training on Big Data management technology for more than 70 senior architects and partners of IBM GBS BAO in Berlin and London in 2013. He was also invited to present his vision on monetizing "(Big) Data" at the IBM CIO Forum "Nürnberger Kreis" in 2013. [more information]

MIA (6 Million Euros, 2011-2014).

SCAPE (11 Million Euros, 2011-2014). The SCAPE project enhances the state of the art of digital preservation in three ways: by developing infrastructure and tools for scalable preservation actions; by providing a framework for automated, quality-assured preservation workflows; and by integrating these components with a policy-based preservation planning and watch system.

ROBUST (10 Million Euros, 2010-2013). ROBUST is targeted at developing methods to understand and manage the business, social and economic objectives of the users, providers and hosts and to meet the challenges of scale and growth in large communities. The outcome of ROBUST finds its application in online communities in internet, extranet and intranet settings addressing customer support, knowledge sharing, and hosting services.

DOPA (2.6 Million Euros, 2012-2014). DOPA - Data Supply Chains for Pools, Services and Analytics in Economics and Finance. DOPA will enable European SMEs to become key players in the global data economy, as the impact of the DOPA RTD activities will likely materialize on both the supply and the demand side of B2B vertical market segments of data-related services.

OKKAM (7 Million Euros, 2008-2010). The OKKAM project aims at enabling the Web of Entities, a virtual space where any collection of data and information about any type of entity (e.g. people, locations, organizations, events, artifacts, ...) published on the Web can be integrated into a single virtual, decentralized, open knowledge base, just as the Web did for hypertext.

IBM SystemT. The SystemT project is an amalgam of two major research themes centered around analytics and search over unstructured content. These two themes are represented by two corresponding sub-projects: SystemT-Information Extraction (SystemT-IE) and SystemT-Programmable Search (SPS).

SWAP (2002-2004). SWAP investigates how established benefits of ontologies may carry over to a decentralized, low-administration peer network. Knowledge management with SWAP is highly distributed, and participants may easily share knowledge. Furthermore, SWAP aggregates the conventional wisdom found at different peer nodes (taxonomies, folders) to extract shared views or mappings between views. Such emergent semantics methods make the peer-to-peer network searchable with precise semantic queries - something that current peer-to-peer technology lacks completely. Thus, SWAP creates a novel and promising technology.

EDUTELLA (2004-2006).