“The idea that there may be alternative technologies in itself implies the idea of technological pluralism in place of the until now almost universally accepted technological monism. In this case each social system and each political ideology, indeed each culture would be free to develop its own particular line. Why should there not be a specifically Indian technology alongside Indian art and why should the African temperament express itself only in music or sculpture and not in the equipment which Africans choose because it suits them better? Why should Russian factories follow Anglo-Saxon patterns? Might there not be an unmistakably Japanese technology, just as there are typically Japanese buildings and clothes?” (Robert Jungk, 1973)
Code and encodings are shaping the way we conceive and practise the work of reconstructing, conserving and representing cultural artefacts. But is code culturally, socially and “politically” neutral? If code is made by humans, then it must be subject to the laws of history, and all “historical” activities are always also “political”. Our encodings, algorithms and software (not to mention social media applications) therefore seem to reflect a strong geopolitical bias.

On June 4th we will be presenting a paper on “The politics of code“ at the ICTs & Society Conference in Vienna. Digital humanists have so far been reluctant to analyse the socio-political nature (and impact) of their “technical” choices, so a conference on the social sciences seems the ideal venue for opening a debate on these issues. We will focus on three encoding tools widely used in the Humanities and Social Sciences communities: HTML, the de facto standard for encoding Web documents and pages; Unicode, an industry standard designed to allow text and symbols from all of the writing systems of the world to be represented and manipulated consistently by computers; and XML, which defines a set of rules for encoding documents.
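The point about Unicode can be made concrete: even a single accented letter can be represented in more than one way, and the choice of a canonical form is a design decision rather than a neutral fact. A minimal sketch, using only Python’s standard library, showing the two normalization forms of “é”:

```python
import unicodedata

# Precomposed form: a single code point, U+00E9 (LATIN SMALL LETTER E WITH ACUTE)
precomposed = "\u00e9"

# Decomposed form: "e" followed by U+0301 (COMBINING ACUTE ACCENT)
decomposed = "e\u0301"

# Visually identical, but not equal as strings
print(precomposed == decomposed)            # False

# Different lengths in code points and in UTF-8 bytes
print(len(precomposed), len(decomposed))     # 1 2
print(precomposed.encode("utf-8"))           # b'\xc3\xa9'  (2 bytes)
print(decomposed.encode("utf-8"))            # b'e\xcc\x81' (3 bytes)

# Normalizing the decomposed form to NFC recovers the precomposed string
print(unicodedata.normalize("NFC", decomposed) == precomposed)  # True
```

Which form a text, a search index or an archival encoding treats as canonical is exactly the kind of “technical” choice whose consequences the paper argues are never purely technical.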